Regeneration in humans is much more limited compared with that in other animals. If, for instance, we lose a limb, we may as well say goodbye to it for the rest of our life. Perhaps it would be nice if we had a higher capacity to regenerate many of our indispensable body parts, such as the head, the limbs, and the many other “regeneration-incapables”. Then we might not have to worry much about losing any of them, knowing that they would eventually regrow in due time.
Regeneration vs. Healing
Humans have the capacity to regenerate. However, that capacity is very limited, confined to restoring parts of our skin, hair, nails, fingertips, and liver. At the tissue level, we surely have dedicated cells that replace lost and damaged cells. For instance, an uninjured bone is eventually replenished with entirely new bone, though over a span of about ten years. Our skin renews naturally, but give it two weeks. The story swerves, though, in the case of an injury.
Rather than expending energy to replace an injured part with a new one, our body directs its efforts into healing it. So when our skin is deeply damaged, our body fixes it with a scar. Tissue repair mechanisms such as wound healing aren’t really a snag. They forestall pathogenic microbes from using an injured body part as an easy gateway into our body. (Besides, we already have ample microbiota naturally thriving inside us.) The main goal is to fix the damage efficaciously, with relatively less effort.
Natural regeneration in humans
In humans, the only tissue that regenerates naturally, consistently, and completely is the endometrium.1 After it sloughs off during a woman’s menstrual period, it grows back by re-epithelialization before the next period. Humans can also regenerate an injured liver, provided that as little as 25% of the original liver mass remains. The liver can grow back to its original size, though not necessarily to its original shape. Damaged tubular parts of the kidney can also regrow: the surviving epithelial cells undergo migration, dedifferentiation, proliferation, and re-differentiation to form a new epithelial lining of the tubule.
Animals with higher regeneration capacities
Some animals have a higher capacity to regrow lost body parts. Sharks, skates, and rays can regenerate their kidneys; they can regrow an entire nephron, which humans cannot. A lizard drops its tail as a mode of escape, yet the tail will be fully restored over time. Sharks have no qualms about losing teeth; they can replace any of them more than a hundred times in their lifetime. An axolotl can repair its broken heart. A starfish will once again be stellar upon the return of a lost arm. In fact, even the lost arm can regenerate into an entire starfish, as long as the central nerve ring remains intact.2 A decapitated planarian worm need not worry about losing its head; the head can grow back, together with its brain and, reportedly, its memories.2 Without a doubt, many of these animals are simply masters of their craft: regeneration.
Researchers from Harvard University published new findings on the whole-body regeneration capacity of the three-banded panther worm.3 They uncovered DNA switches that appear to regulate genes involved in the regeneration process. Specifically, they found a section of non-coding DNA that controlled the activation of a master gene they called the “early growth response” (EGR) gene. When active, the EGR gene acted like a power switch, turning certain genes in the coding region on and off during regeneration. When deactivated, no regeneration occurred.
Surprisingly, humans have the EGR gene, too. So why doesn’t it confer greater regenerative capacity, as it does in the three-banded panther worm? The researchers explained that while the gene works in the worm, it does not work the same way in humans: the wiring may be different. The worm’s EGR gene may have germane connections that are absent in humans.
Switching the gene on
Induced regeneration in humans is one of the goals of regenerative medicine. This field of medicine seeks new ways to give our regenerative capacity a boost. One of those ways is to look “molecularly”. Researchers are looking into the gene Lin28a. When active, this gene can reprogram somatic cells into embryonic-like stem cells; accordingly, it has a role in tissue regeneration and recovery. However, the gene is naturally turned off in adults. Research on boosting our regenerative capacities is ongoing. Switching our organs from regeneration-incapable to regeneration-capable may just be a matter of discovering the gene switch that enhances the regeneration capacity of humans.
— written by Maria Victoria Gonzaga
1 Min, S., Wang, S. W., & Orr, W. (2006). “Graphic general pathology: 2.2 complete regeneration”. Pathology. pathol.med.stu.edu.cn. Retrieved from [Link]
2 Langley, L. (2013, August 28). “Pictures: 5 Animals That Regrow Body Parts”. National Geographic News. Retrieved from [Link]
Many people are afraid of getting the measles vaccine these days. The fear arises from its alleged adverse effects, such as autism. This fear, however, has come along with the resurgence of the dreaded measles outbreaks. Consequently, measles once again takes many lives, especially those of children, who ought not to die from such a preventable disease.
Measles vaccine being linked to autism
In 1998, a team of scientists headed by Andrew Wakefield published a paper in a reputable science journal, The Lancet. According to the paper, the MMR vaccine, a combination vaccine that protects against measles, mumps, and rubella, seemed to have a causal link to autism in children. He and his colleagues reported on twelve children who displayed developmental delay; eight of them showed autism within a month of receiving the MMR vaccine. However, the paper was later retracted: “several elements” of the 1998 paper (Lancet 1998;351:637–41) were “incorrect, contrary to the findings of an earlier investigation”. The retraction clearly indicated misrepresented data. It did not end there, however. Wakefield and his team published yet another study, again implicating the measles virus in autism.
Second study, still questionable
In 2002, Wakefield and his team biopsied intestinal samples from two groups of children: those with autism and a control group (without autism). They tested for the presence of the measles virus genome via reverse-transcriptase PCR and in situ hybridization. They reported that 75 of 91 children with autism tested positive for the measles virus genome, whereas in the control group only five of 70 were positive. Accordingly, their findings corresponded to their earlier conjecture linking the measles virus to autism in children. However, critics still found critical flaws. For instance, the authors failed to establish the origin of the measles virus genome in the patients, i.e., whether it came from natural infection or from the vaccine.
Studies refuting the link
Two independent large-scale studies (one in California, USA, and another in England, UK) found no link between the MMR vaccine and autism. True, the number of children with autism increased dramatically; however, the percentage of children receiving the MMR vaccine remained constant. The empirical data from a larger population indicated the absence of a causal relationship between the measles vaccine and autism.
The side effects associated with the MMR vaccine are mild symptoms of measles, mumps, and rubella, and not all children who receive it show symptoms. For the measles component, the common symptoms include swelling and redness at the injection site, fever, and rash. Rare symptoms include anaphylaxis, bruise-like spots, and fits. No rigorous study has established that the MMR vaccine causes autism in children. Nevertheless, many people remain hesitant despite the many years of proven efficacy of the measles vaccine. Their worries were aggravated by the likes of the Wakefield studies linking the MMR vaccine to autism in children.
Dubbed anti-vaxxers, these people so utterly lost their confidence in vaccines that they kept their children from getting vaccinated. The main reason is their fear that vaccines would cause more harm than good. Some of them even took legal action against vaccine manufacturers for allegedly causing their child’s developmental delay. And despite the retraction of Wakefield’s paper and its repudiation by ensuing studies dissociating autism from the MMR vaccine, many people, including autism advocacy groups, have not abandoned their skepticism. Some have even come up with a conspiracy theory that vaccine manufacturers conspired to hide the “truth”, i.e., that the MMR vaccine causes autism.
Pathobiology of measles
The measles virus, a single-stranded, negative-sense RNA virus of the genus Morbillivirus, is the causative agent of measles, a highly contagious airborne disease. Humans are the only known host of the virus. The video below describes how the measles virus infects the host cell.
In summary, upon reaching the mucosa, the virus infects the epithelial cells lining the trachea or the bronchi. The virus gains entry into the host cell via its surface protein, hemagglutinin (H protein). The H protein binds to a receptor (e.g., CD46, CD150, or nectin-4) on the surface of the target host cell. After binding, the virus fuses with the cell membrane to get inside the cell. It then makes use of the cell’s RNA polymerase to transcribe its RNA into mRNA. The mRNA is translated into viral proteins, which are enveloped in the host cell’s lipid membrane for subsequent release outside the cell. The new virions spread to the lymph nodes, and then to other tissues (e.g., the brain and intestines). Soon, the disease manifests as fever, cough, runny nose, inflamed eyes, and rash. Common complications include pneumonia, seizures, encephalitis, and subacute sclerosing panencephalitis.
A vaccine that prevents the disease was first made available in 1963. It may be administered alone or in combination, as in the MMR vaccine, which renders protection against the measles, mumps, and rubella viruses. The World Health Organization (WHO) recommends that the measles vaccine be administered to infants at nine or twelve months of age. A person needs only two doses during childhood for lifelong immunity.
How vaccines work
The measles vaccine contains a live but weakened (attenuated) strain of the measles virus. Vaccines work by triggering an immune response from the white blood cells, which recognize the virus through its surface proteins. White blood cells such as B cells produce multifarious antibodies, and some of these antibodies can fit the virus’s surface protein. This triggers the B cell to produce clones, among them memory B cells, which in turn enable the production of large amounts of antibodies specific to the identified pathogen.
A re-encounter with a virus carrying the same surface protein enables the antibodies to respond quickly, binding to and disabling the virus. They can also make it “palatable” to macrophages and other phagocytic cells that engulf and kill pathogens. Why does the measles vaccine remain effective for so many years? The surface proteins of the measles virus are not prone to change, and any mutation in them may render them dysfunctional. Thus, the immune system will always recognize the measles virus, and the immune response is so quick that, most of the time, the vaccinated individual never falls ill.
One of the benefits of a robust immunization program is that the immune protection extends to those who have not yet received the vaccine. This is referred to as herd immunity: the community becomes protected from measles when a large enough percentage of the population has been vaccinated. In a study published in the journal Frontiers in Public Health, measles vaccination in the sequence recommended by WHO apparently helped reduce child mortality. But in order to prevent and ultimately eliminate measles, WHO seeks global immunization coverage of at least 95%.
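The 95% target is in line with the classic herd-immunity approximation: for a pathogen with basic reproduction number R0, roughly a fraction 1 - 1/R0 of the population must be immune to stop sustained transmission. Here is a minimal sketch in Python, assuming the commonly cited measles R0 range of about 12 to 18 (an assumption for illustration, not a figure from this article):

```python
# Herd immunity threshold from the basic reproduction number R0:
# roughly a fraction 1 - 1/R0 of the population must be immune
# to block sustained transmission.

def herd_immunity_threshold(r0: float) -> float:
    """Fraction of the population that must be immune to halt spread."""
    return 1.0 - 1.0 / r0

# Commonly cited R0 range for measles (assumed here for illustration):
for r0 in (12.0, 18.0):
    print(f"R0 = {r0:.0f} -> threshold ~ {herd_immunity_threshold(r0):.0%}")
```

With R0 between 12 and 18, the threshold lands at roughly 92–94% immunity, which is why coverage targets for measles sit near 95%.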
Recent measles outbreak
Failure to reach the ideal 95% global coverage leads to inevitable measles outbreaks. For several years, global coverage with the first dose of the measles vaccine has stood at only 85%, and with the second dose at 67%. Thus, measles outbreaks occurred in all regions, with over a hundred thousand fatalities mainly due to serious complications. Since 2000, about 21 million lives have been saved by the measles vaccine. Nevertheless, measles cases around the globe surged by more than 30% compared with 2016.
Dr. Seth Berkley of Gavi, the Vaccine Alliance, explained the reasons for the recent alarming resurgence of measles. He said, “Complacency about the disease and the spread of falsehoods about the vaccine in Europe, a collapsing health system in Venezuela and pockets of fragility and low immunization coverage in Africa are combining to bring about a global resurgence of measles after years of progress. Existing strategies need to change: more effort needs to go into increasing routine immunization coverage and strengthening health systems. Otherwise we will continue chasing one outbreak after another.”
The measles vaccine has indubitably protected millions of lives. However, because of escalating apprehensions and reluctance towards measles vaccination, we have fallen short of the goal of eliminating the disease. If only we could stick to that goal and support local immunization efforts, we might already have won once and for all. Measles is a preventable disease, and the measles vaccine has been tried and tested over many years. I hope it does not reach the point where an immunization mandate becomes the inevitable recourse when, in essence, we can simply heed the call.
— written by Maria Victoria Gonzaga
1 Wakefield, A.J., Murch, S.H., Anthony, A., et al. (1998). Ileal-lymphoid-nodular hyperplasia, nonspecific colitis, and pervasive developmental disorder in children. Lancet, 351: 637-641.
2 Eggertson, L. (2010). Lancet retracts 12-year-old article linking autism to MMR vaccines. Canadian Medical Association Journal, 182(4), E199–E200. [Link]
3 Uhlmann, V., et al. (2002). Potential viral pathogenic mechanism for new variant inflammatory bowel disease. Journal of Clinical Pathology: Molecular Pathology, 55: 1–6. [Link]
4 Offit, P.A. (n.d.). Vaccines and Autism. [PDF]
5 NHS Choices. (2019). Vaccinations. Retrieved from [Link]
6 Moss, W.J. & Griffin, D.E. (14 January 2012). “Measles”. Lancet, 379 (9811): 153–64. [doi:Link]
7 Cell Press. (2015, May 21). Why you need one vaccine for measles and many for the flu. ScienceDaily. Retrieved from [Link]
8 Frontiers. (2018, February 12). Measles vaccine increases child survival beyond protecting against measles: New study shows all-cause mortality is significantly lower when a child’s most recent immunization is a measles vaccine. ScienceDaily. Retrieved from [Link]
9 World Health Organization. (2018, November 29). Measles cases spike globally due to gaps in vaccination coverage. Retrieved from [Link]
With the advent of 2019, we are inspired to set new goals, pursue lifelong dreams, or simply make better choices. Perhaps one of the most common reveries we wish to realize is adopting a healthier lifestyle. With this in mind, some of us look for ways to become healthier, such as by managing our weight, and many turn to fad diets and caloric restriction plans that promise to help. One of them is intermittent fasting. Based on studies, intermittent fasting does not only help trim weight; it seems to offer further health benefits as well.
Intermittent fasting – overview
In May 2018, I wrote the article “Intermittent Fasting – benefits and caution”. There, I briefly discussed intermittent fasting, its benefits, and its potential risks. In essence, intermittent fasting is a cyclic pattern of a period of fasting followed by a period of non-fasting. The most common forms are (1) whole-day fasting and (2) time-restricted eating. Whole-day fasting entails one full day of not eating, done twice a week (thus referred to as the “5:2 plan”). In time-restricted eating, fasting and non-fasting intervals alternate on a daily basis; it could be half a day of fasting, with the remaining half as the non-fasting period. With intermittent fasting, it is not so much a question of “what to eat” or “how much”, but rather of when.
Intermittent fasting became popular because it does not only help curb weight; it has also been associated with other health benefits. It apparently slows aging and boosts the immune defense. However, as I pointed out in that article, caution should still be taken. Intermittent fasting is not for everyone, especially those who are immunocompromised or underweight.
Rejuvenating effects of fasting
Previously, I mentioned that the studies confirming the health benefits of fasting were done on non-human subjects (e.g., rodent models). Without much scientific proof of efficacy in humans, doubt remained. However, on January 29 of this year, a team of scientists from the Okinawa Institute of Science and Technology Graduate University (OIST) and Kyoto University reported rejuvenating effects of fasting in human subjects. They published their findings in Scientific Reports. In brief, they analyzed blood samples from four fasting individuals and monitored the levels of metabolites involved in growth and energy metabolism. What they found was quite interesting and promising.
Dr. Takayuki Teruya, one of the researchers on the team, said that their results pointed to rejuvenating effects of fasting. They found that many metabolites increased significantly, about 1.5- to 60-fold, within just 58 hours of fasting. In a previous study, they had identified some of these metabolites (e.g., leucine, isoleucine, and ophthalmic acid) as ones that typically decline with age. According to Dr. Teruya, the levels of these metabolites rose again in individuals who fasted. Based on the metabolites they found, they also conjectured that fasting could promote muscle maintenance and antioxidant activity, and hence possibly longevity as well. Dr. Teruya further noted that this had not been shown before, since most studies making such claims used animal models.
Fasting increased metabolism
During fasting, the body turns to alternative energy stores when carbohydrates are not available. Thus, less-common metabolites from alternative metabolic pathways supersede the typical products of carbohydrate metabolism. The researchers identified butyrates, carnitines, and branched-chain amino acids as some of the metabolites that accumulated during fasting. Apart from this, they also found an increase in citric acid cycle intermediates, meaning that aside from prompting alternative metabolic pathways, fasting also augmented common metabolic activities. The metabolism of purines and pyrimidines also seemed heightened, indicating an increase in gene expression and protein synthesis. Accordingly, the researchers also saw a boost in antioxidants (e.g., ergothioneine and carnosine) that protect cells from the free radicals produced by metabolism. The researchers believe they are the first to provide evidence of antioxidants as a fasting marker.
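As a rough illustration of the fold-change comparisons reported above, here is a minimal sketch. The metabolite names come from the study, but the concentration values are made-up placeholders, not the study’s data:

```python
# Fold change = metabolite level after fasting / baseline level.
# All concentration values below are hypothetical, for illustration only.

baseline = {"leucine": 120.0, "isoleucine": 60.0, "ophthalmic acid": 2.0}
after_fast = {"leucine": 300.0, "isoleucine": 150.0, "ophthalmic acid": 90.0}

for metabolite, base in baseline.items():
    fold = after_fast[metabolite] / base
    print(f"{metabolite}: {fold:.1f}-fold of baseline after fasting")
```

A fold change of 1.0 means no change; the study’s reported range of 1.5- to 60-fold is a ratio of this kind.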
This newfound evidence suggests that fasting has some anti-aging effects, this time in human subjects. The team’s next step is to see if they can replicate the results in a larger-scale study. For now, let us remain cautious, look for solid substantiation, and weigh the benefits and risks of all available options.
— written by Maria Victoria Gonzaga
1 Cohut, M. (2018). Intermittent fasting may have ‘profound health benefits’. Retrieved from [Link]
2 Longo, V. D., & Mattson, M. P. (2014). Fasting: Molecular Mechanisms and Clinical Applications. Cell Metabolism, 19 (2), 181–192. [Link]
3 Teruya, T., Chaleckis, R., Takada, J., Yanagida, M. & Kondoh, H. (2019). Diverse metabolic reactions activated during 58-hr fasting are revealed by non-targeted metabolomic analysis of human blood. Scientific Reports, 9(1). DOI: 10.1038/s41598-018-36674-9.
4 Okinawa Institute of Science and Technology (OIST) Graduate University. (2019, January 31). Fasting ramps up human metabolism, study shows. ScienceDaily. Retrieved from [Link]
Scientists found dead tardigrades beneath Antarctica, according to their recently published report. It was a surprising discovery, since tardigrades have earned a reputation as tiny infinities. They are so resistant to extreme conditions that they are thought of as some sort of “immortals”. Nonetheless, scientists found the remains of tardigrades, together with crustaceans, in a deep, frozen Antarctic lake.
Antarctic Realm – The Cold Realm
The Antarctic is the region at the southernmost part of the Earth. The biogeographic realm that includes the Antarctic is called the Antarctic realm. A biogeographic realm refers to an area of land where similar organisms thrived and then evolved over long periods in relative isolation. Such realms rouse extensive research with the paramount objective of understanding the extent of biodiversity, especially the distribution patterns of resident organisms and their evolutionary history.
The Antarctic biogeographic realm is the smallest of all realms, spanning a total area of about 0.12 million square miles. Its components include the land area, the Antarctic tectonic plate, the ice in the waters, and the ocean itself. Because of the cold temperature, few floral species are able to persist and thrive. At present, around 250 lichens, 100 mosses, 25–30 liverworts, 700 algal species, and two flowering plant species (i.e., the Antarctic hair grass and the Antarctic pearlwort) inhabit the region. As for fauna, animal species include penguins, seals, and whales.
An Icy Surprise
The discovery of the remains of tardigrades was unexpected, according to David Harwood, a micropaleontologist. Late last year, Harwood and his research team drilled a hole into the subglacial Lake Mercer, a frozen lake that had been undisturbed for millennia. Their research project, SALSA (Subglacial Antarctic Lakes Scientific Access), was thus the first to conduct direct sampling. They were absolutely surprised to find these water bears, frozen and dead.
Astounded, animal ecologist Byron Adams conjectured that these tardigrades might have come from the Transantarctic Mountains and been carried down to Lake Mercer. Further, he said, “What was sort of stunning about the stuff from Lake Mercer is it’s not super, super-old. They’ve not been dead that long.”
In September 2015, Jean-Michel Claverie and others reported two giant viruses (i.e., Pithovirus sibericum and Mollivirus sibericum) that they revived from 30,000-year-old permafrost in Siberia.[3,5] Once revived, the viruses quickly became infectious to their natural hosts, amoebae. Luckily, these chilly giants do not prefer humans as hosts. Nonetheless, the melting of such frozen habitats could endanger public health should pathogens that can infect humans escape their icy trap.
A frozen Pandora’s Box
The frozen regions of the Earth hold many astonishing surprises waiting to be “thawed”. In August 2016, a 12-year-old boy from the Yamalo-Nenets region of Siberia died of anthrax. Reports also counted a number of infected locals and thousands of grazing reindeer. Prior to the outbreak, a summer heatwave had melted the permafrost in the Yamal Peninsula in the Arctic Circle. The thawing of the frozen soil unleashed anthrax bacteria presumed to have come from the carcass of a reindeer host that died over 75 years ago. The bacteria apparently reached the nearby soil, water, the food supply, and eventually new hosts. The anthrax bacteria survived because they form spores that protect them during dormancy.
A Hotter Earth
Global warming is the increase in the average temperature of the Earth’s surface large enough to cause climate change. Accordingly, the global surface temperature increased by 0.74 ± 0.18 °C (1.33 ± 0.32 °F) during the last century. The temperature rise brings threats, as it could lead to environmental changes with adverse effects of massive magnitude. One of these is the destruction of habitats due to rising water levels from melting ice. Deadly pathogens could also rise again from their cold slumber, with potentially dire consequences. So, while we try to explore the deeper mysteries lurking beneath the ice, we should also make sure that we remain a step ahead. Claverie put it excellently:
The possibility that we could catch a virus from a long-extinct Neanderthal suggests that the idea that a virus could be ‘eradicated’ from the planet is wrong, and gives us a false sense of security. This is why stocks of vaccine should be kept, just in case.
— written by Maria Victoria Gonzaga
1 Berman, R. (2019, January 18). Dead – yes, dead – tardigrade found beneath Antarctica. Retrieved from [link]
2 Pariona, A. (2018, May 18). What Are The Eight Biogeographic Realms? Retrieved from [link]
3 CNRS. (2015, September 9). New giant virus discovered in Siberia’s permafrost. ScienceDaily. Retrieved from [link]
4 Wikipedia Contributors. (2018, November 10). Antarctic realm. Retrieved from [link]
5 Fox-Skelly, J. (2017, January 1). There are diseases hidden in ice, and they are waking up. Retrieved from [link]
6 Russia anthrax outbreak affects dozens in north Siberia. (2016, August 2). BBC News. Retrieved from [link]
7 Biology-Online Editors. (2014, May 12). Biology Online. Retrieved from [link]
A study published in Science on January 11 seems to be the first to lay out empirical evidence concurring with Charles Darwin’s hypothesis that mate selection might have contributed to the evolution of intelligence or cognitive abilities. Scientists from China and the Netherlands collaborated on a study of budgerigars, Melopsittacus undulatus. Based on their observations, problem-solving skills apparently increased the attractiveness of male birds: female birds chose to spend more time with males that appeared to be smarter.
Darwin on mate selection
In the animal kingdom, mate selection is a real deal. One of the generalized traits distinguishing animals from plants is the former’s tendency to select a mate. Animals, including humans, have their own sets of preferences when it comes to choosing a mate. While plants chiefly let nature do the “selection” for them, animals tend to seek out a potential mate themselves. And when they find a suitable mate of their choice, they often make a concerted effort to succeed at coupling. In particular, males typically engage first in a courtship ritual, for example by wooing a female with a song, a dance, or a display of beauty or prowess.
Sexual selection evolved as one of the mechanisms of natural selection. A male, for instance, chooses a female to mate with and, if need be, may tenaciously compete against other males to stack the odds in his favor. Charles Darwin’s long-standing theories on sexual selection are still relevant to this day. Darwin believed that sexual selection had a key role in how humans evolved and diverged into distinct populations. In that view, sexual selection could also have contributed to how intelligence evolved.
Intelligent males, more attractive
Many studies on birds have revolved around the notion that female birds favor males with vibrant feathers or stylish songs. A recent study claims that intelligence is preferred over such fancy features and skills.
In the first experiment conducted by Chen and colleagues, small budgerigars (Australian parrots) were observed inside their cages to test the hypothesis that intelligence might affect mate selection. To do that, the researchers allowed each female budgerigar to choose between a pair of similar-looking males to interact with. The chosen males were called “preferred” whereas the others were “less-preferred”. Next, they trained the less-preferred males to open closed lids and boxes, and then allowed each female to observe the less-preferred male demonstrating the skill. Consequently, almost all of the females changed their preference, choosing the less-preferred males over the initially preferred ones.
To test whether this preference was social rather than sexual, they conducted a second experiment with a similar design, but this time each female budgerigar was exposed to two females (instead of males). The results showed that none of the female budgerigars changed their preferences. [1, 3] Based on these experiments, the researchers concluded that the demonstration of cognitive skills altered mate preference, but not social preference.
Video: the animal model, a male budgerigar that learned a problem-solving skill that seemingly increased its attractiveness to females. [Credit: Hedwig Pöllöläinen]
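One simple way to quantify the preference switch described above is the fraction of choice time a female spends near each male. The sketch below uses invented timings for illustration; the study’s actual scoring method may differ:

```python
# Preference index: fraction of total choice time spent with male A.
# All timings below are hypothetical, for illustration only.

def preference(time_with_a: float, time_with_b: float) -> float:
    """Return the fraction of choice time spent with male A."""
    return time_with_a / (time_with_a + time_with_b)

# Seconds spent near the initially preferred male (A) vs. the
# less-preferred male (B), before and after B demonstrates the skill:
before_demo = preference(420.0, 180.0)
after_demo = preference(150.0, 450.0)

print(f"preference for male A: {before_demo:.2f} -> {after_demo:.2f}")
```

A drop from 0.70 to 0.25 in this toy example would correspond to the kind of preference reversal the researchers reported after the skill demonstration.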
Why did mate selection evolve? The answer could be associated with species survival. Individuals must be able to stay in the mate-selection pool, if not at the top of it. In general, males deemed superior or “preferred” gain higher chances at mating, and thereby better opportunities to transmit their genes as they dominate access to fertile females. Females, on the other hand, gain an upper hand from mate selection by being able to choose the seemingly finest of the lot. Females must choose because they generally have a limited number of reproductive opportunities. Moreover, the energy a female invests in producing an offspring is so great that it has to be worth it.
— written by Maria Victoria Gonzaga
1 Chen, J., Zou, Y., Sun, Y.-H., & ten Cate, C. (2019). Problem-solving males become more attractive to female budgerigars. Science, 363(6423), 166–167. https://doi.org/10.1126/science.aau8181
2 Jones, A. G., & Ratterman, N. L. (2009). Mate choice and sexual selection: What have we learned since Darwin? Proceedings of the National Academy of Sciences, 106(Supplement_1), 10001–10008. https://doi.org/10.1073/pnas.0901129106
3 GrrlScientist. (2019, January 11). Problem-Solving Budgies Make More Attractive Mates. Forbes. Retrieved from https://www.forbes.com/sites/grrlscientist/2019/01/10/problem-solving-budgies-make-more-attractive-mates/#515f24d66407
The recent Netflix hit flick, Bird Box, surely startled viewers with thrilling scenarios revolving around the premise that once seen, an entity brings an abrupt, ferocious death. Given that, Malorie (the protagonist, portrayed by Sandra Bullock) blindfolded herself and the two children and embarked down a perilous river to seek a safer refuge. (N.B. If you have not seen it yet, you may want to pause here to dodge the spoilers ahead.) Ultimately, they reached the haven, which was revealed to be an old school for the blind. The surviving community was a population of primarily blind, and as such immune, people. By and large, this film conveyed a message to me: that blindness should not be taken as an utter handicap but as a trait that may tender a likely evolutionary edge.
Blindness is a complete, or nearly complete, lack of vision. Basically, two major forms exist. Partial blindness means very limited vision. In contrast, complete blindness means a total lack of vision, i.e., not seeing anything, not even light.1
Causes of blindness
Some of the common causes of blindness include eye accidents or injuries, diabetes, glaucoma, macular degeneration, blocked blood vessels, retrolental fibroplasia, lazy eye, optic neuritis, stroke, retinitis pigmentosa, optic glioma, and retinoblastoma.1
Congenital blindness refers to a condition wherein a person has been blind since birth. In fact, several instances of infant blindness are due to inherited eye diseases, such as cataracts, glaucoma, and certain eye malformations. In these cases, genetic factors play a role. Retinitis pigmentosa, for example, is a hereditary condition in which the retinal cells slowly disintegrate, ultimately leading to incurable blindness later in life. Albinism can also lead to vision loss that, at times, reaches the category of “legally blind”.
The mapping of the human genome led to the identification of certain genetic causes of blindness. Scientists recently identified hundreds of new genes associated with blindness and other vision disorders. Bret Moore and colleagues found 261 new genes linked to eye diseases.2 Furthermore, they said that these newly identified genes from mouse models likely have analogous counterpart genes in humans. Thus, their findings could shed light on the genes causing blindness in humans.
Humans evolved eyes that enabled sight. About 500 million years ago, our earliest predecessors had eyes that could merely distinguish light from dark. This early eye, called an “eyespot”, could sense ambient brightness (but not shapes), which sufficed to orient single-celled organisms (e.g. Euglena) to circadian rhythms and photoperiodism, and of course to food.3
Soon, the eyespot evolved into a more complex light-detecting structure, such as that found in flatworms. Their eyes could detect the direction of light, enabling them to seek better spots to hide from predators. As light penetrated the deep seas, organisms such as Nautilus evolved the pinhole eye. A small opening allowed only a thin pin of light to enter, which dramatically improved resolution and directional sensing.3
The pinhole eye later evolved a lens that regulated the degree of convergence or divergence of the transmitted rays. Furthermore, the lens helped the organism gauge the spatial distance between itself and the objects in its environment.3
The modern human eye has become more intricate with the addition of other structures. For instance, a transparent layer called the cornea came to cover the opening (pupil) of the eye, and the interior of the eye came to contain a transparent body fluid called the vitreous humor. The iris is the colored part surrounding the pupil. The light-sensitive membrane, the retina, contains the photoreceptor cells, i.e. the rods and the cones. Apparently, the evolution of the human eye coincided with the evolution of the visual cortex of the human brain.3
Blindness – an evolutionary regression or a gain?
Should blindness be considered an evolutionary regression or an evolutionary gain? Blind beetle species that live in lightless caves and in the underground aquifers of Western Australia, and the eyeless Mexican cavefish, are among the animals that once had sight but lost it over millions of years.
Simon Tierney of the University of Adelaide offered an explanation for this seeming evolutionary regression.4 The loss of sight in the cavefish apparently led to the evolution of an increased number of taste buds. Pleiotropy might explain this manifestation: a pleiotropic gene controls multiple (and possibly unrelated) phenotypic traits. In this case, the gene responsible for the eye loss might have also caused the increased number of taste buds. Eyesight may not be imperative in a light-deprived habitat; an improved sense of taste, however, is. Douglas Futuyma of the State University of New York at Stony Brook explained:4
“So the argument is these mutations are actually advantageous to the organism because the trade off for getting rid of the eye is enhancing the fish’s tastebuds. It really looks like these evolutionary regressions are not a violation of Darwin’s idea at all. It’s just a more subtle expression of Darwin’s idea of natural selection.”
In 2017, a research team posited that blind people do have enhanced abilities in their other senses. To test this, they scanned the brains of blind participants using magnetic resonance imaging (MRI). The scans revealed heightened senses of hearing, smell, and touch among blind participants compared with participants who were not blind. Moreover, they found that blind people had enhanced memory and language abilities. Lotfi Merabet of the Laboratory for Visual Neuroplasticity at Schepens Eye Research Institute of Massachusetts Eye and Ear said:5
“Even in the case of being profoundly blind, the brain rewires itself in a manner to use the information at its disposal so that it can interact with the environment in a more effective manner.”
As the popular maxim goes, the eyes are the windows to the soul. In the presence of light, our eyes can perceive all the seemingly playful colors and spatiality that surround us. At times, a simple stare is all it takes to convey what we could have said in words. Yet despite the loss of sight in some of our conspecifics, their brains reconfigured in a way that enabled them to do most of what a sighted person could. Based on what the researchers observed, blind people had enhanced interconnections in their brains that seemed to compensate for their lack of sight. Hence, blindness appears not as an evolutionary regression but, perhaps, a shift onto a different path along the evolutionary line.
— written by Maria Victoria Gonzaga
1 Blindness and vision loss: MedlinePlus Medical Encyclopedia. (2019, January 1). Retrieved from https://medlineplus.gov/ency/article/003040.htm
2 University of California – Davis. (2018, December 21). 300 blind mice uncover genetic causes of eye disease. ScienceDaily. Retrieved from www.sciencedaily.com/releases/2018/12/181221142516.htm
3 TED-Ed. (2015, January 8). YouTube video. Retrieved from https://www.youtube.com/watch?v=qrKZBh8BL_U
4 How does evolution explain animals losing vision? (2015, March 18). Abc.Net.Au. Retrieved from http://abc.net.au/science/articles/2015/03/18/4192819.htm
5 Miller, S. G. (2017, March 22). Why Other Senses May Be Heightened in Blind People. Retrieved from https://www.livescience.com/58373-blindness-heightened-senses.html
If one wants to trace one's lineage, one could turn to the cell's powerhouse, the mitochondrion. This organelle contains its own special set of DNA, long believed to be inherited solely from mothers across generations. Thus, looking at the mitochondrial DNA (by mtDNA genealogical DNA testing) could help track down lineage and, for this reason, help determine ancestral or familial connections. Recently, though, a team of scientists reported that mitochondrial DNA is not solely inherited from mothers. New empirical evidence of biparental inheritance of mitochondrial DNA indicates the need to rectify the long-held notion that the inheritance of the mitochondrial genome is exclusively matrilineal, i.e. passed through the female line.
The mitochondrion (plural: mitochondria), reckoned as the powerhouse of the cell, generates metabolic energy, especially in the form of adenosine triphosphate (ATP), through the process referred to as cellular respiration. Apart from that, the organelle is also described as semi-autonomous since it has its own genetic material distinct from that found in the nucleus. The nucleus contains far more genes, organized into chromosomes and in charge of almost all of the metabolic processes in the body. In contrast, the genetic material in the mitochondrion, referred to as mitochondrial DNA, carries relatively few genes. It holds the genetic code for the RNAs and proteins necessary for the various functions of the mitochondrion, such as energy production.
(Recent news on the evolutionary origin of mitochondria, read: Prokaryotic Ancestor of Mitochondria: on the hunt)
(You may also want to read: Mitochondrial DNA – hallmark of psychological stress)
In humans, mitochondrial DNA has been believed to be inherited solely from the mother. This notion stems from the events that happen at fertilization. The sperm carries on its neck a helix of mitochondria that powers the tail to swim toward the ovum. When the sperm finally makes its way into the ovum, it leaves its neck and tail on the cell surface of the ovum. Any paternal mitochondria brought into the ovum would eventually be inactivated and disintegrated. Thus, the mitochondria in the ovum are the only ones that the zygote eventually inherits. A human ovum contains an average of 200,000 mtDNA molecules.1 For this reason, certain traits and diseases involving mitochondrial DNA implicate a maternal origin.
Inheritance of mitochondrial DNA – not exclusive
The theory of Mitochondrial Eve holds that tracing the matrilineal lineage of all living human beings would lead all lines to converge on one woman, referred to as “Eve”. The theory rests on the exclusivity of human mitochondrial DNA inheritance to the female line. Nevertheless, independent empirical findings and clinical studies challenge this precept.
For instance, Schwartz and Vissing2 reported the case of a 28-year-old man with mitochondrial myopathy. The patient had a mutation (a novel 2-bp mtDNA deletion in the ND2 gene). Normally, the gene encodes a subunit of enzyme complex I of the mitochondrial respiratory chain. The faulty gene thus affected the production of that enzyme, which, in turn, led to the patient's severe, lifelong exercise intolerance. Further, Schwartz and Vissing2 pointed out that the patient's mitochondrial myopathy was paternal in origin.
Recently, a team of researchers observed paternal inheritance of mitochondrial DNA, this time in 17 people from three different families.3 They sequenced the subjects' mitochondrial DNA and discovered father-to-offspring transmission.
Mitochondrial DNA is said to be a mother's legacy to her offspring. However, recent studies indicate that the father could also transmit it to his progeny. Somehow, paternal mitochondrial DNA gets into the ovum and, rather than being disintegrated or inactivated, gets expressed. Paternal transmission of mitochondrial DNA may not be as rare as once thought. If more studies corroborate these findings, they could debunk the Mitochondrial Eve theory. They might also render mtDNA genealogical DNA testing questionable. And we may need to start looking at the other side of our lineage to fathom hereditary diseases arising from faulty mitochondrial DNA.
— written by Maria Victoria Gonzaga
1 Mitochondrial DNA. (2018). Biology-Online Dictionary. Retrieved from https://www.biology-online.org/dictionary/Mitochondrial_DNA
2 Schwartz, M. & Vissing, J. (2002). “Paternal Inheritance of Mitochondrial DNA”. New England Journal of Medicine. 347 (8): 576–580.
3 Luo, S., Valencia, C.A., Zhang, J., Lee, N., Slone, J., Gui, B., Wang, X., Li, Z., Dell, S., Brown, J., Chen, S.M., Chien, Y., Hwu, W., Fan, P., Wong, L., Atwal, P.S., & Huang, T. (2018). Biparental Inheritance of Mitochondrial DNA in Humans. Proceedings of the National Academy of Sciences 201810946. DOI:10.1073/pnas.1810946115
When allergy season looms, some people with serious hypersensitivity to allergens tend to be apprehensive of what may come. Some would rather stay indoors than risk breathing in triggers that could instigate severe allergic reactions. Apart from environmental triggers, other common factors in allergy include food, medication, certain toxins, venom from insect stings or bites, stress, and heredity. How does an allergy manifest? Which cells are involved in mounting an allergic reaction?
The immune system
The immune system protects the body from foreign substances (generally referred to as antigens) that could pose a threat to our well-being. It prevents harmful bacteria, viruses, parasites, etc. from invading and causing harm. The white blood cells (also called leukocytes) constantly scout for antigens in order to destroy or disable them. The white blood cells include lymphocytes, neutrophils, basophils, eosinophils, monocytes, macrophages, mast cells, and dendritic cells.
Allergy – overview
An allergy is a state of hypersensitivity of the immune system in response to an allergen (i.e. a substance capable of inciting an allergic reaction). In this regard, several white blood cells play a role in mounting an allergic reaction.
In summary, the entry of an allergen into the body triggers an antigen-presenting cell, such as a dendritic cell. The dendritic cell takes up the allergen, processes it, and then presents its epitopes via MHC class II molecules on its cell surface. It then migrates to a nearby lymph node, where a T lymphocyte may recognize the presented epitope.
Upon recognition, the T lymphocyte may differentiate into a Th2 cell (type 2 helper T cell), which is capable of activating a B lymphocyte. The B lymphocyte, when activated, matures into a plasma cell that synthesizes and releases IgE antibodies into the bloodstream. Some of the circulating IgE binds to mast cells and basophils. Thus, on re-entry of the same allergen, the IgE on mast cells and basophils recognizes its epitope. In effect, this activates the mast cell or basophil to release inflammatory substances (e.g. histamine, cytokines, proteases, chemotactic factors) into the bloodstream.
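The two-step logic of sensitization followed by re-exposure can be sketched as a toy simulation. This is purely illustrative: the class and method names are invented for the sketch, and the biology is reduced to "a mast cell releases mediators only for allergens it has already been primed against".

```python
# Toy sketch of IgE-mediated sensitization and re-exposure.
# Names (MastCell, sensitize, reexposure) are illustrative, not from any
# immunology library; the model ignores dose, receptor counts, etc.

class MastCell:
    def __init__(self):
        # Allergens the cell is "primed" against, i.e. for which
        # plasma-cell-derived IgE is already bound to its surface.
        self.bound_ige = set()

    def sensitize(self, allergen):
        """First exposure: circulating IgE against this allergen binds the cell."""
        self.bound_ige.add(allergen)

    def reexposure(self, allergen):
        """Re-entry of a recognized allergen triggers degranulation."""
        if allergen in self.bound_ige:
            return ["histamine", "cytokines", "proteases", "chemotactic factors"]
        return []  # not yet sensitized: no inflammatory release

mast = MastCell()
print(mast.reexposure("pollen"))   # first encounter: nothing released
mast.sensitize("pollen")           # sensitization after initial exposure
print(mast.reexposure("pollen"))   # re-exposure: mediators released
```

The point of the sketch is the asymmetry: the first encounter produces no mediators, only priming; symptoms arise on subsequent encounters.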
Anaphylaxis – a dreadful allergic reaction
The allergic reaction mounted by the immune system is supposed to protect the body. However, the allergens the body perceives as a threat are generally harmless. The body overreacts to them, and this overreaction produces the symptoms. Histamine, for instance, brings about the common symptoms of allergy: pain, heat, swelling, erythema, and itchiness.
Anaphylaxis is the most severe form of allergic reaction. It can occur rapidly and affects more than one body system, such as the respiratory, cardiovascular, cutaneous, and gastrointestinal systems. It occurs as a result of the release of inflammatory substances from mast cells and basophils upon exposure to an allergen. Within minutes to an hour, symptoms could manifest as a red rash, swelling, wheezing, lowered blood pressure, and, in severe cases, anaphylactic shock.
In the presence of breathing difficulties, a racing heart, a weak pulse, and/or a change in voice, the situation is precarious and calls for immediate medical attention.
Why does anaphylaxis occur? IgE-mediated anaphylaxis is the most common form. Initial exposure to an allergen leads to the production of IgE, so that re-exposure leads to the allergen's identification and the eventual activation of mast cells and basophils. Apart from immunologic factors, though, other causes of anaphylaxis are non-immunologic. For example, temperature (hot or cold), exercise, and vibration may cause anaphylaxis. In these cases, IgE is not involved; rather, these agents directly cause the mast cells and basophils to degranulate.
Novel mechanism identified
Recently, a team of researchers1,2 found a novel mechanism that could explain the rapid allergic reaction during anaphylaxis. They were the first to uncover a mechanism involving dendritic cells. A set of dendritic cells seem to “fish” allergens from the blood vessels using their dendrites. A dendritic cell near a blood vessel takes up the blood-borne allergen. Rather than processing it and presenting the epitope on its surface, it hands the allergen over, inside a microvesicle, to the adjacent mast cells.
Mast cells, unlike basophils, which circulate in the bloodstream, reside in tissues such as connective tissue. Thus, the recent findings could answer the question of how mast cells detect blood-borne allergens.
Rather than being internalized by the dendritic cells for processing, the allergen is merely packaged into a microvesicle that buds off from the surface of the dendritic cell. This saves time: it cuts the process short, leading to a much more rapid allergic reaction.
However, these findings were observed in mouse models. Therefore, the researchers have yet to verify whether this novel mechanism also holds true in humans. If so, it could lead to possible therapeutic regulation of allergies, especially the most dreadful form, anaphylaxis.
— written by Maria Victoria Gonzaga
1 Choi, H.W., Suwanpradid, J. Il, Kim, H., Staats, H. F., Haniffa, M., MacLeod, A.S., & Abraham, S. N. (2018). Perivascular dendritic cells elicit anaphylaxis by relaying allergens to mast cells via microvesicles. Science 362 (6415): eaao0666. DOI: 10.1126/science.aao0666
2 Duke University Medical Center. (2018, November 8). Using mice, researchers identify how allergic shock occurs so quickly: A newly identified immune cell mines the blood for allergens to directly trigger inflammation. ScienceDaily. Retrieved November 22, 2018 from www.sciencedaily.com/releases/2018/11/181108142440.htm
In essence, our body consists of two major types of cells – one group involved directly in sexual reproduction (called sex cells) and another group that is not (called somatic cells). In particular, the female sex cell is referred to as the ovum (also called the egg cell) whereas the male sex cell is the sperm cell. The somatic cells, in turn, are the cells of the body with varying functions, such as nourishing the sex cells and keeping the body thriving and functional.
Origin of sex cells
Our body produces sex cells through a process called gametogenesis, essentially a step-by-step process of meiosis. Oogenesis (gametogenesis in females) takes place in the ovaries to produce ova, or egg cells: the oogonium (the female primordial germ cell) undergoes meiosis, yielding one mature haploid egg cell (the other meiotic products become polar bodies). Conversely, spermatogenesis (gametogenesis in males) occurs in the testes to yield sperm cells: the spermatogonium (the male primordial germ cell) goes through meiosis to give rise to four haploid sperm cells.
Sex cells vs somatic cells
In humans, a sex cell can be distinguished from a somatic cell by being haploid. That means a sex cell has half the number of chromosomes of a somatic cell. Hence, an egg cell or a sperm cell has 23 chromosomes whereas a somatic cell has 46. Haploidy in sex cells is important for maintaining chromosomal integrity in humans across generations.
At fertilization, the sperm cell and the egg cell unite to form a diploid cell (called a zygote). The zygote then divides mitotically, giving rise to pluripotent stem cells. A pluripotent stem cell is a cell capable of giving rise to various precursors that will eventually acquire a specific identity and physiological function via a process called differentiation. A differentiated cell has matured and acquired a more specific role, for instance as a skin cell, a blood cell, or a liver cell.
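The chromosome arithmetic behind haploidy and fertilization can be made concrete with a short sketch (the function names are invented for illustration; the numbers are the human values cited above):

```python
# Ploidy arithmetic in humans: meiosis halves the chromosome count (2n -> n),
# and fertilization restores it (n + n -> 2n). Function names are illustrative.

SOMATIC_CHROMOSOMES = 46  # diploid (2n) count in human somatic cells

def meiosis(diploid_count):
    """Meiosis produces haploid gametes with half the chromosome number."""
    return diploid_count // 2

def fertilization(egg_n, sperm_n):
    """Fusion of two haploid sex cells restores the diploid number."""
    return egg_n + sperm_n

gamete = meiosis(SOMATIC_CHROMOSOMES)   # 23 chromosomes per egg or sperm
zygote = fertilization(gamete, gamete)  # 46 chromosomes in the zygote
print(gamete, zygote)                   # prints: 23 46
```

This is why haploidy matters: if gametes carried the full 46, every generation would double the chromosome count.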
Somatic cell converted to sex cell
Intrinsically, a human somatic cell that has differentiated could never become a sex cell, just as a sex cell could neither become nor give rise to a somatic cell. However, this may no longer hold true in the years to come.
Japanese researchers have, for the first time, successfully converted a somatic cell into a sex cell precursor.1 In particular, they created oogonia from human blood cells. They turned blood cells into induced pluripotent stem cells (iPS).2 Essentially, the blood cells turned iPS underwent a kind of “molecular amnesia”: they forgot their initial identity. As a result, they could become any type of cell, even a sex cell.
The researchers transformed human blood cells into oogonia (the plural of oogonium) by incubating them for four months in artificial ovaries derived from embryonic mouse cells. They obtained promising results. Admittedly, though, they acknowledged that they are still in the early steps of a rather long research journey. The oogonia, though indeed precursors of egg cells, are at this point still immature and thereby unfit for fertilization. The researchers have yet to induce them to become mature, fully differentiated egg cells. Nevertheless, they remain optimistic, having reached this point and, undeniably, pioneered an important milestone.
If, in the future, research on the conversion of somatic cells into sex cells pushes through to completion, it could offer significant solutions to infertility. However, ethical concerns will likely surface as well. For instance, a mere hair cell or skin cell from an unsuspecting person could, in time, be turned into an egg or a sperm cell, and from there, an offspring could come into existence.
— written by Maria Victoria Gonzaga
1 Yamashiro, C., Sasaki, K., Yabuta, Y., Kojima, Y., Nakamura, T., Okamoto, I., Yokobayashi, S., Murase, Y., Ishikura, Y., Shirane, K., Sasaki, H., Yamamoto, T., & Saitou, M. (2018, Oct 19). Generation of human oogonia from induced pluripotent stem cells in vitro. Science, 362(6412):356-360. DOI: 10.1126/science.aat1674
2 Solly, M. (2018 Sept. 24). Scientists create immature Human Eggs Out of Blood Cells For the First Time. Retrieved from [link]
When sadness creeps in and you feel as if you are all by yourself, think again, because you are never alone. As a matter of fact, millions of microorganisms reside in our body day in and day out. They are the normal flora. Our body is a world of microscopic living entities that inhabit it without essentially causing disease. Rather, they live in us in harmonious mutualism. Thus, our body is not ours alone; we are never absolutely sterile, even from the moment we are born.
Typically, the body has about 10^13 cells and harbors about 10^14 bacteria.1 The multifarious yet specific genera of bacteria that predominate in the body are referred to as the normal flora. In essence, the normal flora thrives in a host in a mutualistic lifestyle. The microbes benefit from living stably in the body; in return, they confer benefits on the human host. For instance, their presence helps prevent other, more harmful microbes from colonizing the host, and some of them biosynthesize products that the human body can use. Nevertheless, an immunocompromised host could suffer when these bacteria become overwhelming in number and thereby cause detectable harm, such as infections or diseases.
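These order-of-magnitude estimates imply a striking ratio, which a two-line calculation makes explicit (treat the figures as rough estimates from the source, not precise counts):

```python
# Rough ratio of bacterial cells to human cells, using the
# order-of-magnitude estimates cited above.
human_cells = 10**13   # ~10^13 human cells
bacteria = 10**14      # ~10^14 resident bacteria

ratio = bacteria // human_cells
print(f"Bacteria outnumber human cells roughly {ratio} to 1")  # roughly 10 to 1
```

In other words, by cell count alone, the resident microbes outnumber our own cells by about an order of magnitude.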
Normal flora in the gut
Microbes that normally thrive in the gut are greater in density and diversity than those in other body parts. Nevertheless, their density varies with location in the gastrointestinal tract. The stomach, because of its acidity, harbors the fewest microbial inhabitants, about 10^3 to 10^6 per gram of contents, whereas the large bowel of the large intestine holds about 10^9 to 10^11 per gram. The ileum of the small intestine contains a moderate microbial number, i.e. 10^6 to 10^8 per gram of contents.1
Some of the various species of the normal gut flora include anaerobes, Enterococcus sp., Escherichia coli, Klebsiella sp., Lactobacillus sp., Candida sp. (a yeast), Streptococcus anginosus, and other Streptococcus sp. Some of these microbes aid in the production of bile acids, vitamin K, and ammonia, since they possess the necessary enzymes.
Certain normal gut bacteria can become pathogenic. They could cause disease when opportunity presents itself, such as when changes in the microbiota favor their growth. Be that as it may, a healthy individual would not usually be harmed by their presence. Thus, a question arises: why do our immune defenses not, by and large, act against the normal flora as aggressively as they would against more harmful pathogens?
Karen Guillemin, a professor of biology and one of the authors of a paper that appeared in a special edition of the journal eLife, was quoted3: “One of the major questions about how we coexist with our microbial inhabitants is why we don’t have a massive inflammatory response to the trillions of the bacteria inhabiting our guts.”
Guillemin and her team of scientists reported that they uncovered a novel anti-inflammatory bacterial protein, which they call the Aeromonas immune modulator (AimA). AimA is produced by a common gut bacterium, Aeromonas sp., in the animal model zebrafish. The researchers found that AimA alleviated intestinal inflammation and protected the zebrafish from septic shock, extending their lifespan.2 Furthermore, they described it as an immune modulator that confers benefits on both the bacteria and the zebrafish host.
The newly discovered protein seems to be the first of its kind. Nevertheless, it is structurally similar to lipocalins, a class of proteins that, in humans, modulate inflammation. Based on the findings, removing the protein caused more intestinal inflammation in the host and the destruction of the normal Aeromonas gut bacteria. Reintroducing AimA restored the “normal” state: the host was relieved of inflammation, and the typical density of Aeromonas was restored. AimA appears to represent a new set of bacterial effector proteins, which Guillemin referred to as mutualism factors.3
Guillemin and her team postulate that many more of these mutualism factors exist, even in humans, and are yet to be found. These mutualism factors may have therapeutic potential for modulating inflammation, especially in medical conditions such as sepsis and certain metabolic syndromes.
— written by Maria Victoria Gonzaga
1 Davis, C. P. (1996). Normal Flora. In: Baron S, editor. Medical Microbiology. 4th edition. Galveston (TX): University of Texas Medical Branch at Galveston. Retrieved from [link]
2 Rolig, A. S., Sweeney, E. G., Kaye, L.E., DeSantis, M. D., Perkins, A., Banse, A. V., Hamilton, M.K., & Guillemin, K. (2018). A bacterial immunomodulatory protein with lipocalin-like domains facilitates host–bacteria mutualism in larval zebrafish. eLife. [link]
3 University of Oregon. (2018, November 6). Novel anti-inflammatory bacterial protein discovered: Newly discovered protein alleviates intestinal inflammation and septic shock in an animal model. ScienceDaily. Retrieved from [link]