Biology Online is a Biology blog and dictionary site that provides up to date articles on the latest developments in biological science. The Biology Online Dictionary is a completely free and open dictionary with over 60,000 biology terms. It uses the wiki concept, so that anyone can make a contribution.

Regeneration in humans – Finding the gene switch

Regeneration in humans is far more limited than in many other animals. When a person loses a limb, for instance, it is gone for the rest of their life. It would be nice if we had a greater capacity to regenerate our indispensable body parts, such as the head, the limbs, and the many other "regeneration-incapables". Then we might not have to worry much about losing any of them, knowing that they would eventually grow back in due time.



Regeneration vs. Healing

Humans have the capacity to regenerate. However, we have a very limited capability, restoring only parts such as our skin, hair, nails, fingertips, and liver. At the tissue level, we certainly have dedicated cells to replace lost and damaged cells. For instance, uninjured bone fully replenishes itself, but over a span of about ten years. Our skin renews naturally, but give it two weeks. The story swerves differently, though, in the case of an injury.


Rather than expending energy to replace an injured part with a new one, our body directs its efforts into healing it. So when our skin is deeply damaged, our body fixes it with a scar. Tissue repair mechanisms such as wound healing are not really a snag. They forestall pathogenic microbes from using an injured body part as an easy gateway into our body. (Besides, we already have ample microbiota naturally thriving inside us.) The main goal is to fix the damage effectively, with relatively little effort.




Natural regeneration in humans

In humans, the only tissue that regenerates naturally, consistently, and completely is the endometrium.1 After it sloughs off during a woman's menstrual period, it grows back by re-epithelialization before the next period. Humans can also regenerate an injured liver, provided that as little as 25% of the original liver mass remains. The liver can grow back to its original size, though not necessarily to its original shape. Damaged tubular parts of the kidney can also re-grow. The surviving epithelial cells undergo migration, dedifferentiation, proliferation, and re-differentiation to set up a new epithelial lining of the tubule.




Animals with higher regeneration capacities


axolotl regeneration
Axolotl (Ambystoma mexicanum) is one of the animals dubbed masters of regeneration. It can grow back its limbs, and even its heart, without a scar. [Photo credit: Mike Licht, Flickr]


Some animals have a higher capacity to re-grow lost body parts. Sharks, skates, and rays can regenerate their kidneys; they can regrow an entire nephron, which humans cannot. A lizard drops its tail as a means of escape; the tail is fully restored over time anyway. Sharks have no qualms about losing teeth, replacing any of them more than a hundred times in their lifetime. An axolotl can replace its broken heart. A starfish will once again be stellar upon the return of a lost arm; in fact, even the lost arm can regenerate into an entire starfish as long as the central nerve ring remains intact.2 A decapitated planarian worm need not worry about losing its head; the head grows back, brain included, along with its memories.2 Without a doubt, many of these animals are simply masters of their craft: regeneration.



Regeneration genes

Researchers from Harvard University published new findings on the whole-body regeneration capacity of the three-banded panther worm.3 They uncovered DNA switches that seem to regulate genes involved in the regeneration process. In particular, they found a section of non-coding DNA that controlled the activation of a master gene, which they called the "early growth response" (EGR) gene. When active, the EGR gene acted like a power switch, turning certain genes in the coding region on and off during regeneration. When deactivated, no regeneration occurred.


Surprisingly, humans have the EGR gene, too. So why doesn't it lead to greater regeneration capacities as it does in the three-banded panther worm? The researchers explained that while the gene works in the worm, it does not work the same way in humans. The wiring may be different. The worm's EGR gene may have relevant connections that are absent in humans.




Switching the gene on

Regeneration in humans
Regeneration in humans is limited. If only we knew the switch that could amplify our regeneration capacity then we might not have to worry much about losing a body part. [Photo credit: Pete Johnson, Pexels]


Induced regeneration in humans is one of the goals of regenerative medicine. This field of medicine seeks new ways to give our regenerative capacity a boost. One of these ways is to look at the problem "molecularly". Researchers are looking into the gene Lin28a. When active, this gene can reprogram somatic cells into embryonic-like stem cells. Accordingly, it has a role in tissue regeneration and recovery. However, the gene is naturally turned off in adults. Research into boosting our regenerative capacities is ongoing. Switching our organs from regeneration-incapable to regeneration-capable may just be a matter of discovering the gene switch that could enhance the regeneration capacity of humans.



— written by Maria Victoria Gonzaga




1 Min, S., Wang, S. W., & Orr, W. (2006). "Graphic general pathology: 2.2 complete regeneration". Pathology. Retrieved from [Link]

2 Langley, L. (2013, August 28). "Pictures: 5 Animals That Regrow Body Parts". National Geographic News. Retrieved from [Link]


Fasting boosts human metabolism, has anti-aging effects

With the advent of 2019, we are inspired to set new goals, pursue life-long dreams, or simply make better choices. Perhaps one of the most common resolutions we wish to realize is to adopt a healthier lifestyle. With this in mind, some of us look for ways to feel dutifully healthier, such as by managing our weight. Many turn to fad diets and caloric restrictions that promise to help. One of them is intermittent fasting. Based on studies, intermittent fasting not only helps trim weight but also seems to offer further health benefits.




Intermittent fasting – overview


intermittent fasting
Scientists found that fasting boosted human metabolism. This could mean that fasting may slow aging in humans. [Image credit: Zeyus Media]



In May 2018, I wrote the article Intermittent Fasting – benefits and caution. There, I briefly tackled intermittent fasting, its benefits, and its potential risks. In essence, intermittent fasting is a cyclic pattern of a period of fasting followed by a period of non-fasting. The most common forms are (1) whole-day fasting and (2) time-restricted eating. Whole-day fasting entails one full day of "no eating", done twice a week (thus referred to as the "5:2 plan"). In time-restricted eating, there is an interval of fasting and non-fasting on a daily basis. It could be half a day of fasting, with the remaining half as the non-fasting period. With intermittent fasting, it's not so much about "what to eat" or "how much". Rather, it's a question of when.




Intermittent fasting became popular because it does not only help curb weight; it also appears to confer other health benefits. It apparently slows aging and boosts the immune defense.[1] However, as I pointed out in that article, caution should still be taken. Intermittent fasting is not for everyone, especially those who are immunocompromised or underweight.[2]




Rejuvenating effects of fasting

Previously, I mentioned that the studies confirming the health benefits of fasting were done on non-human subjects (e.g. rodent models). Without much scientific proof of efficacy in humans, doubt remained. However, on January 29 of this year, a team of scientists from the Okinawa Institute of Science and Technology Graduate University (OIST) and Kyoto University reported rejuvenating effects of fasting on human subjects. They published their findings in Scientific Reports.[3] They analyzed blood samples from four fasting individuals and monitored the levels of metabolites involved in growth and energy metabolism. What they found was quite interesting and promising.




Dr. Takayuki Teruya, one of the researchers on the team, said that their results implicated the rejuvenating effects of fasting. They found that many metabolites increased significantly, about 1.5- to 60-fold, within just 58 hours of fasting. In a previous study, they had identified some of these metabolites (e.g. leucine, isoleucine, and ophthalmic acid) as ones that typically deplete with age. According to Dr. Teruya, the amount of these metabolites increased again in individuals who fasted. Based on the metabolites they found, they also conjectured that fasting could promote muscle maintenance and antioxidant activity, and hence possibly longevity as well. Dr. Teruya added that this had not been shown in humans until now, since most studies reporting such effects used animal models.[4]
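For readers unfamiliar with the metric, a "fold change" is simply the ratio of a metabolite's post-fasting level to its pre-fasting baseline. A minimal sketch, using hypothetical concentrations rather than the paper's actual measurements:

```python
def fold_change(before: float, after: float) -> float:
    """Ratio of the post-fasting level to the pre-fasting baseline.

    A value above 1 means an increase; e.g. 2.5 means the metabolite
    rose to 2.5 times its baseline level.
    """
    if before <= 0:
        raise ValueError("baseline concentration must be positive")
    return after / before

# Hypothetical illustration: a metabolite rising from 2.0 to 5.0
# arbitrary units corresponds to a 2.5-fold increase.
print(fold_change(2.0, 5.0))  # 2.5
```

On this scale, the study's reported range of 1.5- to 60-fold spans metabolites that rose modestly and ones that rose dramatically during the 58-hour fast.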




Fasting increased metabolism

During fasting, the body turns to alternative energy stores when carbohydrates are not available. Thus, less common metabolites from alternative metabolic pathways superseded the typical metabolites of carbohydrate metabolism. The researchers identified butyrates, carnitines, and branched-chain amino acids as some of the metabolites that accumulated during fasting.[4] Apart from this, they also found an increase in citric acid cycle intermediates. This means that aside from prompting alternative metabolic pathways, fasting also augmented the common metabolic activities. The metabolism of purines and pyrimidines also seemed heightened, indicating an increase in gene expression and protein synthesis. Accordingly, the researchers saw a boost in antioxidants (e.g. ergothioneine and carnosine) that protect cells from the free radicals produced by metabolism. They believe they are the first to provide evidence of antioxidants as a fasting marker.[4]




This new-found evidence suggests that fasting has some anti-aging effects, this time in human subjects. The team's next step is to see whether they can duplicate the results in a larger-scale study. For now, let us remain cautious, look for indubitable substantiation, and weigh the benefits and risks of all available options.




— written by Maria Victoria Gonzaga





1  Cohut, M. (2018). Intermittent fasting may have ‘profound health benefits’. Retrieved from [Link]

2  Longo, V. D., & Mattson, M. P. (2014). Fasting: Molecular Mechanisms and Clinical Applications. Cell Metabolism, 19 (2), 181–192. [Link]

3 Teruya, T., Chaleckis, R., Takada, J., Yanagida, M. & Kondoh, H. (2019). Diverse metabolic reactions activated during 58-hr fasting are revealed by non-targeted metabolomic analysis of human blood. Scientific Reports, 9(1). DOI: 10.1038/s41598-018-36674-9.

4 Okinawa Institute of Science and Technology (OIST) Graduate University. (2019, January 31). Fasting ramps up human metabolism, study shows. ScienceDaily. Retrieved from [Link]

Lurking beneath the ice

Scientists found dead tardigrades beneath the Antarctic ice, according to a recently published report.[1] It was a surprising discovery, since tardigrades have earned a reputation as tiny near-immortals. They are so resistant to extreme conditions that they are thought of as practically indestructible. Nonetheless, scientists found remains of tardigrades, together with crustaceans, in a deep, frozen Antarctic lake.[1]



Antarctic Realm – The Cold Realm


Antarctic realm
The Antarctic biogeographic realm – the smallest of all realms.


The Antarctic is the region located at the southernmost tip of the Earth. The biogeographic realm that includes the Antarctic is called the Antarctic realm. A biogeographic realm refers to an area of land where similar organisms thrived and then evolved over long periods in relative isolation.[2] The region rouses extensive research, with the paramount objective of understanding the extent of its biodiversity, especially the distributional patterns of resident organisms and their evolutionary history.



The Antarctic biogeographic realm is the smallest of all realms. It spans a total area of about 0.12 million square miles. Its components include the land area, the Antarctic tectonic plate, the ice in the waters, and the ocean itself.[2] Because of the cold temperature, few plant species are able to persist and thrive. At present, around 250 lichens, 100 mosses, 25-30 liverworts, 700 algal species, and two flowering plant species (i.e. Antarctic hair grass and Antarctic pearlwort) inhabit the region. As for fauna, animal species include penguins, seals, and whales.[2]



An Icy Surprise


Tardigrades. [Credit: Willow Gabriel, Goldstein Lab]

The discovery of the remains of tardigrades was unexpected, according to David Harwood, a micropaleontologist. Late last year, Harwood and his research team drilled a hole into the subglacial Lake Mercer, a frozen lake that had been undisturbed for millennia. Their research project, SALSA (Subglacial Antarctic Lakes Scientific Access), was the first to conduct direct sampling there. They were absolutely surprised to find these water bears, frozen and dead.


Astounded, the animal ecologist Byron Adams conjectured that these tardigrades might have come from the Transantarctic Mountains and then been carried down to Lake Mercer.[1] Further, he said, "What was sort of stunning about the stuff from Lake Mercer is it's not super, super-old. They've not been dead that long."



Chilly Giants

Mollivirus sibericum found in Siberian permafrost.
[Credit: © IGS CNRS/AMU]

In September 2015, Jean-Michel Claverie and others reported two giant viruses (i.e. Pithovirus sibericum and Mollivirus sibericum) that they revived from 30,000-year-old permafrost in Siberia.[3,5] Once revived, the viruses quickly became infectious to their natural hosts, amoebae.[5] Luckily, these chilly giants do not prefer humans as hosts. Nonetheless, the melting of these frozen habitats could pose a danger to public health should pathogens that can infect humans escape the icy trap.



A frozen Pandora’s Box

The frozen regions of the Earth hold many astonishing surprises waiting to be "thawed". In August 2016, a 12-year-old boy from the Yamalo-Nenets region of Siberia died from anthrax. Reports also counted a number of infected locals and thousands of dead grazing reindeer.[6] Prior to the outbreak, a summer heatwave had melted the permafrost of the Yamal Peninsula in the Arctic Circle. The thawing of the frozen soil unleashed anthrax bacteria presumed to have come from the carcass of a reindeer host that died over 75 years ago. The released bacteria apparently reached the nearby soil, water, and food supply, and eventually new hosts.[5] The anthrax bacteria survived because they form spores that protect them during dormancy.




A Hotter Earth

Global warming increases the average temperature of the Earth's surface enough to cause climate change. Accordingly, the global surface temperature increased by 0.74 ± 0.18 °C (1.33 ± 0.32 °F) during the last century. This temperature rise is a threat, as it could lead to environmental changes with adverse effects of massive magnitude. One of these is the destruction of habitats due to the rise in water level as ice melts. Deadly pathogens could also rise again from their cold slumber and plausibly contribute to another major mass extinction. So, while we try to explore the deeper mysteries lurking beneath the ice, we should also make sure that we remain a step ahead. Claverie[5] put it excellently:

The possibility that we could catch a virus from a long-extinct Neanderthal suggests that the idea that a virus could be ‘eradicated’ from the planet is wrong, and gives us a false sense of security. This is why stocks of vaccine should be kept, just in case.
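Incidentally, the Celsius and Fahrenheit figures quoted above for last century's warming are mutually consistent: a temperature difference converts to Fahrenheit by the factor 9/5 alone (the familiar +32 offset applies only to absolute temperatures, not to differences). A quick sanity check:

```python
def delta_c_to_f(delta_c: float) -> float:
    """Convert a temperature DIFFERENCE from degrees Celsius to
    degrees Fahrenheit. Note: no +32 offset, since that only applies
    to absolute temperatures, not to differences."""
    return delta_c * 9 / 5

print(round(delta_c_to_f(0.74), 2))  # 1.33
print(round(delta_c_to_f(0.18), 2))  # 0.32
```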



— written by Maria Victoria Gonzaga





1  Berman, R. (2019, January 18). Dead – yes, dead – tardigrade found beneath Antarctica. Retrieved from [link]

2  Pariona, A. (2018, May 18). What Are The Eight Biogeographic Realms? Retrieved from [link]

3 CNRS. (2015, September 9). New giant virus discovered in Siberia’s permafrost. ScienceDaily. Retrieved from [link]

4 Wikipedia Contributors. (2018, November 10). Antarctic realm. Retrieved from [link]

5  Fox-Skelly, J. (2017, January 1). There are diseases hidden in ice, and they are waking up. Retrieved from [link]

6  Russia anthrax outbreak affects dozens in north Siberia. (2016, August 2). BBC News. Retrieved from [link]

7  Biology-Online Editors. (2014, May 12). Biology Online. Retrieved from [link]

Antibiotics removal by walnut shell based activated carbon

Antibiotics are among the most common compounds found in groundwater, surface water, drinking water, and wastewater. Traces of these antibiotics are also found in sewage sludge, soil, and sediments, which raises environmental concern. Moreover, the emergence of antimicrobial resistance has become a major health problem worldwide, and the therapeutic use of antimicrobials in human and veterinary medicine contributes to the spread of resistant microorganisms. Walnut shells, on the other hand, are among the waste materials suggested as efficient sorbent alternatives, owing to their low ash content and their established use as low-cost sorbents for metal and oil removal.

Walnut shell activated carbon in removal of antibiotic

Advanced wastewater treatments have shown positive results in lowering the presence of antibiotic residues. These include ozonation, membrane separation, advanced oxidation, reverse osmosis, and nanofiltration. The applicability of activated carbon to pollutant removal always depends on the condition of the raw material. In this particular research, walnut shell was used as the precursor material for activated carbon production. Moreover, the ability of activated carbon to remove organic micro-pollutants lies in the properties of the solution and the contaminants. The adsorption of the antibiotic metronidazole was examined under the conditions that maximize the expected results.


The influence of temperature on the adsorption capacity for the antibiotic is only slightly significant. Rather, the adsorption capacity depends on the nature of the activated carbon, its chemical characteristics and morphology, and the solutes. The nature of the solutes also matters: by affecting electronic density, it influences their interactions with the matrix of the adsorbent. Activated carbon treatment is the most common process used to remove dissolved organic and inorganic compounds. Its great flexibility in applications arises from the physical and chemical properties of specifically treated carbon materials.


As a result, the amount of organic compounds adsorbed depends strongly on the essential properties of the adsorbent. It can also be slightly affected by variables such as temperature, pH, ionic strength, and contact time. Overall, activated carbon from walnut shells may represent a good agent for removing antibiotic residues.
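The adsorption capacity discussed above is conventionally computed from a batch experiment via the standard mass-balance formula q = (C0 - Ce) * V / m, where C0 and Ce are the initial and equilibrium concentrations of the antibiotic, V is the solution volume, and m is the mass of activated carbon. The sketch below uses hypothetical numbers, not values taken from the paper:

```python
def adsorption_capacity(c0_mg_l: float, ce_mg_l: float,
                        volume_l: float, mass_g: float) -> float:
    """Batch adsorption capacity q in mg of adsorbate per g of adsorbent:
        q = (C0 - Ce) * V / m
    where C0 and Ce are initial and equilibrium concentrations (mg/L),
    V is the solution volume (L), and m is the adsorbent mass (g)."""
    if mass_g <= 0:
        raise ValueError("adsorbent mass must be positive")
    return (c0_mg_l - ce_mg_l) * volume_l / mass_g

# Hypothetical batch run: 50 mg/L of metronidazole reduced to 5 mg/L
# in 0.1 L of solution using 0.2 g of walnut-shell activated carbon.
print(adsorption_capacity(50.0, 5.0, 0.1, 0.2))  # 22.5 mg/g
```

This mass-balance relation is generic to batch sorption studies; the actual capacities reported for walnut-shell carbon depend on the experimental conditions described in the cited paper.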


Sources: Prepared by Joan Tura from ScienceDirect: Science of the Total Environment

Volume 646, 1 January 2019, Pages 168-176

On Mate Selection Evolution: Are intelligent males more attractive?

A study published in Science on January 11 seems to be the first to lay out empirical evidence concurring with Charles Darwin's hypothesis … that mate selection might have contributed to the evolution of intelligence or cognitive abilities. Scientists from China and the Netherlands collaborated in a study of budgerigars, Melopsittacus undulatus. Based on what they observed, problem-solving skills apparently increased the attractiveness of male birds. Accordingly, female birds chose to spend more time with males that appeared to be smarter.[1]



Darwin on mate selection

In the animal kingdom, mate selection is serious business. One of the general traits distinguishing animals from plants is the former's tendency to select a mate. Animals, including humans, have their own sets of preferences when it comes to choosing a mate. While plants chiefly let nature do the "selection" for them, animals tend to seek out a potential mate by themselves. And when they find a suitable mate of their choice, they often make a conscientious effort to succeed at coupling. In particular, males engage first in a courtship ritual, for example by wooing a female with a song, a dance, or a display of beauty or prowess.


Sexual selection evolved as one of the means of natural selection. A male, for instance, chooses a female to mate with and, if need be, may tenaciously compete against other males to stack the odds in his favor. Charles Darwin's long-standing theories on sexual selection are still relevant to this day. Darwin believed that sexual selection played a key role in how humans evolved and diverged into distinct populations.[2] In view of that, sexual selection could also have contributed to how intelligence evolved.




Intelligent males, more attractive


budgerigars on mate selection
Female budgerigars (also called parakeets or budgies) apparently prefer smarter mates, according to a study. This finding seems to support Charles Darwin's theory on mate selection.


Many studies on birds revolved around the notion that female birds favor male birds with vibrant feathers or stylish songs. A recent study claims that intelligence is preferred over such fancy features and skills.


In the first experiment conducted by Chen and colleagues[3], small budgerigars (Australian parrots) were observed inside their cages to test the hypothesis that intelligence might affect mate selection. To do this, the researchers allowed each female budgerigar to choose between a pair of similar-looking males to interact with. The chosen males were called "preferred" whereas those that were not were referred to as "less-preferred". Next, they trained the less-preferred males to open closed lids or boxes. They then allowed the female budgerigar to observe the less-preferred male demonstrate the skill. Consequently, almost all of the females changed their preference, choosing the less-preferred males over the initially preferred ones.


To test whether this preference was social rather than sexual, the researchers conducted a second experiment with a similar design, but this time a female budgerigar was exposed to two females instead of males. The results showed that none of the female budgerigars changed their preferences.[1, 3] Based on these experiments, the researchers concluded that the demonstration of cognitive skills altered mate preference but not necessarily social preference.


Video of the animal model, male budgerigar that learned a problem-solving skill that seemingly increased its attractiveness to females. [Credit: Hedwig Pöllöläinen].




Why did mate selection evolve? The answer could be associated with species survival or longevity. Individuals must be able to stay in the mate selection pool, if not at the top of it. In general, males deemed superior or "preferred" gain higher chances at mating, and thereby better opportunities to transmit their genes as they dominate access to fertile females. Females, on the other hand, gain an upper hand from mate selection by being able to choose the seemingly finest of the lot. Females must choose, because they have a generally limited reproductive opportunity. Moreover, the energy a female invests in producing an offspring is so great that it has to be worth it.



— written by Maria Victoria Gonzaga





1 Chen, J., Zou, Y., Sun, Y.-H., & ten Cate, C. (2019). Problem-solving males become more attractive to female budgerigars. Science, 363(6423), 166–167.


2 Jones, A. G., & Ratterman, N. L. (2009). Mate choice and sexual selection: What have we learned since Darwin? Proceedings of the National Academy of Sciences, 106(Supplement 1), 10001–10008.


3  GrrlScientist. (2019, January 11). Problem-Solving Budgies Make More Attractive Mates. Forbes. Retrieved from


Blindness – Evolutionary regression? Maybe not!

The recent Netflix hit Bird Box surely startled viewers with thrilling scenarios revolving around the premise that once "it" is seen, an abrupt, ferocious death follows. Given that, Malorie (the protagonist, portrayed by Sandra Bullock) blindfolded herself and the two children and embarked down a perilous river to seek a safer refuge. (N.B. If you have not seen it yet, you may want to pause here to dodge the spoilers ahead.) Ultimately, they reached the haven, which was revealed to be an old school for the blind. The surviving community was a population of primarily blind, and as such immune, people. By and large, this film conveyed to me the message that blindness should not be taken as an utter handicap but as a trait that may tender a likely evolutionary edge.




Blindness defined

Blindness is a complete, or nearly complete, lack of vision. Basically, two major forms exist. Partial blindness means a very limited vision. In contrast, complete blindness means a total lack of vision, i.e. not seeing anything, not even light.1




Causes of blindness

Some of the common causes of blindness include eye accidents or injuries, diabetes, glaucoma, macular degeneration, blocked blood vessels, retrolental fibroplasia, lazy eye, optic neuritis, stroke, retinitis pigmentosa, optic glioma, and retinoblastoma.1


Congenital blindness refers to a condition wherein a person has been blind since birth. In fact, several instances of infant blindness are due to inherited eye diseases, such as cataracts, glaucoma, and certain eye malformations. In these cases, genetic factors play a role. Retinitis pigmentosa, for example, is a hereditary condition in which the retinal cells slowly disintegrate, ultimately leading to incurable blindness later in life. Albinism also leads to vision loss, which at times reaches the category of "legally blind".


The mapping of the human genome led to the identification of certain genetic causes of blindness. Scientists recently identified hundreds of new genes associated with blindness and other vision disorders. Bret Moore and colleagues found 261 new genes linked to eye diseases.2 Furthermore, they said that these newly identified genes from mouse models likely have analogous counterpart genes in humans. Thus, their findings could shed light on the genes causing blindness in humans.



Eye evolution

Humans evolved eyes that enabled sight or vision. About 500 million years ago, the earliest predecessors had eyes that could merely tell light from dark. This early eye, called an "eyespot", could sense ambient brightness (but not shapes), which sufficiently helped orient single-celled organisms (e.g. Euglena) to circadian rhythm and photoperiodism, and of course to food.3


Soon, the eyespot evolved into a more complex light-detecting structure, such as that found in flatworms. Their eyes could detect the direction of light and enabled them to seek better spots to hide from predators. As light was able to penetrate the deep seas, organisms such as Nautilus evolved the pinhole eye, in which a small opening allows only a thin pin of light to enter. This dramatically improved resolution and directional sensing.3


The pinhole eye then evolved a lens that regulated the degree of convergence or divergence of the transmitted rays. Furthermore, the lens helped distinguish the spatial distance between the organism and the objects in its environment.3


The modern human eye has become more intricate through the presence of other eye structures. For instance, a transparent layer called the cornea covers the opening (pupil) of the eye, and the inside of the eye contains a transparent body fluid called the vitreous humor. The iris is the colored part around the pupil. The light-sensitive membrane, the retina, contains the photoreceptor cells, i.e. the rods and the cones. Apparently, the evolution of the human eye concurred with the evolution of the visual cortex of the human brain.3

eye structure
Illustration of the eye with labeled parts.
[Image credit: National Eye Institute, National Institutes of Health.]




Blindness – an evolutionary regression or a gain?

blindness eyes
Is blindness an evolutionary regression?
[Photo by omar alnahi from Pexels]


Should blindness be considered an evolutionary regression or an evolutionary gain? Blind beetle species that live in lightless caves in the underground aquifers of Western Australia, and the eyeless Mexican cavefish, are some of the animals that once had sight but lost it over millions of years.


Simon Tierney from the University of Adelaide offered an explanation for this seemingly evolutionary regression.4 Accordingly, the loss of sight in the cavefish species apparently led to the evolution of an increased number of taste buds. Pleiotropy might explain this manifestation: a pleiotropic gene controls multiple (and possibly unrelated) phenotypic traits. In this case, the gene responsible for the eye loss might also have caused the increase in taste buds. Eyesight may not be imperative in a light-deprived habitat; an amplified number of taste buds for an improved sense of taste, however, is. Douglas Futuyma of the State University of New York at Stony Brook explained:4


“So the argument is these mutations are actually advantageous to the organism because the trade off for getting rid of the eye is enhancing the fish’s tastebuds. It really looks like these evolutionary regressions are not a violation of Darwin’s idea at all. It’s just a more subtle expression of Darwin’s idea of natural selection.”




Heightened senses

In 2017, a research team posited that blind people do have enhanced abilities in their other senses. To prove this, they scanned the brains of blind participants using magnetic resonance imaging (MRI). The scans revealed heightened senses of hearing, smell, and touch in blind participants as opposed to sighted participants. Moreover, they found that blind people had enhanced memory and language abilities. Lotfi Merabet of the Laboratory for Visual Neuroplasticity at Schepens Eye Research Institute of Massachusetts Eye and Ear said:5


“Even in the case of being profoundly blind, the brain rewires itself in a manner to use the information at its disposal so that it can interact with the environment in a more effective manner.”





As the popular maxim goes, the eyes are the windows to the soul. In the presence of light, our eyes can perceive all the seemingly playful colors and spatiality that surround us. At times, a simple stare is all it takes to convey what we could have said in words. Despite the loss of sight in some of our conspecifics, their brains have configured an avant-garde stratagem that enables them to do most of what a sighted person can. Based on what the researchers observed, they had enhanced interconnections in the brain that seemed to compensate for the lack of sight. Hence, blindness appears not as an evolutionary regression but probably as a shift of path forward along the evolutionary line.



— written by Maria Victoria Gonzaga





1 Blindness and vision loss: MedlinePlus Medical Encyclopedia. (2019, January 1). Retrieved from


2 University of California – Davis. (2018, December 21). 300 blind mice uncover genetic causes of eye disease. ScienceDaily. Retrieved from


3 TED-Ed. (2015, January 8). YouTube. Retrieved from


4 How does evolution explain animals losing vision? (2015, March 18). Abc.Net.Au. Retrieved from


5 Miller, S. G. (2017, March 22). Why Other Senses May Be Heightened in Blind People. Retrieved from


Aerobic exercise modifies fine particle exposures to young adults

Aerobic exercise contributes to the prevention and treatment of various chronic diseases and helps improve endothelial function. It is also beneficial to the adaptation of the cardiopulmonary system and to infection resistance. Moreover, aerobic exercise affects the release of vasoconstrictor substances and increases nitric oxide availability. However, exposure to fine particles in ambient air is linked to adverse health effects, including oxidative stress, pulmonary and systemic inflammation, increased blood coagulation, and vascular imbalance. Aerobic exercise in polluted environments increases the inhalation of air pollutants because of the increased respiratory rate and the reduction of nasal resistance. Also, long-term exercise in polluted air may aggravate pollutant-associated respiratory impairment.


Air pollutant exposure during aerobic exercise

The study involved 20 healthy, non-smoking male subjects whose aerobic exercise frequencies were recorded. The indices measured included fractional exhaled nitric oxide, blood pressure, cytokines in exhaled breath condensate, and pulse-wave analysis. Biomarkers of eosinophilic airway inflammation were positively associated with air pollution exposure. Also, fractional exhaled nitric oxide concentrations were greater at high exercise frequencies. This suggests that those who exercise at high intensity might be at higher risk of particle-mediated respiratory symptoms.


Aerobic exercise in polluted air is thus associated with pollutant exposure that causes respiratory inflammation and arterial stiffness. In terms of cardiovascular responses, the increase in aortic augmentation pressure indicates a higher pulse-wave velocity. Furthermore, aerobic exercise at moderate frequency had a greater protective effect against cardiopulmonary health risks than low or excessive exercise.


Therefore, long-term habitual aerobic exercise in severely polluted areas may strengthen the resistance of the cardiovascular system but increase the risk of pollutant-related airway inflammation. In addition, surrogate biomarkers of atherosclerosis, such as arterial wall thickness, decreased following long-term aerobic exercise. Low cardiopulmonary fitness remains a key indicator of cardiovascular mortality and coronary heart disease.


Source: Prepared by Joan Tura from BMC Environmental Health

Volume 17:88 December 13, 2018

Mitochondrial DNA not just from moms but also from dads?

If one wants to trace lineage, that person could turn to the cell's powerhouse, the mitochondrion. This organelle contains its own special set of DNA, long believed to be inherited solely from mothers across generations. Thus, looking at the mitochondrial DNA (by mtDNA genealogical DNA testing) could help track down lineage and, for this reason, help determine ancestral or familial connections. Recently, though, a team of scientists reported that mitochondrial DNA is not solely inherited from mothers. New empirical evidence of biparental inheritance of mitochondrial DNA implies the need to rectify the long-held notion that the inheritance of the mitochondrial genome is exclusively matrilineal, i.e., through the female line.



Mitochondrial DNA

The mitochondrion (plural: mitochondria), reckoned as the powerhouse of the cell, generates metabolic energy, especially in the form of adenosine triphosphate (ATP). It does so through the process referred to as cellular respiration. Apart from that, the organelle is also described as semi-autonomous since it has its own genetic material distinct from that found in the nucleus. The nucleus contains more genes, organized into chromosomes, and is in charge of almost all of the metabolic processes in the body. In contrast, the genetic material in the mitochondrion – referred to as mitochondrial DNA – carries relatively few genes. It holds the genetic code for the RNAs and proteins necessary for the various functions of the mitochondrion, such as energy production.


(For recent news on the evolutionary origin of mitochondria, read: Prokaryotic Ancestor of Mitochondria: on the hunt)


(You may also want to read: Mitochondrial DNA – hallmark of psychological stress)


Mitochondrial inheritance

In humans, mitochondrial DNA is believed to be inherited solely from the mother. This notion stems from the events that happen at fertilization. The sperm carries on its neck a helix of mitochondria that powers the tail to swim toward the ovum. When the sperm finally makes its way into the ovum, it leaves its neck and tail on the cell surface of the ovum. Any mitochondria that are brought into the ovum are eventually inactivated and disintegrated. Thus, the mitochondria in the ovum are the only ones that the zygote eventually inherits. A human ovum contains an average of 200,000 mtDNA molecules.1 For this reason, certain traits and diseases involving mitochondrial DNA implicate a maternal origin.




Inheritance of mitochondrial DNA – not exclusive

The theory of Mitochondrial Eve is based on the exclusivity of human mitochondrial DNA inheritance to the female line, which when traced would lead to only one most recent woman, “Eve”. (Image credit: Ludela, Creative Commons Attribution-Share Alike 3.0 Unported)


The theory of Mitochondrial Eve holds that tracing the matrilineal lineage of all living human beings would lead to all lines converging on one woman, referred to as "Eve". The theory is based on the exclusivity of human mitochondrial DNA inheritance to the female line. Nevertheless, independent empirical findings and clinical studies challenge this precept.


For instance, Schwartz and Vissing2 reported the case of a 28-year-old man with mitochondrial myopathy. The patient had a mutation (a novel 2-bp mtDNA deletion in the ND2 gene). Normally, the gene encodes a subunit of complex I of the mitochondrial respiratory chain. The faulty gene thus affected the production of that enzyme, which, in turn, led to the patient's severe, lifelong exercise intolerance. Further, Schwartz and Vissing2 pointed out that the patient's mitochondrial myopathy was paternal in origin.


Recently, a team of researchers observed paternal inheritance of mitochondrial DNA, this time in 17 people from three different families.3 They sequenced the subjects' mitochondrial DNA and discovered father-to-offspring transmission.




Mitochondrial DNA is said to be a mother's legacy to her offspring. However, recent studies indicate that the father can also transmit it to his progeny. Somehow, paternal mitochondrial DNA gets into the ovum and, rather than being disintegrated or inactivated, gets expressed. Paternal mitochondrial DNA may not be as rare as once thought. If more studies corroborate these findings, they could debunk the Mitochondrial Eve theory. They might also render mtDNA genealogical DNA testing questionable. And we may also need to start looking at the other side of our lineage to fathom hereditary diseases arising from faulty mitochondrial DNA.



— written by Maria Victoria Gonzaga





1 Mitochondrial DNA. (2018). Biology-Online Dictionary. Retrieved from


2 Schwartz, M. & Vissing, J. (2002). “Paternal Inheritance of Mitochondrial DNA”. New England Journal of Medicine. 347 (8): 576–580.


3 Luo, S.,  Valencia, C.A.,  Zhang, J., Lee, N., Slone, J., Gui, B., Wang, X.,  Li, Z.,  Dell, S., Brown, J., Chen, S.M.,  Chien, Y., Hwu, W., Fan, P.,  Wong, L.,  Atwal, P.S., & Huang, T. (2018). Biparental Inheritance of Mitochondrial DNA in Humans. Proceedings of the National Academy of Sciences 201810946. DOI:10.1073/pnas.1810946115

Pathobiology of allergy and its most severe form, anaphylaxis

When allergy season looms, some people with serious hypersensitivity to allergens tend to be apprehensive of what may come. Some would rather stay indoors than risk inhaling triggers that could instigate severe allergic reactions. Apart from environmental triggers, other common factors in allergy include food, medication, certain toxins, venom from insect stings or bites, stress, and heredity. How does an allergy manifest? Which cells are involved in mounting an allergic reaction?




The immune system

How does an allergy occur? The pathobiological mechanism involves several white blood cells that play a role in mounting an allergic reaction.


The immune system protects the body from foreign substances (generally referred to as antigens) that could pose a threat to our well-being.  It prevents harmful bacteria, viruses, parasites, etc. from invading and causing harm. The white blood cells (also called leukocytes) constantly scout for antigens in order to destroy or disable them. The white blood cells include lymphocytes, neutrophils, basophils, eosinophils, monocytes, macrophages, mast cells, and dendritic cells.




Allergy – overview


The allergy pathway.
Image (by Sari Sabban) distributed under the CC 3.0 Unported license.


An allergy is a state of hypersensitivity of the immune system in response to an allergen (i.e. a substance capable of inciting an allergic reaction). In this regard, several white blood cells play a role in mounting an allergic reaction.

In summary, the entry of an allergen into the body triggers an antigen-presenting cell, such as a dendritic cell. The dendritic cell takes up the allergen, processes it, and then presents its epitopes through the MHC II receptors on its cell surface. It then migrates to a nearby lymph node, where a T lymphocyte can recognize it.

Upon recognition, the T lymphocyte may differentiate into a Th2 cell (type 2 helper T cell), which is capable of activating a B lymphocyte. The B lymphocyte, when activated, matures into a plasma cell that synthesizes and releases IgE antibodies into the bloodstream. Some of the circulating IgE may bind to mast cells and basophils. Thus, re-entry of the same allergen could incite the IgE on mast cells and basophils to recognize its epitope. In effect, this activates the mast cell or basophil to release inflammatory substances (e.g., histamine, cytokines, proteases, chemotactic factors) into the bloodstream.




Anaphylaxis – a dreadful allergic reaction

The allergic reaction mounted by the immune system is supposed to protect the body. However, the allergens perceived by the body as a threat are generally harmless. The body tends to overreact to the allergens, and this leads to symptoms. Histamine, for instance, brings about the common symptoms of allergy: pain, heat, swelling, erythema, and itchiness.


Anaphylaxis is the most severe form of allergic reaction. It can occur rapidly, and it affects more than one body system, such as the respiratory, cardiovascular, cutaneous, and gastrointestinal systems. It occurs as a result of the release of inflammatory substances from mast cells and basophils upon exposure to an allergen. Within minutes to an hour, symptoms could manifest as a red rash, swelling, wheezing, lowered blood pressure, and, in severe cases, anaphylactic shock.


In the presence of breathing difficulties, a racing heart, a weak pulse, and/or a change in voice, the situation is precarious. It calls for immediate medical attention.


Why does anaphylaxis occur? IgE-mediated anaphylaxis is the most common form. Initial exposure to an allergen leads to the release of IgE, so that re-exposure to the allergen leads to its identification and the eventual activation of mast cells and basophils. Apart from immunologic factors, though, other causes of anaphylaxis are non-immunologic. For example, temperature (hot or cold), exercise, and vibration may cause anaphylaxis. In these cases, IgE is not involved; rather, these agents directly cause the mast cells and basophils to degranulate.




Novel mechanism identified

Recently, a team of researchers1,2 found a novel mechanism that could explain the rapid allergic reaction during anaphylaxis. They were the first to uncover a mechanism involving dendritic cells. Accordingly, a set of dendritic cells seems to "fish" allergens from the blood vessel using their dendrites. A dendritic cell near the blood vessel takes up the blood-borne allergen. Rather than first processing it and then presenting the epitope on its surface, it hands over the allergen inside a microvesicle to the adjacent mast cells.


Mast cells, unlike basophils, which circulate in the bloodstream, are located in tissues such as connective tissue. Thus, the recent findings could answer the question of how mast cells detect blood-borne allergens.


Rather than being internalized by the dendritic cells for processing, the allergen was merely taken into a microvesicle that budded off from the surface of the dendritic cells. This saves time: it cuts the process short, leading to a much more rapid allergic reaction.


However, these findings were observed in mouse models. Therefore, the researchers have yet to determine whether this novel mechanism also holds true in humans. If so, it could lead to possible therapeutic regulation of allergies, especially their most dreadful form, anaphylaxis.



— written by Maria Victoria Gonzaga




1 Choi, H.W., Suwanpradid, J., Kim, I.H., Staats, H. F., Haniffa, M., MacLeod, A.S., & Abraham, S. N. (2018). Perivascular dendritic cells elicit anaphylaxis by relaying allergens to mast cells via microvesicles. Science 362 (6415): eaao0666. DOI: 10.1126/science.aao0666
2 Duke University Medical Center. (2018, November 8). Using mice, researchers identify how allergic shock occurs so quickly: A newly identified immune cell mines the blood for allergens to directly trigger inflammation. ScienceDaily. Retrieved November 22, 2018 from


First time! Human blood cell turned into a young sex cell

In essence, our body consists of two major types of cells: one group directly involved in sexual reproduction (called sex cells) and another group that is not (called somatic cells). In particular, the female sex cell is referred to as the ovum (also called the egg cell), whereas the male sex cell is the sperm cell. The somatic cells, in turn, are the cells in the body that have varying functions, such as nourishing the sex cells and keeping the body thriving and functional.




Origin of sex cells

Our body produces sex cells through the process called gametogenesis, which is essentially a step-by-step process of meiosis. Oogenesis (gametogenesis in females) takes place in the ovaries to produce ova, or egg cells. In brief, the oogonium (the female primordial germ cell) undergoes meiosis, which yields a single haploid egg cell along with polar bodies. Conversely, spermatogenesis (gametogenesis in males) occurs in the testes to yield sperm cells: the spermatogonium (the male primordial germ cell) goes through meiosis to give rise to four haploid sperm cells.




Sex cells vs somatic cells

In humans, a sex cell may be distinguished from a somatic cell by being haploid. That means a sex cell has half the number of chromosomes of a somatic cell. Hence, an egg cell or a sperm cell has 23 chromosomes, whereas a somatic cell has 46. Haploidy in sex cells is important in order to maintain chromosomal integrity in humans across generations.


At fertilization, the sperm cell and the egg cell unite to form a diploid cell (called a zygote). The zygote then divides mitotically, giving rise to pluripotent stem cells. A pluripotent stem cell is a cell capable of giving rise to various precursors that eventually acquire a specific identity and physiological function via a process called differentiation. A differentiated cell is one that has matured and acquired a more specific role, for instance as a skin cell, a blood cell, or a liver cell.




Somatic cell converted to sex cell

Soon, a somatic cell could be converted into human sex cells.
[Image credit: Karl-Ludwig Poggemann, CC BY 2.0]


Intrinsically, a human somatic cell that has differentiated could never become a sex cell, just as a sex cell could neither become nor give rise to a somatic cell. However, this may no longer hold true in the years to come.


Japanese researchers have, for the first time, successfully converted a somatic cell into a sex cell precursor.1 In particular, they successfully created oogonia from human blood cells. They turned the blood cells into "induced pluripotent stem cells" (iPS).2 Essentially, the blood cells turned iPS appeared to have undergone "molecular amnesia", meaning they forgot their initial identity. As a result, they could become any type of cell, even a sex cell.


The researchers transformed the human blood cells into oogonia (plural of oogonium). They did so by incubating them for four months in artificial ovaries derived from embryonic mouse cells. They obtained promising results. Admittedly, though, they acknowledged that they are still in the early steps of a rather long research journey. The oogonia, though indeed precursors of egg cells, are at this point still immature and thereby unfit for fertilization. The researchers have yet to induce them to become mature, fully differentiated egg cells. Nevertheless, they remain optimistic about having reached this point and, undeniably, having pioneered an important milestone.





Ethical issues

If, in the future, research on the conversion of a somatic cell into a sex cell pushes through to completion, it could lead to significant solutions to infertility issues. However, ethical concerns will likely surface as well. For instance, in time, a mere hair cell or a skin cell from an unsuspecting person could be turned into an egg or a sperm cell, and from there an offspring could come into existence.




— written by Maria Victoria Gonzaga





1 Yamashiro, C., Sasaki, K., Yabuta, Y., Kojima, Y., Nakamura, T., Okamoto, I., Yokobayashi, S., Murase, Y., Ishikura, Y., Shirane, K., Sasaki, H., Yamamoto, T., & Saitou, M. (2018 Oct 19). Generation of human oogonia from induced pluripotent stem cells in vitro. Science, 362(6412):356-360. DOI: 10.1126/science.aat1674


2 Solly, M. (2018 Sept. 24). Scientists Create Immature Human Eggs Out of Blood Cells for the First Time. Retrieved from [link]