Many people share a common ancestor. Is it possible that the majority of humans actually carry a dominant genetic disorder?

Because most people have the condition, could humans simply have failed to notice that their bodies are ill? And could the celestial gods of ancient mythology have been people who lacked a certain genetic disease, whose resulting abilities seemed incomprehensible to everyone else and led them to be regarded as gods?

The Universal Deficiency: Vitamin C Synthesis in Humans

If everyone has a disease, then it’s as if no one has a disease. Its symptoms would be considered normal human characteristics.

There are real examples of this. All known humans are unable to synthesize Vitamin C and must obtain it from fresh vegetables and fruit. Insufficient intake leads to scurvy and can even be life-threatening. The majority of vertebrates in nature, however, can synthesize Vitamin C themselves.

[Figure: Vitamin C molecular structure]

Why can’t humans synthesize Vitamin C? Because all humans carry an inactivating mutation in the gene for L-gulonolactone oxidase (GLO), an enzyme essential for the final step of Vitamin C synthesis; the mutation abolishes the ability to make the vitamin. From a genetic perspective, this is a clear case of a monogenic hereditary condition.

Of course, this genetic defect is not fatal for most people, since fresh fruits and vegetables are easily available in most parts of the world. But for a minority with special occupations or extreme living environments, a lack of Vitamin C can indeed be a problem.

During the Age of Exploration, many sailors died of scurvy because they could not eat fresh fruits and vegetables. Since fruits and vegetables could not be stored for long, ships had to dock periodically to purchase or collect fresh produce.

It is said that during Zheng He’s voyages, the problem of Vitamin C replenishment was solved by growing bean sprouts on the ship. Bean sprouts, being in a constant state of growth, have a much longer preservation time than other fruits and vegetables.

Additionally, the Inuit living in the vast snowfields within the Arctic Circle are almost exclusively carnivorous. They supplement Vitamin C by eating raw livers of seals and auks.

Although seals and auks do not eat fresh vegetables and fruits, they do not have the GLO mutation and can synthesize Vitamin C themselves. This is why they have a certain amount of Vitamin C in their bodies.

Why the liver? Because in these animals, the liver is the site of Vitamin C synthesis, and it is rich in Vitamin C content.

Why raw? Because cooking the liver destroys the Vitamin C in it.

Interestingly, humans are not the only animals that cannot synthesize Vitamin C. Most other primates, some bats, some passerine birds, many teleost fish, and the commonly kept pet guinea pig (cavy) are likewise unable to synthesize it.

Vitamin B1’s situation is somewhat similar to that of Vitamin C. For more information, refer to:

What are some fascinating and interesting enzymes?

Uric Acid Metabolism and Its Evolutionary Implications in Humans

Yes. For example, humans cannot break uric acid down any further, which can lead to hyperuricemia, gout, and kidney stones.

Condensed Version

This is due to a mutation in the uricase gene during human evolution that completely inactivated the enzyme. Uric acid can no longer be oxidized into allantoin, so it accumulates in the body, causing hyperuricemia, gout, and kidney stones.

Detailed Version

Uric acid is a weak organic acid with a pKa of 5.75. In humans it is the end product of purine nucleotide metabolism, but many organisms degrade it further. That further breakdown begins with uricase (also called urate oxidase), which oxidizes uric acid into allantoin.
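
A quick check using the standard Henderson-Hasselbalch relation (my addition, not part of the original answer) shows why that pKa matters: at physiological blood pH 7.4, uric acid circulates almost entirely as the urate anion,

$$\frac{[\mathrm{urate}^-]}{[\text{uric acid}]} = 10^{\,\mathrm{pH} - \mathrm{p}K_a} = 10^{\,7.4 - 5.75} \approx 45,$$

so roughly 98% of it is in the ionized form.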

In the evolution of birds and reptiles, the gene encoding uricase mutated into a nonfunctional pseudogene, so uric acid accumulates in these animals. That is actually beneficial for them, because they excrete nitrogenous waste in the form of uric acid. Compared with ammonia, which carries 1 nitrogen atom per molecule, and urea, with 2 per molecule, uric acid carries 4 nitrogen atoms per molecule, making it a more efficient “package” for nitrogenous waste. In addition, its poor solubility in water helps these animals conserve water.
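
To make the “packaging” arithmetic concrete (a back-of-the-envelope count using the standard molecular formulas, added here for illustration):

$$\mathrm{NH_3}:\ 1\ \mathrm{N/molecule} \qquad \mathrm{CO(NH_2)_2}:\ 2\ \mathrm{N/molecule} \qquad \mathrm{C_5H_4N_4O_3}:\ 4\ \mathrm{N/molecule}$$

Disposing of a fixed amount of nitrogen as uric acid therefore takes only a quarter as many solute molecules as ammonia would, and because uric acid precipitates out of solution, almost no water has to be excreted along with it.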

This is crucial for survival on land, especially in arid conditions. Reptiles and birds have a cloaca, which lets them safely excrete uric acid deposits without the risk of obstructing a urethra. In the mammalian lineage, however, uricase remained functional, allowing mammals to convert uric acid into the far more soluble allantoin, which is easily excreted in urine.

However, between 9 and 15 million years ago, in the middle Miocene, the uricase gene was independently inactivated in two primate lineages, becoming a pseudogene. Evidence suggests the loss was gradual: first, mutations in the promoter region progressively reduced uricase expression; then a mutation at codon 33 of exon 2 abolished the enzyme’s activity entirely. The loss of uricase forced primates to excrete the less soluble uric acid itself, and their serum uric acid levels rose far above those of other mammals and, unlike in other mammals, are not tightly regulated.

For example, the concentration of uric acid in human blood is about 240-360 μmol/L, compared with 30-50 μmol/L in mice. Serum uric acid levels also vary considerably from person to person. Individuals with high levels are at increased risk of developing gout and kidney stones. More importantly, high uric acid often also signals an increased risk of several other diseases, such as obesity, metabolic syndrome, diabetes, fatty liver, hypertension, cardiovascular disease, and kidney disease.
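
As an aside (my addition), clinical lab reports usually give uric acid in mg/dL rather than μmol/L; converting only requires uric acid’s molar mass, about 168.1 g/mol. A minimal sketch in Python:

```python
# Convert serum uric acid concentrations between the SI units quoted
# above (umol/L) and the mg/dL units common on clinical lab reports.

URIC_ACID_MOLAR_MASS = 168.11  # g/mol for C5H4N4O3

def umol_per_l_to_mg_per_dl(conc_umol_per_l: float) -> float:
    """Convert a uric acid concentration from umol/L to mg/dL."""
    mg_per_l = conc_umol_per_l * URIC_ACID_MOLAR_MASS / 1000.0  # umol -> mg
    return mg_per_l / 10.0  # 1 L = 10 dL

for conc in (240, 360, 30, 50):
    print(f"{conc:>3} umol/L = {umol_per_l_to_mg_per_dl(conc):.1f} mg/dL")
# 240-360 umol/L (human) is roughly 4.0-6.1 mg/dL;
# 30-50 umol/L (mouse) is roughly 0.5-0.8 mg/dL.
```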

So, the question arises: what is the biological effect of uric acid on primates? Why did natural selection favor this mutation? Could it have some evolutionary advantage?

Hypothesis One: Uric Acid May Play a Role in Intelligence Formation

In 1903, H. Ellis first proposed that uric acid might play a role in the formation of intelligence, noting that gout was common among the highly intelligent, including famous philosophers and scientists such as Benjamin Franklin. In 1955, E. Orowan revived the idea in a letter to Nature, pointing out the chemical similarity between uric acid and caffeine. This prompted extensive surveys of uric acid levels in the general population and in selected groups (university professors and students) to test whether uric acid and intelligence are related. The results showed only a weak correlation, or none at all, between the two, though a closer relationship was found between uric acid and reaction time.

Even now, the possibility that uric acid acts in the central nervous system has not been ruled out entirely. Acute hyperuricemia in rats, induced by inhibiting uricase, does increase motor activity, and at least one study suggests that uric acid may raise dopamine levels in the brain. Indeed, individuals with hyperuricemia have a lower risk of developing Parkinson’s disease. But even if hyperuricemia confers minor benefits for reaction time or intelligence, the long-term outcome may be counterproductive.

For example, high uric acid is more common in patients with vascular dementia. Experimental hyperuricemia can also induce small-vessel lesions in animal kidneys, impairing their capacity for autoregulation. If the same happens in the brain, as the higher rate of white-matter disease in hyperuricemic patients suggests, uric acid might itself be a cause of vascular dementia. It now seems that Ellis’s original observation reflected not a link between uric acid and high intelligence but the fact that, in his era, gout was largely a disease of the wealthy, who led sedentary lives and consumed more sugar, which raises uric acid. Indeed, gout is now increasing in all populations, regardless of education level.

Hypothesis Two: Uric Acid as a Natural Antioxidant

In 1981, B.N. Ames and colleagues proposed a different view: uric acid acts as a natural antioxidant in the body, helping to eliminate free radicals, and thus conferred a selective advantage. Their findings and others’ show that uric acid reacts with a variety of oxidants, notably peroxyl radicals, peroxynitrite, and superoxide. Injecting uric acid into animals significantly enhances their antioxidant capacity and improves endothelial function. For example, administering uric acid prevents oxidative damage to liver cells in animal models of hemorrhagic shock; the high uric acid levels found in tracheobronchial aspirates of mechanically ventilated premature infants suggest that it protects against oxidative lung injury early in life; and elevated serum uric acid may explain why patients with significant atherosclerosis show improved antioxidant capacity.

As @ScientificPig mentioned, a mutation about 40 million years ago turned a key gene for Vitamin C synthesis in primates into a pseudogene, and primates lost the ability to make Vitamin C. One of Vitamin C’s important roles in the body is as an antioxidant, so after early primates lost the ability to synthesize it, their oxidative stress risk increased. Because oxidative stress drives cancer, cardiovascular disease, and aging, the uricase mutation would then have provided a survival advantage: having lost one antioxidant, ascorbic acid, millions of years ago, humans and other higher primates gained higher levels of another. This may be the evolutionary pressure behind our elevated uric acid. Studies also show that species with higher uric acid levels tend to live longer.

Hypothesis Three: Uric Acid Helps Prevent Starvation

In 2011, Richard J. Johnson and colleagues proposed yet another view: uric acid serves as a danger and survival signal in the body that helps stave off starvation, since it can increase fat storage, maintain blood pressure, and enhance immune function.

In the middle Miocene, the staple food of apes was fructose-rich fruit. Unlike glucose, fructose increases the body’s capacity to store fat, in the liver, the viscera, and the plasma. Fruit was therefore not just a food source but a potential hedge for animals facing intermittent food shortages.

The reason fructose outperforms glucose in promoting fat storage may lie in the unique first step of fructose metabolism. When fructose enters liver cells, fructokinase phosphorylates it to fructose-1-phosphate. Unlike hexokinase, whose phosphorylation of glucose is held in check by negative feedback, fructokinase keeps working unchecked, depleting intracellular ATP and phosphate. The drop in phosphate stimulates adenylate (AMP) deaminase, accelerating the degradation of AMP via IMP into uric acid; the uric acid generated inside the cell in turn induces an oxidative stress response. Together, these processes may promote fat synthesis.
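
Schematically (a condensed restatement of the paragraph above, using standard biochemical shorthand):

$$\text{fructose} \xrightarrow{\ \text{fructokinase}\ } \text{fructose-1-phosphate} \quad (\mathrm{ATP},\ \mathrm{P_i}\ \text{consumed})$$

$$\mathrm{P_i}\downarrow \;\Rightarrow\; \text{AMP deaminase}\uparrow \;\Rightarrow\; \mathrm{AMP}\to\mathrm{IMP}\to\cdots\to\text{uric acid}$$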

Evidence supporting this view is that patients with hereditary fructose intolerance, who carry a mutation in aldolase B, an enzyme of fructose metabolism, still develop fatty liver despite the metabolic block. Growing evidence likewise suggests that intracellular uric acid production plays a role in fat accumulation. Uric acid may therefore offer some protection against severe energy depletion, as during starvation, tissue damage, or local ischemia. One can imagine that for early primates, who often faced intermittent starvation, the loss of uricase provided a survival advantage.

Beyond promoting fat gain through fructose, elevated uric acid may also enhance immune function. There is evidence that high concentrations of uric acid, especially in crystalline form, act as a co-stimulatory “danger” signal alongside the antigens presented on major histocompatibility complex molecules. This second signal is thought to drive the maturation of dendritic cells and, by confirming that cellular damage reflects a genuine pathogen invasion, convert a low-level T-cell response into a vigorous one.

Furthermore, higher uric acid may have been particularly important for maintaining blood pressure during bipedal walking in primates, and especially in humans, an evolutionary step that demanded stronger cardiovascular regulation. In the low-salt environment of the time, our ancestors, newly descended from the trees, had to keep their blood pressure up while standing upright, which may have selected for higher uric acid levels. An important piece of supporting evidence is the finding that animals on a low-sodium diet can maintain their blood pressure only when given a uricase inhibitor.

In summary, increasing evidence suggests that the loss of uricase in the evolution of primates provided us with various evolutionary advantages, but at the same time, it has brought many potential health problems to modern humans, such as gout, hypertension, obesity, cardiovascular disease, and kidney disease.

Reference: Yang Rongwu. Principles of Biochemistry. Higher Education Press, 2006.

Human oocytes naturally lack the motor protein KIFC1, which the oocytes of other mammals possess. Its absence destabilizes spindle formation during oocyte division, raising the risk of chromosomal abnormalities.

Partly as a result, 20% to 50% of human eggs are aneuploid, carrying too many or too few chromosomes. This increases the likelihood of miscarriage, infertility, and the birth of children with Down syndrome.

Introducing the protein from an external source enhances spindle stability and reduces the risk of chromosomal abnormalities.

Clearly, this genetic condition increases the risk of human extinction and falls under the category of diseases.

Biological traits alone

Whether or not one is sick is relative to the environment

In the right environment, the absence of webbed hands could be considered a dominant genetic disease, and so could having only four limbs. If the Earth were about to be submerged in water, having lungs but no gills would be a dominant genetic disease with a 100% fatality rate.

Humans do not have a baculum (penis bone), whereas felids, canids, and most other primates, including chimpanzees and the vast majority of monkeys, possess one.

The greatest ailment of humanity is having a lifespan limited to less than 120 years.

As for topics like divine beings, I don’t think they can be discussed on Zhihu.

The Permanence of Female Mammary Glands and the Notion of Genetic Diseases

Among all mammals, human females are unique in possessing permanent breasts that do not regress outside the lactation period. This ever-present breast tissue hampers mobility and adds load on the shoulders, potentially leading to neck problems. Can this be considered a dominant genetic disease?

(Male readers, don’t think you’re exempt. You are also different from other mammals; you lack a penile bone, which can lead to limitations in certain aspects.)

When I read “Dream of the Red Chamber” as a child, I noticed that humans have a tendency to be drawn to morbid beauty. It wasn’t until I watched “Animal World” and encountered the term “sexual selection” in biology textbooks that I realized this tendency is not unique to humans but is widespread across the animal kingdom.

Thus the question arises: “If a condition provides an advantage in sexual selection, can it still be considered a disease? Or is it possible that not having the condition is the true ailment?”

Let’s consider an extreme scenario:

Albinism, included in the “First Batch of Rare Diseases” catalogue issued by China’s National Health Commission in 2018, is undoubtedly a genetic disease. And yet I find albinism beautiful, especially the “violet eyes” that only some albino individuals possess.

Suppose that, as society develops, humans increasingly become “creatures of the night” and vampires become a popular aesthetic. Everyone, like me, comes to admire people with albinism. As a result, with each generation the proportion of albino individuals grows even faster than that of AIDS patients, until by the year 2424 there is not a single human in the Milky Way who is not fair-skinned, white-haired, and violet-eyed. Is albinism still a genetic disease then? Or should the people of 2424 with black, red, or golden hair be considered the genetic disease patients?

If someone argues that albinism is still a genetic disease in 2424, then they should also consider modern humans as having genetic diseases—after all, modern women’s breasts do not retract outside of the lactation period, and modern men lack a penile bone.

What about… No tail?

Approximately 2.45 billion years ago, cyanobacteria began releasing a toxic gas, oxygen, into the atmosphere; over the following billion years it exterminated most anaerobic organisms.

Today, most organisms on Earth possess a dominant genetic trait:

Oxygen dependence

Our common Mendelian “genetic disease”: cells can divide only about 50 times, each division cycle takes roughly 2.4 years, and so the maximum potential lifespan is about 120 years.
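
The arithmetic behind that figure is simply:

$$50\ \text{divisions} \times 2.4\ \tfrac{\text{years}}{\text{division}} = 120\ \text{years}.$$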

When certain cells happen to break through this genetic lock, we do not become gods; instead, we receive the divine punishment called cancer.

Please distinguish between “dominant inheritance” and “dominant genetic disease” as two distinct concepts. It is the weakening of survival competitiveness that is referred to as a disease, whereas the opposite is referred to as evolution.

For example, none of us can fly.

Does our hairlessness count as this kind of condition? The corresponding atavism is the hairy mole.

Evolution works like this: among a million people, perhaps a hundred genes mutate. At least ninety of those mutations are silent and never expressed. Of the ten that are expressed, at least nine are harmful or have no effect. Perhaps one is advantageous, and even that one may fail to be passed on.

So many genes that are normal in humans today would have counted as genetic diseases in ancient times: hairlessness that could not survive the winter, hemorrhoids brought on by upright walking, the degenerated tail…

Aging, illness, death: the fears of the physical body.

Genetic Diseases and Environmental Adaptation

If everyone truly had such a genetic disease, it would be regarded by all as a normal human characteristic, and the very notion of this “genetic disease” would not exist.

And if such a disease did exist, the environment would decide its fate. As humans multiplied, they kept moving away from their original environment into places with completely different living conditions. These new environments would no longer favor the original genetic disease, and over time it would naturally disappear.

Take, for example, something many of us know and understand: when we were children, there were many lice on our clothes. Why don’t we see them now? Firstly, people wash their clothes more frequently, and secondly, various treatments have been used. The original living environment has been thoroughly disrupted, leading to the extinction of these organisms.

If the majority of people in the world were mentally ill, would normal individuals be considered abnormal?

The Relativity of Normality

As the saying goes, if everyone is rich, then no one is rich; if everyone is poor, then no one is poor. If a particular ailment afflicts everyone, it ceases to be called a disease and becomes a “feature” of humanity.

Following this line of thought, one can imagine countless “diseases” of humanity. For instance, imagine that humanity is universally afflicted with dwarfism: the natural height of humans is 3 meters or more, but a genetic mutation long ago shrank everyone to under 2 meters, so that today nearly everyone is affected and a 1.7-meter person looks normal.

Alternatively, we could imagine that humans originally had wings, which began developing near the shoulder blades shortly after birth. Then, in some century, an eccentric who refused to fly engineered a mutation that inhibits wing development; it spread through the entire population, leaving all descendants flightless.

Taking this thought experiment further, one could even create a million “diseases” of humanity. After all, as long as it remains logically possible, no one can definitively deny it.

However, even a three-year-old knows that this is purely nonsensical and far-fetched.

Normality and abnormality are relative concepts. For example, in terms of height, there is no absolute “normal” height for humans. We define a normal range, say 1.4 to 2.4 meters, based on statistical analysis, and consider deviations from this range as potentially influenced by external factors. But can we truly say that the “normal” height for humans is confined to that range? The answer is no.
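
For illustration (my addition; the numbers below are made up), a “normal range” of this kind is typically just a statistical construct, such as the sample mean plus or minus two standard deviations:

```python
import statistics

# Hypothetical adult height sample in meters (illustrative values only).
heights = [1.55, 1.62, 1.68, 1.71, 1.74, 1.78, 1.81, 1.85, 1.90, 1.96]

mean = statistics.mean(heights)
sd = statistics.stdev(heights)

# A conventional "normal range": mean +/- 2 standard deviations.
low, high = mean - 2 * sd, mean + 2 * sd
print(f"'normal' height range: {low:.2f} m to {high:.2f} m")

# Nothing here is absolute: resample from a different population and
# the "normal" range moves with it.
```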

Similarly with lifespan: some modern research speculates that human lifespans could potentially reach two or three centuries, yet the current average is around 70 to 80 years. Does that mean everyone who falls short of two centuries has a problem? With future technological advances, we might even achieve immortality for some organisms; in that scenario, what would count as “normal”?

Furthermore, if you happen to be that unique individual with something others don’t have, or vice versa, you should be most concerned about being scrutinized and studied by “experts” in a dark room with magnifying glasses.
