DNA Testing and Supplementation: What Consumer Genomics Can (and Cannot) Tell You

[Figure: DNA double helix and blood test tubes, symbolizing the complementarity between genomics and biomarkers]

The consumer genetic testing market has surpassed 40 million kits sold worldwide. The promise is compelling: a saliva sample, a few weeks of waiting, and a detailed report of your "risks" accompanied by "personalized" nutritional recommendations. Some companies go further, offering dietary supplements directly calibrated to your DNA.

One question remains that these companies prefer not to ask too loudly: does it actually work?

Three levels of genetic reliability: an important distinction

Not all genetic variants carry the same predictive power. Conflating them is the first methodological error of nutrigenomic marketing.

High-penetrance monogenic variants constitute the first level. A single mutated gene, a massive biological effect. BRCA1/BRCA2 mutations considerably increase the risk of breast and ovarian cancer. The APOE ε4 variant multiplies Alzheimer's disease risk by 3 to 12 depending on the number of copies carried. These variants are clinically validated, and their detection changes medical management. But they affect only a limited fraction of the population (less than 5%), and their direct utility for nutritional supplementation is virtually nil.

Pharmacogenomics represents the second level. Polymorphisms in cytochrome P450 enzymes (CYP2D6, CYP2C19, liver enzymes responsible for metabolizing many medications) modulate the speed at which your liver processes certain molecules. Knowing that you are a "slow metabolizer" of CYP2D6 can change the choice or dosage of an antidepressant or analgesic. This is the most mature domain of applied genomics, with clinical recommendations validated by the CPIC (Clinical Pharmacogenetics Implementation Consortium). But this is pharmacology, not nutrition. The gap between "adjusting a medication based on genotype" and "choosing a dietary supplement based on genotype" is substantial.

Polygenic risk scores (PRS) constitute the third level, and this is the one nutrigenomic marketing exploits. A PRS aggregates thousands, sometimes millions of variants with minuscule individual effects (each modifying risk by 0.01 to 0.1%) to produce an overall score. A study published in Nature Genetics showed that PRS could identify high-risk individuals for certain cardiometabolic conditions, with predictive power sometimes comparable to rare monogenic mutations (PubMed). But the devil is in the details.
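Conceptually, a PRS is just a weighted sum: for each variant, the number of risk alleles carried (0, 1, or 2) multiplied by that variant's effect size as estimated in a GWAS. A minimal sketch, with invented variant IDs and effect sizes purely for illustration:

```python
# Minimal polygenic risk score sketch: a weighted sum of allele counts.
# Effect sizes (log-odds per risk allele) and variant IDs are invented
# for illustration; real PRS aggregate thousands to millions of
# GWAS-estimated weights.

def polygenic_score(genotype, effect_sizes):
    """genotype maps variant ID -> risk allele count (0, 1, or 2)."""
    return sum(effect_sizes[v] * genotype.get(v, 0) for v in effect_sizes)

effect_sizes = {"rs0001": 0.02, "rs0002": 0.015, "rs0003": 0.005}
person = {"rs0001": 2, "rs0002": 1, "rs0003": 0}

print(round(polygenic_score(person, effect_sizes), 3))  # 0.055
```

Note that each individual weight is tiny; the score only becomes informative in aggregate, and only relative to the distribution of scores in a reference population.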

The polygenic risk score: a promise with conditions

The PRS is a powerful statistical tool in epidemiological research. Its individual clinical utility is an entirely different matter.

First limitation: explained variance. For most metabolic and nutritional conditions, PRS explain between 5 and 15% of phenotypic variance (the observable variation between individuals) (PubMed). This figure deserves concrete translation: 85 to 95% of what determines your actual nutritional status comes from non-genetic factors. Diet, physical activity, sleep, chronic stress, environmental exposure, gut microbiome, complex interactions between all these elements. Genetics sets a framework; lifestyle fills the canvas.

5 to 15%
Variance explained by PRS

For most multifactorial conditions, polygenic risk scores explain only a minority fraction of the variation observed between individuals. Lifestyle, environment, and diet collectively weigh far more heavily.

Second limitation: population bias. The vast majority of GWAS (genome-wide association studies, which compare DNA across thousands of individuals to identify variants associated with a trait) have been conducted on cohorts of European ancestry. PRS calibrated on this data are significantly less predictive for non-European populations (PubMed). A score presented as "personalized" can be biologically misleading for a large portion of the world's population. This is a structural problem, not a detail.

Third limitation: the absolute risk paradox. A high PRS for type 2 diabetes may indicate a 30% increase in relative risk. Impressive on paper. But if baseline prevalence is 8%, your individual risk rises from 8% to roughly 10.4%. The actionable question becomes: does this information concretely change what you should do compared to standard recommendations? In most cases, no. Physical exercise, balanced nutrition, and weight management remain the first line of action, whether your PRS is elevated or not.
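The arithmetic behind this paradox is simple enough to check directly, using the figures from the example above (8% baseline prevalence, 30% relative risk increase):

```python
# Relative vs. absolute risk, with the figures from the example above.
baseline_prevalence = 0.08   # 8% baseline prevalence of type 2 diabetes
relative_risk = 1.30         # a "30% increase in relative risk"

absolute_risk = baseline_prevalence * relative_risk
absolute_increase = absolute_risk - baseline_prevalence

print(f"absolute risk: {absolute_risk:.1%}")          # 10.4%
print(f"absolute increase: {absolute_increase:.1%}")  # 2.4%
```

A headline "30% higher risk" translates into 2.4 additional percentage points of absolute risk, which is the number that actually matters for deciding whether to act differently.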

MTHFR and nutrigenomics: when marketing outpaces evidence

The MTHFR C677T polymorphism has become the commercial flagship of nutrigenomics. It is real, documented, and systematically misinterpreted by those who sell it.

The MTHFR gene encodes an enzyme (methylenetetrahydrofolate reductase) that converts folic acid into 5-MTHF, the biologically active form of folate. The C677T polymorphism, in its homozygous form (two copies of the variant inherited, one from each parent), reduces enzymatic activity by 30 to 70%. This is a biochemical fact established since the original discovery by Frosst et al. in 1995 (PubMed).

What is less often mentioned: the American College of Medical Genetics and Genomics (ACMG) published an explicit recommendation in 2013 against routine clinical MTHFR testing (PubMed). Their reasoning is straightforward. Folate and homocysteine status (homocysteine being an amino acid whose accumulation is a cardiovascular risk marker) can be measured directly in the blood. If your homocysteine is elevated, the course of action is identical whether you carry the variant or not: methylfolate. If your homocysteine is normal, the polymorphism is biologically silent. Testing the gene to decide what to do is like consulting a building's blueprint to find out if it's warm inside, instead of reading the thermometer.
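The ACMG's reasoning can be written out as a decision rule in which the biomarker alone drives the action and the genotype never changes the branch taken. A sketch, assuming a commonly cited homocysteine cutoff of 15 µmol/L (the threshold is illustrative, not a clinical recommendation):

```python
# The ACMG logic as a decision rule: the measured homocysteine level
# drives the action; the MTHFR genotype never changes the outcome.
HOMOCYSTEINE_THRESHOLD = 15.0  # µmol/L, illustrative cutoff

def folate_action(homocysteine_umol_l, mthfr_677tt=False):
    # The genotype argument is accepted but deliberately unused:
    # the recommendation is identical for carriers and non-carriers.
    if homocysteine_umol_l > HOMOCYSTEINE_THRESHOLD:
        return "supplement with methylfolate, recheck in 3 months"
    return "no action needed"

# Same biomarker value, opposite genotypes: same recommendation.
print(folate_action(22.0, mthfr_677tt=True))
print(folate_action(22.0, mthfr_677tt=False))
```

The unused genotype parameter is the whole point: a test whose result never alters the decision adds cost without adding information.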

The same problem recurs with other nutrigenomic panel favorites. The VDR gene (vitamin D receptor), COMT (catechol-O-methyltransferase, an enzyme involved in degrading catecholamines like dopamine and adrenaline), SOD2 (mitochondrial superoxide dismutase, an antioxidant enzyme): each features documented variants whose individual biological effect is real but modest. Commercial panels tie these SNPs (single nucleotide polymorphisms, variations of a single "letter" in the DNA sequence) to bioactive recommendations with a confidence that the scientific literature does not support.

And this is where the most problematic gap lies. Is there a single randomized clinical trial demonstrating that genotype-guided supplementation produces better outcomes than supplementation guided by blood biomarkers? As of 2026, the answer is no.

The Food4Me trial remains the largest randomized controlled trial on personalized nutrition in Europe. With 1,607 participants randomized across four arms, it compared four levels of personalization: standard dietary recommendations, personalization based on reported diet, personalization incorporating blood biomarkers, and personalization adding genetic data to biomarkers (PubMed). The result is unambiguous: adding genetic data provided no significant incremental benefit on dietary behavior changes or health markers compared to biomarker-based personalization alone.

This result does not disqualify genomics in nutrition. It establishes a fact: given current knowledge, genotype does not add actionable predictive value beyond what phenotype (as measured by blood biomarkers) already provides.

Blood biomarkers: measuring rather than predicting

A 25(OH)D level (the circulating form of vitamin D) of 15 ng/mL in your blood does not say you have an "increased risk" of suboptimal vitamin D status. It says your status is suboptimal. Right now. The difference between prediction and measurement is fundamental.

Blood biomarkers naturally integrate all variables that influence your nutritional status: your genetic heritage (including those famous polymorphisms), but also your diet over recent weeks, sun exposure, stress level, physical activity, sleep quality, gut microbiome health, and age. All these dimensions are captured in a single measured value, without needing to separately model each one.

Biomarkers offer a second decisive advantage: they are dynamic. Your DNA does not change (epigenetic modifications, which modulate gene expression without altering the sequence, are not detected by consumer tests). Your biology, however, evolves constantly. A blood test performed every four to six months captures these variations and enables continuous supplementation adjustment. Genotype provides a static photograph of a terrain; biomarkers provide a real-time field survey.

85 to 95%
Non-genetic factors

The majority of individual variation in nutritional markers comes from modifiable factors: diet, lifestyle, environment. Blood biomarkers capture this composite reality; DNA tests only see the genetic fraction.

Consider a concrete example. Two individuals carry the same VDR variant associated with "less efficient" vitamin D metabolism. The first lives in Marseille, works outdoors, and eats fatty fish three times a week. Their 25(OH)D level is 45 ng/mL. The second lives in Lille, works in an office, and follows a strict vegan diet. Their level is 12 ng/mL. The genetic variant is identical. The supplementation need is radically different. Only the biomarker can distinguish these two situations.
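The same point in code: two profiles carrying the identical VDR variant but with different measured 25(OH)D levels lead to different actions, and only the measurement distinguishes them. The thresholds below (20 and 30 ng/mL) follow common laboratory reference ranges and are used here for illustration only:

```python
# Two profiles with an identical VDR variant; only the measured 25(OH)D
# level differs. Thresholds are common lab reference ranges, used here
# purely for illustration.

def vitamin_d_action(level_ng_ml):
    if level_ng_ml < 20:
        return "deficient: supplement and retest in 3 months"
    if level_ng_ml < 30:
        return "insufficient: moderate supplementation"
    return "sufficient: no supplementation needed"

marseille = {"vdr_variant": "Tt", "d25oh": 45}
lille     = {"vdr_variant": "Tt", "d25oh": 12}  # same genotype

for person in (marseille, lille):
    print(person["d25oh"], "->", vitamin_d_action(person["d25oh"]))
```

The genotype field is identical in both records and plays no role in the decision; the branch taken depends entirely on the measured value.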

The future: complementarity, not substitution

The relevant question is not "DNA or biomarkers?" but "in what order and with what weighting?"

DNA could, in time, play a relevant complementary role. Identifying carriers of pharmacogenomic variants to adjust certain molecular forms of bioactives. Guiding the monitoring frequency of certain biomarkers for carriers of at-risk polymorphisms. Refining formulation algorithms by integrating genetic terrain as one contextual variable among others.

But this integrated vision requires two conditions that the current market does not meet. First, randomized clinical trials validating the incremental contribution of genotype within a protocol that already includes blood biomarkers. Second, recommendation algorithms that correctly weight the actual effect sizes of each variant (modest, for the most part), rather than presenting them as diagnostic certainties.

The Academy of Nutrition and Dietetics has positioned nutritional genomics as a field that is "promising but premature for individual clinical practice" (PubMed). The hierarchy of evidence remains clear. Direct measurement via blood biomarkers remains the most reliable, most actionable, and most validated tool for calibrating personalized supplementation.

DNA tells you who you might become. Blood tells you who you are.

References

  1. Khera AV, Chaffin M, Aragam KG, et al. Genome-wide polygenic scores for common diseases identify individuals with risk equivalent to monogenic mutations. Nat Genet. 2018;50(9):1219-1224 (PubMed).
  2. Mars N, Koskela JT, Ripatti P, et al. Polygenic and clinical risk scores and their impact on age at onset and prediction of cardiometabolic diseases and common cancers. Nat Med. 2020;26(4):549-557 (PubMed).
  3. Martin AR, Kanai M, Kamatani Y, et al. Clinical use of current polygenic risk scores may exacerbate health disparities. Nat Genet. 2019;51(4):584-591 (PubMed).
  4. Frosst P, Blom HJ, Milos R, et al. A candidate genetic risk factor for vascular disease: a common mutation in methylenetetrahydrofolate reductase. Nat Genet. 1995;10(1):111-113 (PubMed).
  5. Hickey SE, Curry CJ, Toriello HV. ACMG Practice Guideline: lack of evidence for MTHFR polymorphism testing. Genet Med. 2013;15(2):153-156 (PubMed).
  6. Celis-Morales C, Livingstone KM, Marsaux CF, et al. Effect of personalized nutrition on health-related behaviour change: evidence from the Food4Me European randomized controlled trial. Int J Epidemiol. 2017;46(2):578-588 (PubMed).
  7. Camp KM, Trujillo E. Position of the Academy of Nutrition and Dietetics: nutritional genomics. J Acad Nutr Diet. 2014;114(2):299-312 (PubMed).