According to researchers, higher levels of vitamin D are associated with a correspondingly reduced risk of cancer.
The body produces vitamin D when the skin is exposed to sunlight.
Researchers made the first connection between vitamin D deficiency and some cancers in 1980, when they noted that populations at higher latitudes were more likely to be deficient in vitamin D and to experience higher rates of colon cancer.
A new study tried to determine what blood level of vitamin D is required to effectively reduce cancer risk. The marker used was 25-hydroxyvitamin D, the main form of vitamin D in the blood.
Researchers used a non-traditional approach, pooling analyses of two previous studies of different types: a randomized clinical trial of 1,169 women and a prospective cohort study of 1,135 women. A clinical trial focuses on whether a specific test or treatment is safe and effective. A prospective study, by contrast, looks for outcomes during the study period, in this case the incidence of cancer among participants.
Combining the two studies gave researchers a larger sample size and a greater range of blood serum levels of 25-hydroxyvitamin D, or 25(OH)D.
A blood test is the only accurate measure of a person's vitamin D level. In the trial cohort, the median blood level of 25(OH)D was 30 ng/ml; in the prospective cohort, the median was 48 ng/ml.
Recommended blood levels of vitamin D have been a source of intense debate in recent years. In 2010, the Institute of Medicine concluded that levels lower than 12 ng/ml represented a vitamin D deficiency and recommended a target of 20 ng/ml, a level most healthy adults ages 17 to 70 can meet.
Other groups have since argued for higher blood levels of 50 ng/ml or more, and some even advocate specific daily dosages.
The current study does not identify a single optimum daily intake of vitamin D or the manner of intake, which could be sunlight exposure, diet, and/or supplementation. Instead, it simply clarifies that reduced cancer risk becomes measurable at 40 ng/ml.