Thursday, June 25, 2009

What caused the rickets epidemic?

The English, like other Western nations, were once plagued by rickets—a softening of the bones leading to fractures and deformity, particularly in children. Initially rare, the disease became much more frequent after 1600 and had reached epidemic levels by the turn of the 20th century (Gibbs, 1994; Harrison, 1966; Holick, 2006; Rajakumar, 2003). A survey at Great Ormond Street Hospital found symptoms of rickets in one out of three children under 2 years of age, and another, at Clydeside in 1884, found symptoms in every child examined (Gibbs, 1994). Elsewhere during the late 19th century, autopsy studies in Boston and Leiden (Netherlands) showed rickets in 80-90% of all children (Holick, 2006).

The currently accepted explanation was developed in the late 1880s by Dr. Theobald Palm. For Palm, rickets seemed to correlate with lack of sun. The illness was more common in northwestern Europe, particularly England, where sunlight was naturally weaker. It was also more common in urban areas where “a perennial pall of smoke, and ... high houses cut off from narrow streets a large proportion of the rays which struggle through the gloom” (Hardy, 2003). His hypothesis was strengthened in 1919 by the finding that ultraviolet light can cure rickets by triggering production of a chemical, eventually identified as vitamin D, that helps the gut absorb calcium and phosphorus from food (Gibbs, 1994). Such health benefits, together with the anti-microbial action of UV light, led to the ‘sunshine movement’ of the 1920s—a vast effort to boost sun exposure by redesigning our clothes, streets, parks, and buildings, and even our notions of fun and recreation. This movement gave us much of the look and feel of modern life.

By the mid-20th century, rickets was again rare, thus vindicating not only UV therapy but also the view that lack of sun had been the cause (Harrison, 1966). This view nonetheless remains unproven. Although vitamin D does help the body absorb more calcium and phosphorus, no one knows for sure whether the epidemic was due to low levels of this vitamin; no blood test for vitamin D existed at the time. People may have developed rickets because something was immobilizing calcium or phosphorus in their bodies, which would then have raised their requirement for vitamin D. This possibility is hinted at by Harrison (1966):


… a number of cases are on record of children with marked rickets who have received an amount of vitamin D ordinarily sufficient to prevent rickets and who do not have manifestations of intestinal malabsorption. When these children are given increased amounts of vitamin D, several thousand units per day, the biochemical manifestations of vitamin D effect result, and roentgenograms show healing of the rickets. The basis for this increased requirement is not known.

At the height of the epidemic, one physician did suggest that something was immobilizing phosphorus in people with rickets. Dr. John Snow (1857) observed that the illness was most frequent in London and the south of England where industrial bakeries used alum to make bread look whiter. It was rare in the north where bread was normally home-baked. He reasoned that this alum combined with phosphorus in the body to form insoluble aluminum phosphate, thus depleting the reserves of phosphorus needed for strong bones.

Snow pointed out that London bakeries would add about one and a half ounces of alum per four-pound loaf. Since manual laborers met 70% of their energy requirements by eating bread, they would have been ingesting about 20 g of alum daily, or 4 g of aluminum hydroxide (Dunnigan, 2003). A recent case study describes an infant who developed rickets after consuming 2 g of aluminum hydroxide (via antacids) per day over five to six weeks (Pattaragarn & Alon, 2001). There have been many other reports of antacid-induced rickets (Boutsen et al., 1996; Cooke et al., 1978; Pivnick et al., 1995; Shetty et al., 1998; Spencer & Kramer, 1983).
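
As a rough check on these figures, here is a back-of-the-envelope calculation in Python. The laborer's daily energy requirement (about 3,000 kcal) and the energy density of bread (about 2.4 kcal per gram) are my own illustrative assumptions; only the alum dosage and the 70% figure come from the sources cited above.

# Rough check of the daily alum intake implied by Snow's figures (illustrative only).
# Assumptions not in the original sources: a manual laborer needs ~3,000 kcal/day
# and bread supplies ~2.4 kcal per gram.

OZ_TO_G = 28.35                       # grams per ounce
LB_TO_G = 453.6                       # grams per pound

alum_per_loaf_g = 1.5 * OZ_TO_G       # ~42.5 g of alum per loaf
loaf_weight_g = 4 * LB_TO_G           # a four-pound loaf is ~1,814 g
alum_fraction = alum_per_loaf_g / loaf_weight_g   # ~2.3% of the bread by weight

daily_energy_kcal = 3000              # assumed energy requirement (manual labor)
bread_share = 0.70                    # 70% of energy from bread (Dunnigan, 2003)
bread_kcal_per_g = 2.4                # assumed energy density of bread

bread_eaten_g = daily_energy_kcal * bread_share / bread_kcal_per_g   # ~875 g/day
alum_eaten_g = bread_eaten_g * alum_fraction                         # ~20 g/day

print(f"Bread eaten: {bread_eaten_g:.0f} g/day; alum ingested: {alum_eaten_g:.1f} g/day")

Under these illustrative assumptions, the calculation lands on roughly 20 g of alum per day, the figure cited from Dunnigan (2003).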

Snow’s hypothesis was forgotten and has been dusted off only in recent years (Dunnigan, 2003; Hardy, 2003; Paneth, 2003). This renewed interest is partly due to the realization that rickets can be caused not only by lack of vitamin D but also by ingested substances that make phosphorus or calcium unusable. Aluminum compounds are one such substance, as seen in the reports of antacid-induced rickets. Another is phytic acid in cereal grains (Sandberg, 1991). The acid binds to calcium and makes it unavailable to the body, as shown when dogs develop rickets on an oatmeal diet (Harrison & Mellanby, 1939). It is this calcium depletion, via foods like unleavened bread or chapatti, that now causes rickets in the Middle East and South Asia (Berlyne et al., 1973; Harinarayan et al., 2007).

We may never disprove the view that lack of sun caused the rickets epidemic of a century ago. But we can point to some inconsistencies. First, rickets was much less frequent in northern England and absent in northwest Scotland—the area of Great Britain with the weakest solar UV (Gibbs, 1994). Second, it was not really a disease of cities with dark narrow streets and smoke-filled skies, as Snow (1857) himself observed.


The usual causes to which rickets are attributed are of a somewhat general nature, such as vitiated air, want of exercise and nourishing food, and a scrofulous taint. These explanations, however, did not satisfy me, as I had previously seen a good deal of practice in some of the towns in the north of England, where the over-crowding and the other evils above mentioned were as great as in London, whilst the distortion of the legs in young children was hardly present; moreover, I noticed that the most healthy-looking and best-nourished children often suffered most from curvature of the bones of the legs, owing to their greater weight; and I afterwards found that this complaint was quite common in the villages around London as well as in the metropolis itself.

Lack of sun also fails to explain why the epidemic initially broke out within a small geographic area. Indeed, the evidence points to a highly localized origin, essentially southwest England in the early 17th century. Rickets was completely new to observers at the time, including the College of Physicians president Francis Glisson. In 1650, he wrote:


The disease became first known as near as we could gather from the relation of others, after sedulous inquiry, about thirty years since, in the counties of Dorset and Somerset … since which time the observation of it hath been derived unto all the southern and western parts of the Kingdom. (Gibbs, 1994)

Gibbs (1994) attributes this rapid spread to the growth of England’s home-based textile industry, whose cloth had by 1600 become the country’s main export. “Whole families worked from before dawn until after dusk in their homes and, whether the children were too young to work or old enough to assist in home production, they would have lived their lives predominantly indoors.” This explanation is hard to accept because textile cottage industries developed primarily in the Midlands and around London; the southwest trailed the rest of England in this regard. In addition, family workshops were normally off-limits to children below the age of apprenticeship. Such children would have been left with elderly relatives or told to play outside.

But there may have been an indirect link with the growth of England’s textile industry, namely the parallel growth in the use of alum to fix the colors of cloth. Until the ban on alum imports in 1667, when newly exploited Yorkshire shales became the main source, England imported this substance from the Italian Papal States (Balston, 1998; Jenkins, 1971). The port of entry would have been Bristol—in Somerset county, southwest England. This may have been where English bakers first learned to whiten bread with alum.

When did bakers stop using alum? The practice seems to have died out after the turn of the century with tougher enforcement of food adulteration statutes in the United Kingdom and elsewhere (Kassim, 2001). By the mid-20th century, “the use of alum in bread was only occasionally encountered” (Hart, 1952). Eliminating this additive from bread probably did much to eliminate rickets. Probably just as important was bread’s declining share of working-class diets as affluence increased.

But this is not what the history books say. As Steve Sailer tells us, history is written by those who like to write, and much more has been written about the sunshine movement and its presumed benefits for humanity.

References

Balston, J. (1998). “In defence of alum – 2. England”, In: The Whatmans and Wove Paper: Its invention and development in the West, West Farleigh.


Berlyne, G.M., Ari, J.B., Nord, E., & Shainkin, R. (1973). Bedouin osteomalacia due to calcium deprivation caused by high phytic acid content of unleavened bread, The American Journal of Clinical Nutrition, 26, 910-911.

Boutsen, Y., Devogelaer, J.P., Malghem, J., Noël, H., & Nagant de Deuxchaisnes, C. (1996). Antacid-induced osteomalacia, Clinical Rheumatology, 15, 75-80.

Cooke, N., Teitelbaum, S., & Avioli, L.V. (1978). Antacid-induced osteomalacia and nephrolithiasis, Archives of Internal Medicine, 138, 1007-1009.

Dunnigan, M. (2003). Commentary: John Snow and alum-induced rickets from adulterated London bread: an overlooked contribution to metabolic bone disease, International Journal of Epidemiology, 32, 340-341.

Gibbs, D. (1994). Rickets and the crippled child: an historical perspective, Journal of the Royal Society of Medicine, 87, 729-732.

Hardy, A. (2003). Commentary: Bread and alum, syphilis and sunlight: rickets in the nineteenth century, International Journal of Epidemiology, 32, 337-340.

Harrison, D.C., & Mellanby, E. (1939). Phytic acid and the rickets-producing action of cereals, Biochemical Journal, 33, 1660-1680.

Harrison, H.E. (1966). The disappearance of rickets, American Journal of Public Health, 56, 734-737.

Hart, F.L. (1952). Food adulteration in the early twentieth century, Food Drug Cosmetic Law Journal, 7, 485-509.

Harinarayan, C.V., Ramalakshmi, T., Prasad, U.V., Sudhakar, D., Srinivasarao, P.V.L.N., Sarma, K.V.S., & Kumar, E.G.T. (2007). High prevalence of low dietary calcium, high phytate consumption, and vitamin D deficiency in healthy south Indians, American Journal of Clinical Nutrition, 85, 1062-1067.

Holick, M.F. (2006). Resurrection of vitamin D deficiency and rickets, The Journal of Clinical Investigation, 116, 2062-2072.

Jenkins, R. (1971). “The alum trade in the fifteenth and sixteenth centuries, and the beginnings of the alum industry in England,” in: Links in the history of engineering and technology from Tudor times: the collected papers of Rhys Jenkins, pp. 193-203, Newcomen Society (Great Britain), Published by Ayer Publishing.

Kassim, L. (2001). The co-operative movement and food adulteration in the nineteenth century, Manchester Region History Review, 15, 9-18.

Paneth, N. (2003). Commentary: Snow on rickets, International Journal of Epidemiology, 32, 341-343.

Pattaragarn, A., & Alon, U.S. (2001). Antacid-induced rickets in infancy, Clinical Pediatrics, 40, 389-393.

Pivnick, E.K., Kerr, N.C., Kaufman, R.A., Jones, D.P., & Chesney, R.W. (1995). Rickets secondary to phosphate depletion: a sequela of antacid use in infancy. Clinical Pediatrics, 34, 73-78.

Rajakumar, K. (2003). Vitamin D, cod-liver oil, sunlight, and rickets: a historical perspective. Pediatrics, 112, 132-135.

Sandberg, A.S. (1991). The effect of food processing on phytate hydrolysis and availability of iron and zinc. Advances in Experimental Medical Biology, 289, 499-508.

Shetty, A.K., Thomas, T., Rao, J., & Vargas, A. (1998). Rickets and secondary craniosynostosis associated with long-term antacid use in an infant. Archives of Pediatrics & Adolescent Medicine, 152, 1243-1245.

Snow, J. (1857). On the adulteration of bread as a cause of rickets. Lancet, ii, 4-5. (Reprinted in International Journal of Epidemiology (2003), 32, 336-337.)

Spencer, H., & Kramer, L. (1983). Antacid-induced calcium loss, Archives of Internal Medicine, 143, 657-659.

Thursday, June 18, 2009

Vitamin D and homeostasis

In my previous posts, I argued that a homeostatic mechanism keeps the level of vitamin D in our bloodstream within a certain range. When UV-B light is always intense, as in the tropics, the level seems to be 50-75 nmol/L in young adults and progressively lower in older age groups. The more sunlight varies seasonally, the more the body will produce vitamin D in summer in order to maintain at least 50 nmol/L in winter—a level well below the recommended minimum of 75 nmol/L and even further below the 150 nmol/L now being advocated by vitamin-D proponents.

This homeostatic mechanism breaks down if we daily ingest 10,000 IU of vitamin D or more (Vieth, 1999). It seems that the human body has never naturally encountered such intakes, at least not on a continual basis.

In a recent review article, Robins (2009) presents evidence for a second homeostatic mechanism. Even when the level of vitamin D varies in the bloodstream, the second mechanism ensures that these divergent levels will translate into the same concentration of the biologically active 1,25-(OH)2D metabolite.

Matsuoka et al. (1991) demonstrated that after single-dose, whole-body UVB exposure black subjects had distinctly lower serum vitamin D3 levels than whites, but differences between the two groups narrowed after liver hydroxylation to 25-OHD and disappeared after kidney hydroxylation to 1,25-(OH)2D. These findings suggest that there is a compensatory mechanism whereby, in the presence of vitamin D3 suppression by melanin, the liver and kidney hydroxylating enzymes are activated in tandem to ensure that the concentration of the biologically active 1,25-(OH)2D metabolite is normalized and kept constant regardless of ethnic pigmentation (Matsuoka et al., 1991, 1995).

Robins (2009) goes on to note that nearly half of all African Americans are vitamin-D deficient but show no signs of calcium deficiency. Indeed, they “have a lower prevalence of osteoporosis, a lower incidence of fractures and a higher bone mineral density than white Americans, who generally exhibit a much more favourable vitamin D status.” He also cites a survey of 232 black (East African) immigrant children in Melbourne, Australia, among whom 87% had levels below 50 nmol/L and 44% below 25 nmol/L. None had rickets (McGillivray et al., 2007).

References

Matsuoka, L.Y., Wortsman, J., Chen, T.C., & Holick, M.F. (1995). Compensation for the interracial variance in the cutaneous synthesis of vitamin D, Journal of Laboratory and Clinical Medicine, 126, 452-457.

Matsuoka, L.Y., Wortsman, J., Haddad, J.G., Kolm, P., & Hollis, B.W. (1991). Racial pigmentation and the cutaneous synthesis of vitamin D. Archives of Dermatology, 127, 536-538.

McGillivray, G., Skull, S.A., Davie, G., Kofoed, S., Frydenberg, L., Rice, J., Cooke, R., & Carapetis, J.R. (2007). High prevalence of asymptomatic vitamin-D and iron deficiency in East African immigrant children and adolescents living in a temperate climate. Archives of Disease in Childhood, 92, 1088-1093.

Robins, A.H. (2009). The evolution of light skin color: role of vitamin D disputed, American Journal of Physical Anthropology, early view.

Vieth, R. (1999). Vitamin D supplementation, 25-hydroxyvitamin D concentrations, and safety, American Journal of Clinical Nutrition, 69, 842-856.

Thursday, June 11, 2009

Mad dogs and ....

How can vitamin-D deficiency exist despite lengthy sun exposure? This apparent paradox was raised in my last post. The medical community now recommends bloodstream vitamin D levels of at least 75-150 nmol/L, yet these levels are not reached by many tanned, outdoorsy people.

In a study from Hawaii, vitamin D status was assessed in 93 healthy young adults who were visibly tanned and averaged 22.4 hours per week of unprotected sun exposure, with 40% reporting no use of sunscreen. Yet their mean vitamin D level was 79 nmol/L and 51% had levels below the recommended minimum of 75 nmol/L (Binkley et al., 2007).

These results are consistent with those of a study from Nebraska. The subjects were thirty healthy men who had just completed a summer of outdoor activity, e.g., landscaping, construction, farming, and recreation. One subject used sunscreen regularly and sixteen others sometimes or rarely. Their mean vitamin D level was initially 122 nmol/L. By late winter, it had fallen to 74 nmol/L (Barger-Lux & Heaney, 2002).

A study from south India found levels below 50 nmol/L in 44% of the men and 70% of the women. The subjects were described as “agricultural workers starting their day at 0800 and working outdoors until 1700 with their face, chest, back, legs, arms, and forearms exposed to sunlight” (Harinarayan et al., 2007).

These studies lead to two conclusions. First, sun exposure seems to produce vitamin D according to a law of diminishing returns: the more we expose ourselves to the sun, the less the vitamin D in our bloodstream increases. Perhaps frequent sun exposure results in less being produced in the skin and more being broken down in the liver. This might explain why the heavily sun-exposed Hawaiian subjects nonetheless had lower vitamin D levels than the Nebraskans. In the latter group, vitamin D production may be ‘calibrated’ to provide a reserve for the winter months.

Second, to stay within the recommended range of 75-150 nmol/L, we must take supplements in the form of vitamin pills or fortified foods. Sun exposure is not enough. Yet even dietary supplementation seems to be countered by some unknown mechanism within the body:


… what effect does a 400 IU/d dose of vitamin D for an extended time (months) have in adults? The answer is little or nothing. At this dose in an adult, the circulating 25(OH)D concentration usually remains unchanged or declines. This was first shown in both adolescent girls and young women. … mothers who were vitamin D deficient at the beginning of their pregnancies were still deficient at term after receiving supplements of 800-1600 IU vitamin D/d throughout their pregnancies. (Hollis, 2005)

The assembled data from many vitamin D supplementation studies reveal a curve for vitamin D dose versus serum 25-hydroxyvitamin D [25(OH)D] response that is surprisingly flat up to 250 μg (10000 IU) vitamin D/d. To ensure that serum 25(OH)D concentrations exceed 100 nmol/L, a total vitamin D supply of 100 μg (4000 IU)/d is required. (Vieth, 1999)
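
For readers puzzled by the mix of units in these quotes: μg and IU measure the same quantity, and for vitamin D the standard conversion is 1 μg = 40 IU. A minimal Python sketch of the conversion, applied to the doses quoted above:

# Vitamin D unit conversion: 1 microgram = 40 IU (standard conversion factor).
IU_PER_MICROGRAM = 40

def micrograms_to_iu(micrograms):
    return micrograms * IU_PER_MICROGRAM

def iu_to_micrograms(iu):
    return iu / IU_PER_MICROGRAM

print(micrograms_to_iu(250))   # 10000 IU/d, up to which Vieth's dose-response curve stays flat
print(micrograms_to_iu(100))   # 4000 IU/d, needed to push serum 25(OH)D above 100 nmol/L
print(iu_to_micrograms(400))   # 10 micrograms/d, the dose Hollis finds ineffective in adults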


Only mega-doses can overcome what seems to be a homeostatic mechanism that keeps bloodstream vitamin D within a certain range. Indeed, this range falls below the one that is now recommended. Curious, isn't it? Why would natural selection design us the wrong way?

Perhaps ancestral humans got additional vitamin D from some other source, such as the food they ate. In the diets of hunter-gatherers and early agriculturalists, fatty fish are clearly the best source, as seen when we rank the vitamin D content (IU per gram) of different foods (Loomis, 1967):

Halibut liver oil : 2,000-4,000
Cod liver oil : 60-300
Milk : 0.1
Butter : 0.0-4.0
Cream : 0.5
Egg yolk : 1.5-5.0
Calf liver : 0.0
Olive oil : 0.0
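
To put these potencies in perspective, here is a small illustrative Python calculation of how much of each food would be needed to supply 400 IU of vitamin D per day. The 400 IU benchmark is simply the supplement dose mentioned in the Hollis quote above, not a figure from Loomis (1967).

# Grams of each food needed to supply 400 IU of vitamin D per day.
# Potencies (IU per gram) are taken from the Loomis (1967) list above,
# using the low end of each range; the 400 IU target is only a benchmark.

iu_per_gram = {
    "halibut liver oil": 2000,
    "cod liver oil": 60,
    "egg yolk": 1.5,
    "milk": 0.1,
}

target_iu = 400

for food, potency in iu_per_gram.items():
    print(f"{food}: {target_iu / potency:,.1f} g/day")

# A few grams of fish liver oil suffice, versus roughly 4 kg of milk,
# which is why fatty fish stand out as the only substantial dietary source.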

Yet fatty fish were unavailable to many ancestral humans, if not most. And again, when vitamin D enters the blood from our diet, it seems to be limited by the same homeostatic mechanism that limits entry of vitamin D from sun-exposed skin.

It looks like natural selection has aimed for an optimal vitamin D level substantially lower than the 75-150 nmol/L now recommended. This in turn implies some kind of disadvantage above the optimal level. Indeed, Adams and Lee (1997) found evidence of vitamin D toxicity at levels as low as 140 nmol/L. But this evidence is ridiculed by Vieth (1999):

The report of Adams and Lee, together with its accompanying editorial, suggest that serum 25(OH)D concentrations as low as 140 nmol/L are harmful. This is alarmist. Are we to start avoiding the sun for fear of raising urine calcium or increasing bone resorption?

These side effects may or may not be serious. But there are others. High vitamin D intake is associated with brain lesions in elderly subjects, possibly as a result of vascular calcification (Payne et al., 2008). Genetically modified mice with high vitamin D levels show signs of premature aging: retarded growth, osteoporosis, atherosclerosis, ectopic calcification, immunological deficiency, skin and general organ atrophy, hypogonadism, and short lifespan (Tuohimaa, 2009). Vitamin D supplementation during infancy is associated with asthma and allergic conditions in adulthood (Hyppönen et al., 2004).

In this, vitamin-D proponents are guilty of some hypocrisy. They denounced the previous recommended level, saying it was just enough to prevent rickets while ignoring the possibility that less visible harm disappears only at higher intakes. Yet the current recommended level ignores the possibility that less visible harm appears at intakes well below those that cause outright vitamin D poisoning.

This being said, the pro-vitamin-D crowd may still be partly right. The optimal level might now exceed the one the human body naturally tends to maintain. With the shift to industrial processing of cereals, we today consume more phytic acid, which makes calcium unusable and thus increases the body’s need for vitamin D. We have, so to speak, entered a new adaptive landscape and our bodies have not had time to adapt.

Or they may be completely wrong. Frankly, I’m not reassured by the pro-vitamin-D literature. It strikes me as being rife with loosely interpreted facts, like the correlation between cancer rates and distance from the equator (and hence insufficient vitamin D). Cancer rates also correlate with the presence of manufacturing, which is concentrated at temperate latitudes for a number of historical and cultural reasons, notably the absence of slavery and plantation economies.

Then there’s this gem:

The concentrations of 25(OH)D observed today are arbitrary and based on contemporary cultural norms (clothing, sun avoidance, food choices, and legislation) and the range of vitamin D intakes being compared may not encompass what is natural or optimal for humans as a species (Vieth, 1999)

Actually, cultural norms are much more heliophilic today than during most of our past. In a wide range of traditional societies, people avoided the sun as much as possible, especially during the hours of peak UV (Frost, 2005, pp. 60-62). Midday was a time for staying in the shade, having the main meal, and taking a nap. Nor is there reason to believe that sun avoidance and clothing were absent among early modern humans. Upper Paleolithic sites have yielded plenty of eyed needles, awls, and other tools for making tight-fitting, tailored clothes (Hoffecker, 2002).

Heliophilia is the historical outlier, not heliophobia. It was the sunshine movement of the 1920s that first persuaded people to cast off hats, cut down shade trees, and lie on beaches for hours on end. This cultural revolution was still recent when Noël Coward wrote his 1931 piece ‘Mad Dogs and Englishmen’:

In tropical climes there are certain times of day
When all the citizens retire to tear their clothes off and perspire.
It's one of the rules that the greatest fools obey,
Because the sun is much too sultry And one must avoid its ultry-violet ray.
The natives grieve when the white men leave their huts,
Because they're obviously, definitely nuts!

Mad dogs and Englishmen go out in the midday sun,

The Japanese don’t care to, the Chinese wouldn’t dare to,
Hindus and Argentines sleep firmly from twelve to one

But Englishmen detest a siesta.
In the Philippines there are lovely screens to protect you from the glare.
In the Malay States there are hats like plates which the Britishers won't wear.
At twelve noon the natives swoon and no further work is done,
But mad dogs and Englishmen go out in the midday sun.

References

Adams, J.S., & Lee, G. (1997). Gains in bone mineral density with resolution of vitamin D intoxication. Annals of Internal Medicine, 127, 203-206.

Barger-Lux, J., & Heaney, R.P. (2002). Effects of above average summer sun exposure on serum 25-hydroxyvitamin D and calcium absorption, The Journal of Clinical Endocrinology & Metabolism, 87, 4952-4956.

Binkley, N., Novotny, R., Krueger, D., et al. (2007). Low vitamin D status despite abundant sun exposure. Journal of Clinical Endocrinology & Metabolism, 92, 2130-2135.

Frost, P. (2005). Fair Women, Dark Men. The Forgotten Roots of Color Prejudice. Cybereditions: Christchurch (New Zealand).

Harinarayan, C.V., Ramalakshmi, T., Prasad, U.V., Sudhakar, D., Srinivasarao, P.V.L.N., Sarma, K.V.S., & Kumar, E.G.T. (2007). High prevalence of low dietary calcium, high phytate consumption, and vitamin D deficiency in healthy south Indians, American Journal of Clinical Nutrition, 85, 1062-1067.

Hoffecker, J.F. (2002). Desolate Landscapes. Ice-Age Settlement in Eastern Europe. New Brunswick: Rutgers University Press.

Hollis, B.W. (2005). Circulating 25-Hydroxyvitamin D levels indicative of vitamin D sufficiency: implications for establishing a new effective dietary intake, Journal of Nutrition, 135, 317-322.

Hyppönen, E., Sovio, U., Wjst, M., Patel, S., Pekkanen, J., Hartikainen, A-L., & Järvelin, M-R. (2004). Infant Vitamin D Supplementation and Allergic Conditions in Adulthood. Northern Finland Birth Cohort 1966, Annals of the New York Academy of Sciences, 1037, 84–95.

Loomis, W.F. (1967). Skin-pigment regulation of vitamin-D biosynthesis in man, Science, 157, 501-506.

Payne, M.E., Anderson, J.J.B., & Steffens, D.C. (2008). Calcium and vitamin D intakes may be positively associated with brain lesions in depressed and non-depressed elders, Nutrition Research, 28, 285-292.

Tuohimaa, P. (2009). Vitamin D and aging, Journal of Steroid Biochemistry and Molecular Biology, 114(1-2), 78-84.

Vieth, R. (1999). Vitamin D supplementation, 25-hydroxyvitamin D concentrations, and safety, American Journal of Clinical Nutrition, 69, 842-856.


Thursday, June 4, 2009

A pseudo-epidemic?

In the late 19th century, a major concern was the poor health of industrial populations, particularly in England but also in other Western countries. The cause? For the medical profession, it seemed to be lack of sunlight. In densely packed tenements under the pall of factory smoke, not enough sunlight was getting through to kill bacteria in the air and on exposed surfaces. This argument seemed clinched by the high prevalence of rickets in children. Rickets develops when not enough calcium is absorbed from the gut. Since this absorption is aided by vitamin D, which the skin synthesizes in response to UV-B light, the conclusion was that people were not getting enough sunlight.

This consensus spread from the medical community to public health authorities, as evidenced at the 1913 congress of Québec health services: "Sunlight must illuminate all classrooms for several hours in the day; the sun is a factor of gaiety, and it is the best natural disinfectant" (Labarre, 1913). Another speaker denounced the dimly lit homes of Canadian farmers:

In these homes, natural light, no more than pure air, hardly has its rightful place. It is in many cases intercepted by blinds, shutters, thick curtains … In a word, our people have not learned or do not sufficiently understand these laws of hygiene, fundamental laws of capital importance, which consist in wise and frequent ventilation and in salutary lighting of houses by letting the sun's beneficial rays flood in abundantly. (Savard, 1913)

A movement thus took shape to bring sunlight into every area of life. The means were many and varied: public beaches, fresh-air camps, summer resorts, outdoor youth movements, relocation of the working class to the suburbs, early closure of businesses to let workers walk home in the sun, and more and bigger windows on buildings.

The ‘sunshine movement’ became culturally dominant during the 1920s, fueled in part by the Spanish flu epidemic of 1918. The mid-decade was the tipping point. The 1925 novel The Great Gatsby features a woman with "sun-strained eyes," a "slender golden arm," a "brown hand," a "golden shoulder," and a "face the same brown tint as the fingerless glove on her knee" (Fitzgerald, 1992, pp. 15, 47, 57, 84, 185). In 1926, a Connecticut radio station announced that "a coat of tan seems to be the latest style in natural coloring at this season of the year. [It has] been increasing in favor during the last few years" (Nickerson, 1926). In 1929, a fashion magazine, The Delineator, affirmed that all women would appear incompletely beautiful if not made entirely brown or at least golden by the sun (Cole, 1929). The same year, the readers of Vogue were told, "The 1929 girl must be tanned" and "A golden tan is the index of chic" (Vogue, 1929).

This was a big change, as recalled later in a 1938 poem by Patience Strong (Strong, 1938, p. 37). It begins with a crowded beach “full of lovely girls in scant attire – stretched out full length upon the sand beneath the Sun’s fierce fire.” Then amidst the throng, she sees a lone girl:

Her pretty little parasol she carried with an air;
she wore long gloves – a shady hat – and how the
folks did stare! Protected from the sun, her skin
looked smooth and soft as silk; her cheeks were pink
as roses, and her throat as pale as milk.

And suddenly like magic she had disappeared from
view. She had vanished like a vision that dissolves
into the blue. “Come back! Come back!” I cried to
her. But she had passed away;
and then I knew that I had seen the Ghost of Yesterday.

Much of our modern culture can be traced to the sunshine movement. Without it, we would have no public beaches or winter trips to the Caribbean. Early afternoon would be a time for staying indoors. We would have a more densely built urban environment with less sprawl and taller buildings closer to streets. Demographics, too, would look different. The suburbs not having the same allure, the old-stock population would have remained in the inner city, with the suburbs being home to newer groups (as is the pattern in France). Perhaps even sexual morality might have taken another path. After all, it was the sunshine movement that increasingly exposed the human body to public view, notably on beaches and in the street. Public space thus became sexualized to a degree hitherto unthinkable.

Ironically, this cultural revolution may all have begun through a misunderstanding. Doubts have already been expressed about whether lack of sunlight explains the poor health of industrial towns and cities in late 19th-century Britain. Malnutrition and poor sanitation were likelier causes. Now there is reason to doubt whether this factor explains the rickets epidemic of the same period.

Today, rickets is most common not where sunlight is weak but where sunlight is quite strong—the Middle East and South Asia. The cause is dietary, specifically low consumption of calcium and high consumption of foods rich in phytic acid, such as unleavened bread or chapatti (Berlyne et al., 1973; Harinarayan et al., 2007). Phytic acid strongly binds to calcium and makes it unusable, with the result that less calcium is available to the body. It is this calcium depletion—and not lack of vitamin D—that causes rickets in the Middle East and South Asia.

In the Western world, phytic acid is present in industrially processed cereals, particularly the high-fiber ones that have become popular in recent years (Sandberg, 1991). Before the industrial age, it was much less present in Western diets:

In the archaeological record, rickets is rare or absent in preagricultural human skeletons, while the prevalence increases during medieval urbanization and then explodes during industrialism. In the year 1900, an estimated 80-90 per cent of Northern European children were affected. This can hardly be explained only in terms of decreasing exposure to sunlight and decreased length of breast-feeding. An additional possible cause is a secular trend of increasing intake of phytate since cereal intake increased during the Middle Ages and since old methods of reducing the phytate content such as malting, soaking, scalding, fermentation, germination and sourdough baking may have been lost during the agrarian revolution and industrialism by the emergence of large-scale cereal processing. The mentioned methods reduce the amount of phytic acid by use of phytases, enzymes which are also present in cereals. These enzymes are easily destroyed during industrial cereal processing. (Paleolithic Diet Symposium)

We thus have the apparent paradox of rickets in the face of normal vitamin D levels. This was shown in a case study from the 1970s of rickets in a Bedouin woman:

Vitamin D was present in normal amounts in the plasma of our patient so this excludes the premise that she was deprived of vitamin D. Bedouin women are sunburned over the anterior half of their head and forearms. They go about their tasks at home unveiled. Vitamin D levels would be expected to be normal from the area of skin available for irradiation and the intensity of sunlight in this area. (Berlyne et al., 1973)

She might still have been vitamin-D deficient. Recommended vitamin D levels have since been raised and now range between 75 nmol/L and 150 nmol/L. These new levels, however, are based on data from a North American population that is consuming ever higher levels of phytic acid, particularly with the popularity of high-fiber diets. It’s also doubtful whether such levels can be attained even with considerable sun exposure. Binkley et al. (2007) studied the vitamin D status of 93 healthy young adults from Hawaii. They had an average of 22.4 hours per week of unprotected sun exposure, 40% reported never having used sunscreen, and all were visibly tanned. Yet their mean vitamin D level was 79 nmol/L and 51% had levels below 75 nmol/L.

This study may surprise those who’ve heard that 15 minutes of sunshine every other day will provide more than enough vitamin D. Well, that figure is just a back-of-the-envelope calculation. It makes a lot of assumptions about things we don’t fully know. The truth is that we still know little about the different feedback loops that maintain vitamin D in the human body, especially at the levels that now seem necessary.

This study also calls into question the media-fueled perception that North Americans are facing severe vitamin D deficiency because of sun avoidance and excessive sunscreen use. Such a perception is at odds with the rising incidence of skin cancer, particularly among 20- to 30-year-olds. The trend actually seems to be pointing in the other direction: people are exposing themselves more to the sun, not less.

All this is not to say that vitamin D cannot help people who lack calcium because they consume too much phytic acid. Of course it can. Modern diets have created a new adaptive equilibrium that requires higher levels of vitamin D. We could, however, get the same health outcome by changing industrial processing of cereals, specifically by eliminating the heat treatments that inactivate phytases and by allowing these enzymes to reduce the phytic acid content.

References

Berlyne, G.M., Ari, J.B., Nord, E., & Shainkin, R. (1973). Bedouin osteomalacia due to calcium deprivation caused by high phytic acid content of unleavened bread. The American Journal of Clinical Nutrition, 26, 910-911.

Binkley, N., Novotny, R., Krueger, D., et al. (2007). Low vitamin D status despite abundant sun exposure. Journal of Clinical Endocrinology & Metabolism, 92, 2130-2135.

Cole, C.C. (1929). La Revue Moderne, July 1929, p. 16.

Fitzgerald, F.S. (1992). The Great Gatsby, New York: Collier Books.

Harinarayan, C.V., Ramalakshmi, T., Prasad, U.V., Sudhakar, D., Srinivasarao, P.V.L.N., Sarma, K.V.S., & Kumar, E.G.T. (2007). High prevalence of low dietary calcium, high phytate consumption, and vitamin D deficiency in healthy south Indians, American Journal of Clinical Nutrition, 85, 1062-1067.

Labarre, M.J.P. (1913). De l’hygiène scolaire et de son influence sur le physique et le moral des écoliers, Bulletin Sanitaire, Conseil d’hygiène de la province de Québec, 13, 86-98.

Nickerson, E.C. (1926). Nature's Cosmetics, Bulletin sanitaire, 26(5),134-140.

Sandberg, A.S. (1991). The effect of food processing on phytate hydrolysis and availability of iron and zinc. Advances in Experimental Medical Biology, 289, 499-508.

Savard, A. (1913). Ce que doit être l’Organisation Municipale pour la lutte contre la Tuberculose, Bulletin Sanitaire, Conseil d’hygiène de la province de Québec, 13, 129-150.

Strong, P. (1938). The Sunny Side, London: Muller.

Vogue (1929). June 22, pp. 99, 100.