A nurse is teaching a client who is at 10 weeks of gestation about nutrition during pregnancy

It’s always important to eat a balanced diet — and it’s even more important when you’re pregnant because what you eat is the main source of nutrients for your baby. However, many women don’t get enough iron, folate, calcium, vitamin D, or protein. So when you are pregnant, it is important to eat more of the foods that contain these nutrients.

Most women can meet their increased needs with a healthy diet that includes plenty of fruits, vegetables, whole grains, and proteins. According to the American College of Obstetricians and Gynecologists (ACOG), you should try to eat a variety of foods from these basic food groups. If you do, you are likely to get all the nutrients you need for a healthy pregnancy.

Key Nutrients You Need

According to ACOG, you and your baby need these key nutrients for a healthy pregnancy:

Calcium

Helps to build strong bones and teeth. Main sources include milk, cheese, yogurt, and sardines. During pregnancy you need 1,000 milligrams (mg) daily.

Iron

Helps red blood cells deliver oxygen to your baby. Sources include lean red meat, dried beans, peas, and iron-fortified cereals. During pregnancy you need 27 mg daily.

Vitamin A

You need this vitamin for healthy skin, eyesight, and bone growth. Carrots, dark leafy greens, and sweet potatoes are good sources. During pregnancy you need 770 micrograms daily.

Vitamin C

Promotes healthy gums, teeth, and bones, and helps your body absorb iron. Good sources include citrus fruit, broccoli, tomatoes, and strawberries. During pregnancy you need 85 mg daily.

Vitamin D

Aids your body in the absorption of calcium to help build your baby’s bones and teeth. Sources include exposure to sunlight, fortified milk, and fatty fish, such as salmon. During pregnancy you need 600 international units (IUs) daily.

Vitamin B6

Helps form red blood cells and helps your body use protein, fat, and carbohydrates. You can find vitamin B6 in beef, liver, pork, whole-grain cereals, and bananas. During pregnancy you need 1.9 mg daily.

Vitamin B12

Helps form red blood cells and maintains your nervous system. You can find this vitamin only in animal products. Good sources include liver, meat, fish, poultry, and milk. During pregnancy you need 2.6 micrograms daily.

Folate (Folic Acid)

A B vitamin important in the production of blood and protein, it also reduces the risk of neural tube defects (a birth defect of the brain and spinal cord). You can find folate in green, leafy vegetables, liver, orange juice, legumes (beans, peas, lentils), and nuts.

You must get at least 400 micrograms of folate daily before pregnancy and during the first 12 weeks of pregnancy to reduce the risk of neural tube defects. During pregnancy, doctors recommend you get 600 micrograms daily.

Weight Gain

Weight gain is important during your pregnancy and something you and your doctor will monitor for nine months until you give birth. However, gaining too much or too little weight can contribute to problems during your pregnancy for both you and your baby.

Just because you are eating for two doesn’t mean you should eat twice the amount of food. If you are a healthy weight before your pregnancy, you only need to eat an average of about 300 extra calories a day.

Recent recommendations by the Institute of Medicine for pregnancy weight gain are based on your pre-pregnancy body mass index (BMI):

Pre-pregnancy BMI category          Weight gain goal: single   Weight gain goal: twins
Underweight (BMI less than 18.5)    28-40 lbs                  Not enough data
Normal weight (BMI 18.5-24.9)       25-35 lbs                  37-54 lbs
Overweight (BMI 25-29.9)            15-25 lbs                  31-50 lbs
Obese (BMI 30 and above)            11-20 lbs                  25-42 lbs

Multiple Births

If you are expecting more than one baby, you should discuss what and how much to eat with your health care provider. Your nutrient and calorie needs are higher than those of women carrying one baby.

Prenatal Vitamins

Vitamin and mineral supplements cannot replace a healthy diet. Most doctors recommend that pregnant women take a prenatal vitamin and mineral supplement every day in addition to eating a healthy diet.

Taking a supplement ensures that you and your baby get enough important nutrients like folic acid and iron. But don't overdo it — taking too much can be harmful for you and your baby.

Alcohol, Caffeine, and Fish

  • Pregnant women and women who may become pregnant should not drink alcohol. Drinks containing alcohol include beer, wine, liquor, mixed drinks, and malt beverages.

    Even moderate drinking during pregnancy can cause behavioral or developmental problems for a baby. Heavy drinking during pregnancy can result in serious problems for the baby, including malformation and intellectual disability.

  • While it’s unclear whether or not high caffeine intake leads to miscarriage, it appears that moderate caffeine intake (the amount in about two 8-ounce cups of coffee a day) does not.

    Still, it’s probably a good idea to limit caffeine in your diet during your pregnancy. Too much caffeine can interfere with sleep, contribute to nausea, and lead to dehydration.

  • Fish can be a great source of protein, omega-3 fatty acids, and other healthy nutrients. But pregnant women should take care to avoid certain kinds of fish because they contain high levels of mercury, which can harm a growing baby. Fish you should avoid include shark, swordfish, king mackerel, and tilefish.

Among healthy human beings, pregnant women and rapidly growing infants are most vulnerable to iron deficiency (Bothwell et al., 1979). Both groups have to absorb substantially more iron than is lost from the body, and both are at a considerable risk of developing iron deficiency under ordinary dietary circumstances. During pregnancy, more iron is needed primarily to supply the growing fetus and placenta and to increase the maternal red cell mass (Hallberg, 1988).

Iron deficiency is common among pregnant women in industrialized countries, as shown by numerous studies in which hemoglobin concentrations during the last half of pregnancy were found to be higher in iron-supplemented women than in those given a placebo or no supplement (Table 14-1) (Chanarin and Rothman, 1971; Dawson and McGanity, 1987; Puolakka et al., 1980b; Romslo et al., 1983; Svanberg et al., 1976a; Taylor et al., 1982; Wallenburg and van Eijk, 1984). This higher hemoglobin concentration as a result of an improved iron supply not only increases the oxygen-carrying capacity, but it also provides a buffer against the blood loss that will occur during delivery (Hallberg, 1988).

Iron is essential for the production of hemoglobin, which functions in the delivery of oxygen from the lungs to the tissues of the body, and for the synthesis of iron enzymes, which are required to utilize oxygen for the production of cellular energy (Bothwell et al., 1979).

Anemia is defined as a hemoglobin concentration that is more than 2 standard deviations below the mean for healthy individuals of the same age, sex, and stage of pregnancy. Although iron deficiency is the most common cause of anemia, infection, genetic factors, and many other conditions can also lead to anemia. Iron depletion is generally described in terms of three stages of progressively increasing severity (Bothwell et al., 1979):

1. depletion of iron stores
2. impaired hemoglobin production (or iron deficiency without anemia)
3. iron deficiency anemia

The first stage, depletion of iron, is characterized by a low serum ferritin level. This stage is the most difficult to define because it involves an arbitrary decision about how low iron stores should be before they are considered depleted. This is a particularly thorny issue with respect to pregnancy, because storage iron, estimated from bone marrow aspirates (or less directly by the serum ferritin), is low or absent in most women during the third trimester, whether (Svanberg et al., 1976a) or not (Heinrich et al., 1968; Svanberg et al., 1976a) they have received an iron supplement. For this reason, the subcommittee considered low iron stores in late pregnancy to be physiologic and reserved the term iron deficiency for the second and third stages. The second stage, impaired hemoglobin production, is recognized by laboratory tests that indicate an insufficient supply of iron to developing red blood cells, such as a low ratio of serum iron to total iron-binding capacity (Fe/TIBC), a low mean corpuscular volume (MCV), and/or an elevated erythrocyte protoporphyrin (EP) level, but with a hemoglobin concentration that remains within the normal reference range. Iron deficiency anemia (the third stage) refers to an anemia (e.g., hemoglobin values below the 5th percentile in Figure 14-1) that is associated with additional laboratory evidence of iron deficiency, such as a low serum ferritin level, a low serum Fe/TIBC, a low MCV, or an elevated EP level (also see the section Laboratory Characteristics of Impaired Hemoglobin Production).

Although some epidemiologic evidence suggests that anemia during pregnancy could be harmful to the fetus, the data are far from conclusive. In a report of more than 54,000 pregnancies in the Cardiff area of South Wales, the risk of low birth weight, preterm birth, and perinatal mortality was found to be higher when the hemoglobin concentration was in the anemic range—<10.4 g/dl before 24 weeks of gestation—compared with a midrange hemoglobin concentration of 10.4 to 13.2 g/dl (Murphy et al., 1986). Elevated hemoglobin values of >13.2 g/dl were also associated with an increased risk of the same poor pregnancy outcomes, perhaps because such values are characteristic of women who develop preeclampsia (hypertension accompanied by generalized pitting edema or proteinuria after week 20 of gestation), who are similarly at risk. Another pertinent study is that of Garn and coworkers (1981), which was based on more than 50,000 pregnancies in the National Collaborative Perinatal Project of the National Institute of Neurological and Communicative Disorders and Stroke. As in the South Wales study, there was a U-shaped relationship between the maternal hemoglobin or hematocrit level during pregnancy and the pregnancy outcome. When the lowest hemoglobin concentration during any stage of pregnancy was below 10.0 g/dl, the likelihood of low birth weight, preterm birth, and perinatal mortality was increased. A hemoglobin concentration that was high during pregnancy (>13.0 g/dl) was also associated with these poor pregnancy outcomes.

In most populations, iron deficiency is by far the most common cause of anemia before 24 weeks of gestation (Puolakka et al., 1980c). It seems plausible, therefore, that iron deficiency could account for the higher risk to the fetus among the anemic pregnant women in the studies described above. However, a cause-and-effect relationship has not been established. Iron deficiency and anemia are more common in blacks and in those of low socioeconomic status, those with multiple gestations, and those with limited education (LSRO, 1984). Any of these confounding factors could be related to a poor pregnancy outcome independently of iron deficiency.

Additional studies indicate a link between maternal anemia at full term and low birth weight (Klein, 1962; Lieberman et al., 1987; Macgregor, 1963), but interpretation of the results is complicated by the fact that the hemoglobin concentration normally rises in the third trimester of pregnancy (Figure 14-1) if sufficient iron is available (Puolakka et al., 1980b; Sjöstedt et al., 1977; Svanberg et al., 1976a; Taylor et al., 1982). An association between a low maternal hemoglobin concentration at delivery and low birth weight can be expected since lower hemoglobin values are characteristic of an earlier stage of gestation.

The infant of an iron-depleted mother shows surprisingly little evidence of anemia or depletion of iron stores. Numerous studies in which serum ferritin was used to estimate the neonatal iron stores of infants born to iron-deficient mothers or to iron-sufficient or iron-supplemented mothers show relatively little difference (Agrawal et al., 1983; Fenton et al., 1977; Kaneshige, 1981; Kelly et al., 1978; MacPhail et al., 1980; Milman et al., 1987; Puolakka et al., 1980a) or no significant difference (Bratlid and Moe, 1980; Celada et al., 1982; Hussain et al., 1977; Messer et al., 1980; Rios et al., 1975; van Eijk et al., 1978). Hemoglobin concentration in the newborn was unaffected or minimally affected in most studies (Agrawal et al., 1983; Murray et al., 1978; Sisson and Lund, 1958; Sturgeon, 1959). In the study reported by Sturgeon, hemoglobin concentrations of 6-, 12-, and 18-month-old infants of iron-supplemented mothers were similar to those in infants of unsupplemented mothers. Only in two studies, both from developing countries, was it concluded that newborn infants of anemic mothers were also anemic, although to a far lesser degree than their mothers (Nhonoli et al., 1975; Singla et al., 1978). However, comparable studies from similar settings did not confirm this finding (Agrawal et al., 1983; Murray et al., 1978), suggesting that other nutritional deficiencies and such factors as infection (malaria) might explain the disagreement. Overall, there is little or no laboratory evidence that infants of iron-deficient mothers are more likely to be iron deficient, but it is possible that the risk of low birth weight, prematurity, and perinatal mortality may be increased.

Data on the prevalence of iron deficiency among women during the childbearing years in the United States are available mainly from the second (1976–1980) National Health and Nutrition Examination Survey (NHANES II) (LSRO, 1984). Population estimates were based on a combination of laboratory indices—EP, MCV, and Fe/TIBC—in a nationally representative sample of 2,474 women aged 20 to 44 and 697 younger women aged 15 to 19. Serum ferritin assays were done on a subset of approximately 30% of this population; it was a relatively new assay at the time. Too few pregnant women were included in the survey for detailed analysis.

Two sets of laboratory criteria were used to estimate the prevalence of impaired iron status, which was defined as two or three abnormal laboratory test results out of a set of three tests for iron status. This approach had been found to be more reliable in relation to anemia than the use of any single test (Cook et al., 1976). In the so-called MCV model, MCV, Fe/TIBC, and EP were used for the analysis; these laboratory results were available for most subjects. In the ferritin model, the serum ferritin concentration was substituted for the MCV and probably represents an earlier stage of iron deficiency. Impaired iron status in either model can be considered to be equivalent to iron deficiency, taking into consideration that infection and chronic disease can be confounding factors by mimicking the laboratory abnormalities of iron deficiency.
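The two-or-more-abnormal rule is simple enough to state as a short sketch. This is only an illustration of the logic described above; the function name and the boolean encoding are mine, not part of the NHANES II methodology.

```python
# Sketch of the "two or three abnormal results out of three tests" rule
# used to define impaired iron status. In the MCV model the three tests
# are MCV, Fe/TIBC, and EP; in the ferritin model, serum ferritin
# replaces MCV. Encoding is illustrative only.

def impaired_iron_status(test_abnormal):
    """test_abnormal: three booleans, one per laboratory test result."""
    return sum(test_abnormal) >= 2

# Example: low MCV and elevated EP but a normal Fe/TIBC counts as impaired.
print(impaired_iron_status([True, False, True]))  # True
```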

Among nonpregnant women between the ages of 20 and 44, estimated percentages of impaired iron status varied according to model used: 9.6 ± 1.3% (standard error of the mean [SEM]) as determined by the ferritin model, and 5.4 ± 0.5% as determined by the MCV model. Iron deficiency anemia (two or three abnormal values and hemoglobin < 11.9 g/dl) among nonpregnant white women aged 20 to 44 was less than 2% as determined by both models.

If the prevalence of iron deficiency among pregnant women were no higher than the 5 to 10% reported in NHANES II for nonpregnant women of childbearing age, there would be little basis for considering routine iron supplementation during pregnancy. However, it is generally agreed that both iron needs and prevalence of iron deficiency increase substantially during pregnancy (Hallberg, 1988). In a paper on the worldwide prevalence of anemia written for the World Health Organization, the global prevalence of anemia was estimated at 51% among pregnant women, compared with 35% among women in general, including pregnant women (DeMaeyer and Adiels-Tegman, 1985). Most of the anemia was attributed to iron deficiency. The higher prevalence for pregnant women is consistent with the estimated high iron needs during pregnancy (Bothwell et al., 1979; Hallberg, 1988; see also section Iron Requirements for Pregnancy).

The most convincing evidence that pregnant women in industrialized countries often cannot meet their iron needs from diet alone comes from three careful longitudinal studies from northern European countries. Groups of iron-supplemented and unsupplemented pregnant women were followed with laboratory studies from early pregnancy at 4-week intervals (Puolakka et al., 1980b; Svanberg et al., 1976b; Taylor et al., 1982). In all these studies, the hemoglobin values in the unsupplemented group were significantly lower than those in the supplemented group after 24 to 28 weeks of gestation. The mean difference was 1.0 to 1.7 g/dl between weeks 35 and 40 of gestation. In the latter two studies, the means were more than 2 standard deviations apart during this period, indicating a high prevalence of impaired hemoglobin production because of lack of iron in the unsupplemented group. Thus, even though there are no good prevalence data for iron deficiency during pregnancy, it is reasonable to infer that the prevalence is high.

Data from 1976–1980 NHANES II and 1982–1984 Hispanic HANES (HHANES) suggest that low socioeconomic status, low level of education, black or Hispanic background, and high parity were associated with iron deficiency (impaired iron status) in the MCV model for nonpregnant women (LSRO, 1984, 1989). It is reasonable to infer that the same factors would play a role, probably to a greater degree, when iron demands are drastically increased during the last half of pregnancy. The following factors are associated with an increased risk of iron deficiency:

  • Pregnancy (second and third trimesters)

  • Menorrhagia (loss of more than 80 ml of blood per month)

  • Diets low in both meat and ascorbic acid

  • Multiple gestation

  • Blood donation more than three times per year

  • Chronic use of aspirin

The prevalence of iron deficiency in NHANES II tended to be higher among the poor; the difference was of borderline significance (p < .1) for women aged 20 to 44 and significant (p < .05) for women aged 15 to 19. The percentages were 5.1 ± 0.5% (SEM) and 3.6 ± 1.0%, respectively, for those above the poverty level and 7.8 ± 1.5% and 8.2 ± 1.8%, respectively, for those below the poverty level. Iron deficiency was also more common among women between the ages of 20 and 44 with limited education (13.4 ± 2.8%) compared with those with high school (5.4 ± 0.6%) or college (4.2 ± 0.8%) education.

The prevalence of iron deficiency using the MCV model among 20- to 44-year-old Mexican-American women in HHANES was substantially higher than that among non-Hispanic whites in NHANES II: 11.9 ± 2.0% compared with 5.4 ± 0.5%, respectively. Average parity among the Mexican-American women was considerably higher than that among non-Hispanic whites, and this probably contributed to the higher prevalence of iron deficiency among the Mexican-American women. In NHANES II, impaired iron status was also more prevalent with increasing parity: iron deficiency was present in 3.1 ± 0.5% of women with no children, in 3.8 ± 0.8% of those with one or two children, in 9.4 ± 1.1% of those with three or four children, and in 11.5 ± 2.1% of those with five or more children. In NHANES II, the same prevalence of iron deficiency (impaired iron status) was found among black women aged 20 to 44 as among white women in the same age group (5.7 ± 0.9 and 5.0 ± 0.6%, respectively). Among the smaller samples of teenagers, there was a difference (3.8 ± 0.9 and 12.6 ± 4.7%, respectively) of borderline significance (p < .1).

Many risk factors, such as poverty, ethnic background, education, and parity, are closely interrelated. Unfortunately, the effects of these interrelationships have not been systematically studied. Most studies have focused on only one factor at a time.

Teenagers may also have an increased risk of iron deficiency because of the high iron requirements imposed by their recent growth spurt (Dawson and McGanity, 1987). In NHANES II, females aged 15 to 19 had a 4.9 ± 1.1% prevalence of iron deficiency as determined by the MCV model and 14.2 ± 3.5% as determined by the ferritin model. The sample was small, and the percentages did not differ significantly from those of the corresponding group of women between the ages of 20 and 44.

The total amount of iron in the average woman's body is about 2.2 g (Bothwell et al., 1979), which is equal to the weight of a dime. Most of this iron can be considered essential because it functions in the transport and utilization of oxygen for the production of cellular energy. Two compounds, ferritin and hemosiderin, serve as a reserve. Iron from these compounds can be mobilized for the production of essential compounds when the supply of dietary iron is insufficient. The vulnerability of an individual to iron deficiency depends on the amount of iron stored.

Catabolized iron is efficiently reutilized, and very little iron is lost from the body except through bleeding. Normal iron losses average approximately 0.9 mg/day in adult men—the population that has been the most thoroughly studied (Green et al., 1968). The corresponding value for women, excluding menstrual losses, is estimated to be about 0.8 mg/day. Menstrual iron losses average about 0.5 mg/day. When this is added to the other losses of 0.8 mg/day, the total is 1.3 mg/day.

Excessive menstrual blood loss (menorrhagia), defined as >80 ml/month, occurs in about 10% of women (Cole et al., 1971; Hallberg et al., 1966). This is equivalent to 1 mg of iron or more lost per day, more than twice the average menstrual iron loss. Menstrual blood loss of >80 ml/month commonly results in iron deficiency (Hallberg et al., 1966). Consequently, some women face the increased iron demands of pregnancy with an already established iron deficiency. Menstrual blood loss varies markedly among women, but in any given woman, there is relatively little variation in the amount of blood lost from one month to the next (Hallberg et al., 1966). Unfortunately, even a careful history can barely distinguish groups of women whose volume of blood loss would be expected to differ on the basis of oral contraceptive use (Frassinelli-Gunderson et al., 1985).
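To see where the 1 mg/day figure comes from, assume (a round conversion not stated in the text) that each milliliter of whole blood contains roughly 0.45 mg of iron:

$$ \frac{80\ \text{ml/month} \times 0.45\ \text{mg/ml}}{30\ \text{days/month}} \approx 1.2\ \text{mg of iron per day}. $$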

Women who take oral contraceptive agents will, on average, halve their menstrual blood loss, whereas those who use intrauterine devices (now rare in the United States) will roughly double it (Hefnawi et al., 1974; Israel et al., 1974). Users of oral contraceptives have substantially higher iron stores than do nonusers, based on serum ferritin values (Frassinelli-Gunderson et al., 1985). In NHANES II, 18% of women aged 20 to 44 took oral contraceptives.

Other common reasons for increased iron losses are blood donation and aspirin ingestion. Women who donate more than three units of blood per year are at high risk of being iron deficient (Finch et al., 1977; Simon et al., 1981), unless they take iron supplements regularly. Aspirin ingestion amounting to 300 mg (one tablet) four times a day increases intestinal blood loss about tenfold above the normal average of about 0.5 ml/day (Pierson et al., 1961). These two rates are equivalent to about 0.2 and 2.0 mg of iron loss per day, respectively.
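These equivalences can be reconstructed under a rough assumed conversion of about 0.4 mg of iron per milliliter of blood; the 200 mg of iron lost per donation is cited later in this chapter (Finch et al., 1977):

$$ 3\ \text{donations/year} \times 200\ \text{mg} \div 365\ \text{days} \approx 1.6\ \text{mg/day}, $$
$$ 0.5\ \text{ml/day} \times 0.4\ \text{mg/ml} \approx 0.2\ \text{mg/day}, \qquad 5\ \text{ml/day} \times 0.4\ \text{mg/ml} \approx 2.0\ \text{mg/day}. $$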

The amount of iron in the body is determined mainly by the percentage of food iron absorbed from the intestine—a percentage that can vary more than 20-fold (Charlton and Bothwell, 1983; Hallberg, 1981). The bioavailability of iron—that is, the proportion absorbed from food—is determined both by the nature of the diet and by a regulatory mechanism in the intestinal mucosa that is responsive to the abundance of storage iron.

Two types of iron are present in food: heme iron, which is found principally in animal products, and nonheme iron, which is found mainly in plant products. Most of the iron in the diet, an average of more than 88% (Raper et al., 1984), is present as nonheme iron and consists primarily of iron salts. The absorption of nonheme iron is strongly influenced by its solubility in the upper part of the small intestine, which in turn depends on the composition of the meal as a whole (Charlton and Bothwell, 1983; Hallberg, 1981).

Iron absorption tends to be poor from meals in which whole-grain cereals and legumes predominate. Phytates in whole-grain cereals, calcium and phosphorus in milk, tannin in tea, and polyphenols in many vegetables all inhibit iron absorption by decreasing the intestinal solubility of nonheme iron from the entire meal. The addition of even relatively small amounts of meat and ascorbic acid-containing foods substantially increases the absorption of iron from the entire meal by keeping nonheme iron more soluble. For example, compared with water, orange juice will roughly double the absorption of nonheme iron from a meal. Tea and coffee, on the other hand, will cut the absorption of nonheme iron by more than half when compared with water (Hallberg, 1981; Rossander et al., 1979). Thus, modifications in the diet offer great scope for improving iron absorption during pregnancy. Consumption of meals containing enhancers of iron absorption, such as meat and ascorbic acid-rich fruits and vegetables, and avoidance of strong inhibitors such as tea should do much to prevent iron deficiency (Monsen et al., 1978). However, there is little information on the effectiveness of dietary counseling in preventing iron deficiency.
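As a hedged numerical illustration (the 4-mg meal content and the 5% baseline absorption are assumed round figures, not values from the text), the enhancer and inhibitor effects described above work out to roughly:

$$ 4\ \text{mg} \times 5\% = 0.2\ \text{mg (with water)}, \quad 4\ \text{mg} \times 10\% = 0.4\ \text{mg (with orange juice)}, \quad 4\ \text{mg} \times 2.5\% = 0.1\ \text{mg (with tea)}. $$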

Heme iron is derived primarily from the hemoglobin and myoglobin in meat, poultry, and fish. Although heme iron accounts for a smaller proportion of iron in the diet than nonheme iron does, a much greater percentage of heme iron is absorbed, and its absorption is relatively unaffected by other dietary constituents (Hallberg, 1981).

When both forms of iron in the diet are considered, the average total dietary iron absorption by men is about 6% and by women in their childbearing years, 13% (Charlton and Bothwell, 1983). The higher absorption in women is related to their lower iron stores and helps to compensate for the losses of iron associated with menstruation.
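For example, with dietary intakes assumed for illustration only (they are not given in the text), a man consuming 15 mg of iron per day and a woman consuming 12 mg per day would absorb about

$$ 15\ \text{mg} \times 0.06 \approx 0.9\ \text{mg/day} \quad \text{and} \quad 12\ \text{mg} \times 0.13 \approx 1.6\ \text{mg/day}, $$

respectively, figures that roughly match the 0.9 and 1.3 mg/day losses described earlier.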

Entry of soluble nonheme iron into the body is regulated in the mucosal cell of the small intestine (Charlton and Bothwell, 1983), but the mechanism remains uncertain (Davidson and Lönnerdal, 1988; Huebers and Finch, 1987; Peters et al., 1988). If iron stores are low, the intestinal mucosa readily takes up nonheme iron and increases the proportion that is absorbed from the diet. During the course of pregnancy, as iron stores decrease, the absorption of dietary nonheme iron increases (Svanberg et al., 1976b). However, the adequacy of this homeostatic response is limited by the amount of absorbable iron in the diet and the high iron requirements for pregnancy.

The body iron requirement for an average pregnancy is approximately 1,000 mg. Hallberg (1988) calculated that 350 mg of iron is lost to the fetus and the placenta and 250 mg is lost in blood at delivery. In addition, about 450 mg of iron is required for the large increase in maternal red blood cell mass. Lastly, basal losses of iron from the body continue during pregnancy and amount to about 240 mg. Thus, the total iron requirements of a pregnancy (excluding blood loss at delivery) average about 1,040 mg. Permanent iron losses during pregnancy include loss to the fetus and placenta, blood loss at delivery, and basal losses, which together total 840 mg.
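Written out, Hallberg's components sum as follows (all figures from the paragraph above):

$$ \underbrace{350\ \text{mg}}_{\text{fetus and placenta}} + \underbrace{450\ \text{mg}}_{\text{red cell mass}} + \underbrace{240\ \text{mg}}_{\text{basal losses}} = 1{,}040\ \text{mg (delivery blood loss excluded)}, $$
$$ \underbrace{350\ \text{mg}}_{\text{fetus and placenta}} + \underbrace{250\ \text{mg}}_{\text{delivery blood loss}} + \underbrace{240\ \text{mg}}_{\text{basal losses}} = 840\ \text{mg permanently lost}. $$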

The total iron needs of slightly more than 1,000 mg are concentrated in the last two trimesters of pregnancy. This amount is equivalent to about 6 mg of iron absorbed per day in a woman who starts pregnancy with absent or minimal storage iron. This is a large amount of iron to accumulate over a 6-month period, especially when compared with the average total body iron content of 2,200 mg and the 1.3 mg of iron absorbed per day by nonpregnant women.
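The 6-mg figure is simply the total requirement spread over the roughly 180 days of the second and third trimesters:

$$ \frac{1{,}040\ \text{mg}}{180\ \text{days}} \approx 5.8 \approx 6\ \text{mg of absorbed iron per day}. $$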

Although 450 mg of iron for red cell production must be supplied during pregnancy, a large part of this can subsequently augment iron stores after a vaginal delivery, when the red cell mass decreases. The result is analogous to a postpartum injection of iron: serum ferritin levels will spontaneously increase within a few months after delivery in most women who develop mild iron deficiency during late pregnancy because of the iron that is released by the decline in red cell mass (Puolakka et al., 1980b; Svanberg et al., 1976a; Taylor et al., 1982). Postpartum iron status is also improved by the decreased iron loss during this period: less than 0.3 mg/day is lost in human milk, and menstruation is rare in women during their first few months of lactation.

The average blood loss during a cesarean delivery is almost twice that occurring with the average vaginal delivery of a single fetus (Pritchard, 1965; Pritchard et al., 1962; Ueland, 1976); the postpartum improvement in iron status may therefore be less complete after a cesarean delivery.

To establish reasonable goals for iron nutrition during pregnancy, it is helpful to distinguish between the potential for iron deficiency as reflected by low iron stores alone and iron deficiency that results in impaired hemoglobin production. In the absence of other laboratory evidence of iron deficiency, low serum ferritin is not associated with any deficits in physiologic function (Bothwell et al., 1979; Dallman, 1986). For this reason, an acceptable goal for iron nutrition during pregnancy is simply to avoid progression beyond low iron stores (first stage of iron depletion) to the stages of impaired hemoglobin production (second stage) or iron deficiency anemia (third stage).

At the second stage of iron depletion (impaired hemoglobin production), the hemoglobin concentration is typically at the lower end of the normal range but not low enough to meet the definition of anemia. Nevertheless, there is the potential for impairment of physiologic function, because the production of essential iron compounds is decreased. This stage can most reliably be distinguished by a rise in hemoglobin concentration in response to iron treatment, demonstrating that the woman's previous hemoglobin concentration had been below her potential. Iron administration, regardless of dose, will not raise the hemoglobin concentration in the absence of iron deficiency. The hemoglobin response to iron therapy is nevertheless difficult to use to detect iron deficiency during pregnancy because of the normal changes in blood volume and hemoglobin concentration during this period (Figure 14-1).

Fe/TIBC is typically subnormal in people with impaired hemoglobin production due to iron deficiency. Unfortunately, normal ranges for Fe/TIBC during pregnancy have not been firmly established. Fe/TIBC declines substantially during pregnancy, even in iron-supplemented women (Puolakka et al., 1980b; Svanberg et al., 1976a), and results from the test to determine this ratio indicate large biologic variations (Dallman, 1984; LSRO, 1984). EP level is elevated with iron deficiency and is a potentially useful test during pregnancy (Schifman et al., 1987), particularly since levels in iron-supplemented women appear to remain stable from early to late gestation (Romslo et al., 1983).

For nonpregnant women, the normal mean hemoglobin concentration is 13.5 g/dl, and the 5th percentile value is about 12.0 g/dl (LSRO, 1984). As noted above, however, values change considerably during pregnancy (Figure 14-1). Even women who are adequately supplemented with iron have an almost 2 g/dl decline to a mean hemoglobin level of 11.6 g/dl in the second trimester (Puolakka et al., 1980b; Sjöstedt et al., 1977; Svanberg et al., 1976a; Taylor et al., 1982), largely because of the normal expansion of plasma volume (Hytten, 1985; Taylor and Lind, 1979). This increase in plasma volume is detectable as early as 6 to 8 weeks into gestation (Lund and Donovan, 1967). In the last trimester, the hemoglobin concentration gradually rises, reaching a mean value of 12.5 g/dl at 36 weeks of gestation. The World Health Organization criteria for anemia include a uniform value of <11.0 g/dl for all pregnant women instead of <12.0 g/dl, which is used for nonpregnant women (WHO, 1968). The values shown in Figure 14-1 were derived from four carefully performed longitudinal studies of iron-supplemented women (Puolakka et al., 1980b; Sjöstedt et al., 1977; Svanberg et al., 1976a; Taylor et al., 1982). On the basis of these data, cutoff values of 11.0, 10.5, and 11.0 g/dl for the first, second, and third trimesters, respectively, have recently been proposed by the Centers for Disease Control for the screening of pregnant women for anemia (CDC, 1989). If a pregnant woman is judged to be anemic by these criteria (or according to Figure 14-1) and the serum ferritin is <12 µg/liter, she can be presumed to have iron deficiency anemia because other causes of anemia are not characterized by a low serum ferritin level.
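A minimal sketch of this screening logic, assuming only the cutoffs quoted above (the CDC trimester thresholds and the <12 µg/liter ferritin criterion); the function and its return strings are illustrative, not a clinical tool:

```python
# Illustrative encoding of the anemia screening criteria described above:
# CDC (1989) hemoglobin cutoffs of 11.0, 10.5, and 11.0 g/dl for the
# first, second, and third trimesters, with serum ferritin < 12 ug/liter
# taken as presumptive evidence that an anemia reflects iron deficiency.

HEMOGLOBIN_CUTOFF_G_DL = {1: 11.0, 2: 10.5, 3: 11.0}

def screen_pregnant_woman(hemoglobin_g_dl, trimester, ferritin_ug_l=None):
    if hemoglobin_g_dl >= HEMOGLOBIN_CUTOFF_G_DL[trimester]:
        return "not anemic by screening criteria"
    if ferritin_ug_l is not None and ferritin_ug_l < 12:
        return "anemic; presumed iron deficiency anemia"
    return "anemic; cause not established without further iron studies"

# Example: 10.2 g/dl in the second trimester with a ferritin of 8 ug/liter.
print(screen_pregnant_woman(10.2, 2, ferritin_ug_l=8))
```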

The MCV of red blood cells is typically decreased in people with iron deficiency anemia. In the absence of iron deficiency, however, the MCV normally rises by about 5% during pregnancy (Puolakka et al., 1980b; Taylor et al., 1982). Until adjusted MCV reference values for pregnancy are developed, it is doubtful that this determination will have much diagnostic value for pregnant women.

As long as iron deficiency remains prevalent among pregnant women, the difficulty of predicting the subsequent development of iron deficiency from laboratory tests will argue strongly in favor of routine iron supplementation, irrespective of the results of a routine screen for anemia. A more definitive screening followed by treatment only of those women with laboratory evidence of iron deficiency is a more difficult, time-consuming, and costly alternative. Tests for hemoglobin and serum ferritin are the most commonly used combination of laboratory tests for the diagnosis of iron deficiency in studies of pregnant women (Charoenlarp et al., 1988; Dawson and McGanity, 1987; Puolakka et al., 1980b; Romslo et al., 1983; Taylor et al., 1982; Wallenburg and van Eijk, 1984). However, even if hemoglobin and serum ferritin values are normal early in pregnancy, this is no assurance that iron deficiency anemia or impaired hemoglobin production (Lewis and Rowe, 1986) will not develop later. The need to repeat the hemoglobin and serum ferritin analyses one or more times later in pregnancy may be a deterrent to applying the screen-and-treat approach to the routine care of pregnant women.

The absorption of iron supplements is influenced by the solubility of the iron compound (Brise and Hallberg, 1962a); the dose (Brise and Hallberg, 1962a; Ekenved et al., 1976a; Hahn et al., 1951; Sölvell, 1970); timing of the dose, e.g., with or between meals (Brise, 1962; Ekenved et al., 1976a; Hallberg et al., 1978; Layrisse et al., 1973); delivery of the dose, e.g., alone or as part of a vitamin-mineral supplement (Babior et al., 1985; Seligman et al., 1983); and the abundance of iron stores in the individual (Bezwoda et al., 1979; Brise and Hallberg, 1962a; Nielsen et al., 1976; Norrby, 1974).

When determining the dose of iron supplements, it is important to distinguish between the amount of the iron compound and the equivalent in terms of elemental iron. For example, hydrated ferrous sulfate (USP) contains 20% of iron by weight; therefore, a 300-mg tablet of ferrous sulfate contains 60 mg of iron. The corresponding percentages of iron in other commonly used compounds are 12% in ferrous gluconate and 32% in ferrous fumarate. The total absorption of iron from any of these ferrous iron compounds at any specified dose is roughly proportional to its iron content (Brise and Hallberg, 1962a). In the following discussion, doses are specified in terms of elemental iron.
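The conversion is simple multiplication; here is a brief sketch using only the percentages given above (the function name and dictionary are my own framing):

```python
# Elemental iron as a fraction of compound weight, from the text above.
IRON_FRACTION = {
    "ferrous sulfate (hydrated, USP)": 0.20,
    "ferrous gluconate": 0.12,
    "ferrous fumarate": 0.32,
}

def elemental_iron_mg(compound, tablet_mg):
    """Elemental iron content of a tablet of the given compound weight."""
    return tablet_mg * IRON_FRACTION[compound]

# Example from the text: a 300-mg ferrous sulfate tablet supplies 60 mg of iron.
print(elemental_iron_mg("ferrous sulfate (hydrated, USP)", 300))  # 60.0
```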

There are two general approaches to evaluating the adequacy of various doses of iron supplements during pregnancy. One involves determining the percentage of iron that is absorbed from a test dose of the iron supplement and then extrapolating the results to the conditions of routine supplement use. The second approach, which might be termed a therapeutic trial, is to administer a certain dose of the supplement over several months and then to determine the efficacy of the regimen in preventing iron deficiency, based on the hemoglobin concentration and other laboratory tests of iron status.

A large and classic study of iron absorption from iron supplements during pregnancy included 466 women (Hahn et al., 1951), who were studied on their second prenatal visit, before initiation of iron supplements. Eleven different doses of ferrous 59Fe were administered ranging from 1.8 to 120 mg. Figure 14-2 shows the median results for the 18-, 39-, and 120-mg doses. The higher the iron dose, the larger the amount absorbed but the lower the percentage absorbed. Absorption from the 120-mg dose was only about twice that from the 18-mg dose. Similarly, only about 50% more iron was absorbed when the dose was increased by about 200%—from 39 to 120 mg. These data suggest that doses of 120 mg and above may not confer a sufficient benefit to warrant the substantial likelihood of side effects (discussed below). Another noteworthy finding was the increase in iron absorption as pregnancy progressed (Figure 14-2).

Studies of women who donate blood regularly can provide data that are of relevance to pregnant women, because women who donate blood share many of the same characteristics with respect to low iron stores, increased iron requirements, and increased iron absorption. The average blood donation by a woman results in the loss of 200 mg of iron (Finch et al., 1977). Brise and Hallberg (1962a) used a dose of 30 mg of elemental iron as ferrous sulfate to study iron absorption in women after an overnight fast. A reassuringly high absorption of 27% was reported in donors compared with 17% in nondonors.

The relationships among stage of gestation, abundance of iron stores, and iron absorption were evaluated by Heinrich and coworkers (1968) in a large group of unsupplemented pregnant women who were given a very small test dose of ferrous iron (0.56 mg of 59Fe) after an overnight fast between the fourth and ninth months of gestation. Hematologic studies and stainable iron in the bone marrow were also analyzed. Absorption increased gradually from about 40% during the fourth month to 90% during the ninth month of gestation. The results support the study of Hahn and coworkers (1951) in showing a substantial rise in iron absorption during pregnancy in unsupplemented women. The very high percentages that were reported by Heinrich and coworkers (1968) can be anticipated with a small test dose of iron, but should not be extrapolated to ordinary circumstances of diet and supplement use. The results also showed that absorption was greatest in those women with the lowest iron stores.

Iron absorption during the progression of pregnancy was also studied by Svanberg et al. (1976a). Iron absorption in primiparous women was measured from a single test dose of 100 mg of elemental iron, which was given as a solution of ferrous sulfate after an overnight fast at 12, 24, and 35 weeks of gestation. Starting at 12 weeks of gestation, women were randomized to groups receiving either 100 mg of iron as a sustained-release tablet of ferrous sulfate or a placebo twice a day with meals. In accord with the study of Hahn and coworkers (1951), iron absorption in the placebo group increased during the course of pregnancy from about 7% at 12 weeks and 9% at 24 weeks to 14% at 35 weeks of gestation. This rise in absorption was associated with and could be ascribed to a decline in storage iron, as estimated from bone marrow aspirates. There was a smaller rise in mean iron absorption in the iron-treated group, who had values of 6, 7, and 9% at 12, 24, and 35 weeks of gestation, respectively.

Slow-release iron supplements of various types were developed primarily to circumvent the high prevalence of side effects when large doses of iron are used. These preparations are more expensive than ordinary, rapidly soluble iron supplements. In three of the studies summarized in Table 14-1, slow-release forms of ferrous sulfate at doses of 105 to 200 mg of iron per day were effective in preventing iron deficiency during pregnancy (Puolakka et al., 1980b; Svanberg et al., 1976a; Wallenburg and van Eijk, 1984). Ekenved et al. (1976a) found that a slow-release preparation was better absorbed than ordinary ferrous sulfate when given with a meal but less well absorbed under fasting conditions. Several slow-release forms of iron are so poorly absorbed that they are unlikely to confer substantial benefit (Middleton et al., 1966). Consequently, only a slow-release preparation of proven effectiveness provides a reasonable alternative should gastrointestinal side effects develop when standard preparations of ferrous iron are used at recommended doses.

Iron tablets (other than slow release preparations) are absorbed more completely when given between rather than with meals (Ekenved et al., 1976a; Hallberg et al., 1978; Layrisse et al., 1973). In the study by Layrisse et al. (1973), 100 mg of iron was given in the form of a ferrous sulfate tablet either after an overnight fast or with a variety of meals characterized by high or low food iron bioavailability. An average of 10 mg (10%) was absorbed after the overnight fast. Irrespective of the type of meal, only an average of 4 to 5 mg was absorbed when the tablet was administered with the meal.

Iron absorption seems to be much greater when a supplement contains only an iron salt than when the iron is part of certain multivitamin-mineral supplements. Calcium carbonate and magnesium oxide appear to be particularly inhibitory to iron absorption (Babior et al., 1985; Seligman et al., 1983). In the 1983 study by Seligman and colleagues, iron absorption improved markedly when calcium as calcium carbonate was decreased from 350 to 250 mg and magnesium as magnesium oxide was decreased from 100 to 25 mg. These findings demonstrate the importance of additional research to test appropriate prenatal multinutrient supplements for in vivo bioavailability. Rough comparisons of iron absorption from various supplements can be most easily derived from the increase in serum iron that occurs after administration of the supplement (Ekenved et al., 1976b).

Since ascorbic acid-rich foods enhance the absorption of dietary iron, one might anticipate that adding ascorbic acid to an iron supplement would also increase iron absorption. However, this was not the case when 50 or 100 mg of ascorbate was given with 30 mg of iron as ferrous sulfate after an overnight fast (Brise and Hallberg, 1962b). Only with very large doses of 200 mg or more was there an increase in iron absorption. However, such large doses of ascorbate, when given with 60 mg of iron, commonly result in epigastric pain as a side effect (Hallberg et al., 1967a). Even when an iron supplement was given with a meal, the addition of 100 mg of ascorbic acid was not effective in increasing the absorption of ferrous iron (Grebe et al., 1975). Thus, it appears that although ascorbic acid is effective in enhancing absorption of dietary iron, presumably by helping to convert insoluble ferric iron to more soluble ferrous iron, this role is probably less important for supplemental iron given in the ferrous form.

Some commonly consumed foods, such as certain breakfast cereals, are highly fortified to supply the equivalent of the Food and Drug Administration's U.S. Recommended Daily Allowance (U.S. RDA) for iron in a single serving. The extent to which fortification iron is absorbed is likely to vary markedly according to the type of iron and the composition of the product (INACG, 1982). Absorption of added ferrous fumarate is enhanced if the product is also fortified with ascorbic acid (INACG, 1982). This enhancement may be surprising in view of the lack of a similar effect with ascorbic acid added to the much larger amounts of ferrous iron that are contained in supplements. Iron absorption is decreased if the fortified cereal product is rich in phytate (Hallberg et al., 1989). Breakfast cereals in the United States are not typically fortified with ferrous iron, and the reliability of such products in preventing iron deficiency during pregnancy is not established. It is therefore prudent not to rely on fortified products as a substitute for an iron supplement.

Several longitudinal studies comparing the use of an iron supplement with a placebo or no supplement during pregnancy are summarized in Table 14-1. All of them were performed in northern Europe or North America and most involved middle-income women. The results indicate that 30 mg (Chanarin and Rothman, 1971) or 65 mg (Dawson and McGanity, 1987; Taylor et al., 1982) of elemental iron per day was as effective as any of the higher doses. There are few therapeutic trials in which doses of iron were less than 60 mg/day. Their results are important, however, because the absorption studies of Hahn et al. (1951) indicate that lower doses should be adequate and because the prevalence of intestinal side effects is related to dose (Hallberg et al., 1967b; see also the section Compliance and Side Effects).

Probably the most complete and authoritative of these studies was that of Chanarin and Rothman (1971), who compared groups receiving doses of ferrous fumarate supplying 30, 60, or 120 mg of iron per day with a group given a placebo. An additional group was given one injection of 1 g of iron as intravenous iron dextran, followed by a 60-mg oral dose of iron per day. Women at 20 weeks of gestation were assigned sequentially to the five groups and treated until term; the contents of the prescribed tablets were not known to the investigators. Between 46 and 49 subjects per group completed the study. Figure 14-3 shows the hemoglobin and serum iron values during the course of pregnancy in the 30-mg, 120-mg, and placebo groups. When compared at 37 weeks of gestation, the hemoglobin and serum iron levels of the 30-mg group did not differ significantly from those of the groups receiving higher doses of iron. The investigators concluded that 30 mg of elemental iron per day was effective in maintaining hemoglobin levels throughout pregnancy. More recently, Hiss (1986) also recommended a dose of 30 mg/day in a review on anemia during pregnancy. In support of this dose, he cited the results of Scott and Pritchard (1974), who determined the hemoglobin concentration and stainable iron in bone marrow at the beginning of the second trimester and at delivery in 20 women who were given 30 mg of iron per day as ferrous fumarate throughout that period. The hemoglobin concentration rose from an initial mean of 11.2 g/dl to a final value of 12.6 g/dl, and bone marrow iron at delivery equaled or exceeded the initial values. One study that casts some doubt on the efficacy of low doses is that of de Leeuw and coworkers (1966), who reported that 39 mg of ferrous iron per day did not maintain as high a hemoglobin concentration as did a dose of 78 mg/day. Chanarin and Rothman (1971) suggested that compliance may not have been as good as it was in their own well-monitored study.

A relatively modest dose of iron is preferable to a high dose since iron may inhibit the absorption of other nutrients, zinc in particular. Sandström and coworkers (1985) showed that when a multivitamin-mineral supplement is taken on an empty stomach, high iron doses will inhibit the absorption of concurrently administered zinc. It also appears that iron supplementation, even at modest doses (38 to 65 mg/day for 1 to 4 weeks), may result in a slight decline in serum zinc (Hambidge et al., 1987). Possible consequences of the iron-zinc interaction for human nutrition were reviewed by Solomons (1986).

A reasonable theoretic approach to estimating the requirement for absorbed supplemental iron is to calculate how much iron would be needed to prevent the hemoglobin deficits in unsupplemented women compared with supplemented women. The amount of iron required to prevent such a deficit can be derived most directly from the study of Taylor and Lind (1979), which is summarized in Table 14-1, in which total red cell mass and plasma volume were measured at 12 and 36 weeks of gestation. The final hemoglobin concentration averaged 1.6 g/dl higher in the supplemented than in the unsupplemented group (among the highest differences observed in the studies summarized in Table 14-1). Over this 24-week study period, the red cell mass increased by an average of approximately 450 ml in the supplemented group compared with 180 ml in the unsupplemented women, a difference of 270 ml. Since each milliliter of packed red blood cells contains about 1.2 mg of iron, a 270-ml difference in red cell mass is equivalent to 325 mg of iron. Dividing 325 mg of iron by the 168 days between 12 and 36 weeks of gestation indicates that 1.9 mg of extra iron must be assimilated daily to prevent the deficit in red blood cell mass. After allowing for a higher rate of iron accumulation between 36 and 40 weeks of gestation and adding 25% for greater than average needs, the subcommittee estimated that about 3 mg of supplemental iron in addition to dietary iron should be assimilated daily during the second and third trimesters to prevent iron deficiency in most women. Figure 14-2 suggests that 3 mg of iron can be readily absorbed from a 30-mg daily dose of ferrous iron given between meals.
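Step by step, the subcommittee's estimate reads (all figures from the paragraph above):

$$ (450 - 180)\ \text{ml} \times 1.2\ \text{mg/ml} \approx 325\ \text{mg of iron}, \qquad \frac{325\ \text{mg}}{168\ \text{days}} \approx 1.9\ \text{mg/day}, $$

which, after the allowance for weeks 36 to 40 of gestation and the 25% margin for greater than average needs, rounds to about 3 mg of supplemental iron to be assimilated daily.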

An appropriate time to begin iron supplementation at a dose of 30 mg/day is after about week 12 of gestation (the beginning of the second trimester), when the iron requirements for pregnancy begin to increase. Iron administration at a dose of 60 to 120 mg/day (preferably in divided doses) is indicated if there is laboratory evidence of an already established anemia at any stage of pregnancy. The dose should be decreased to 30 mg/day when the hemoglobin concentration is within the normal range for the stage of gestation (Figure 14-1).
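As an illustrative sketch only (the function and its encoding of the schedule are my own framing, not clinical guidance), the regimen above can be written as:

```python
# Illustrative encoding of the supplementation schedule described above.
# Doses are ranges in mg/day of elemental iron.

def iron_dose_mg_per_day(weeks_gestation, anemia_present, hemoglobin_now_normal=False):
    if anemia_present and not hemoglobin_now_normal:
        return (60, 120)  # therapeutic range, preferably in divided doses
    if anemia_present and hemoglobin_now_normal:
        return (30, 30)   # step down once hemoglobin is in the normal range
    if weeks_gestation >= 12:
        return (30, 30)   # routine prophylaxis from the second trimester on
    return (0, 0)         # routine supplementation not yet indicated

# Example: uncomplicated pregnancy at 16 weeks of gestation.
print(iron_dose_mg_per_day(16, anemia_present=False))  # (30, 30)
```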

One problem with supplement use during pregnancy is uncertainty about compliance (Bonnar et al., 1969), particularly when poverty and certain ethnic beliefs reduce the availability or acceptability of supplements. Early in pregnancy, morning sickness probably contributes to reduced consumption of nutrient supplements. Late in pregnancy, constipation and abdominal discomfort are frequent regardless of whether supplements are used or not. Taking high doses of iron may increase these problems and thus discourage supplement use. Iron appears to be best tolerated when administered at bedtime.

Potential side effects of iron administration include heartburn, nausea, upper abdominal discomfort, constipation, and diarrhea. The most careful studies that focused on side effects involved double-blind administration of large therapeutic doses of iron to large groups of blood donors (Hallberg et al., 1967b; Sölvell, 1970). At a dosage of 200 mg of iron per day as ferrous sulfate, divided into three doses, approximately 25% of subjects had side effects, compared with 13% of those who received a placebo. When the dose was doubled, side effects increased to 40%. Constipation and diarrhea occurred with the same frequency at the two doses, but nausea and upper abdominal pain were more common at the higher dose. The risk of side effects is proportional to the amount of elemental iron in various soluble ferrous iron compounds and therefore appears to be primarily a function of the amount of soluble iron in the small intestine. Little information is available about side effects at doses below 200 mg, but it is reasonable to infer that side effects are much less likely at 30 mg/day.