Let’s look at a forage test report – particularly at the levels of minerals. (In the “dry matter” column, of course.) Lots of interesting numbers. Here are some things I consider when I look at those numbers.

Woody Lane
Lane Livestock Services / Roseburg, Oregon
Woody Lane is a certified forage and grassland professional with AFGC and teaches forage/grazing ...

Calcium and phosphorus

I first look at the levels of calcium and phosphorus, and I am particularly interested in two things: the absolute levels of these minerals and the ratio between them. Grasses typically contain 0.25% to 0.6% calcium (recall that everything in this article is expressed on a dry matter basis). Legumes tend to have higher calcium levels than grasses, often 1.4% of the dry matter or higher. For example, if I see a forage calcium level of 1.2%, I would guess that this forage contained a high percentage of legume. Phosphorus levels for both grasses and legumes tend to be in the range of 0.15% to 0.5%. Low phosphorus levels tell me the forage was quite mature. High phosphorus levels, on the other hand, may be due to high levels of phosphorus fertility in the soil. But if I see values outside of these ranges, I look very carefully at them. For example, I would ask if something unusual occurred in that field, like a fertilizer spill, or if the forage lab did the correct assay.

For the ratio between calcium and phosphorus (Ca-P ratio) in the total diet, I like to see at least 1.3 to 1 for most situations and, ideally, 2 to 1 or slightly higher for young, growing animals. Once the phosphorus requirements are met, having a calcium level approximately twice as high as phosphorus helps ensure that male animals don’t suffer from a syndrome called urinary calculi, in which insoluble crystals containing these two minerals form in the urethra and block urination. Older animals generally don’t need such high Ca-P ratios, but under some conditions, ratios lower than 1 to 1 may cause calcium deficiency problems in adults, particularly the milk fever syndrome in mature ewes or dairy cows. Since many feed companies add calcium and/or phosphorus to their supplements and mineral mixtures, knowing the forage levels of calcium and phosphorus helps me understand the total diet and guides my decisions about the need to supplement them.
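The ratio arithmetic above is simple enough to sketch as a quick check. This is a minimal illustration of the rule of thumb in the text (at least 1.3:1 overall, roughly 2:1 for young, growing animals); the function names are my own, not from any feed-analysis software:

```python
def ca_p_ratio(ca_pct, p_pct):
    """Calcium-to-phosphorus ratio; both values as % of dry matter."""
    return ca_pct / p_pct

def ca_p_ok(ca_pct, p_pct, young_growing=False):
    """Rule of thumb from the text: at least 1.3:1 for most situations,
    ideally about 2:1 or slightly higher for young, growing animals."""
    target = 2.0 if young_growing else 1.3
    return ca_p_ratio(ca_pct, p_pct) >= target

# A legume-heavy forage at 1.2% Ca and 0.3% P gives roughly a 4:1 ratio.
print(round(ca_p_ratio(1.2, 0.3), 2))
```

Remember this applies to the total diet, not the forage alone, so supplement calcium and phosphorus would need to be folded into the numbers first.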

Magnesium and potassium

After evaluating the levels of calcium and phosphorus, I look at the level of magnesium and an associated mineral, potassium. Low magnesium levels contribute to the spectacular neurological problem of magnesium tetany, which is also called grass staggers or winter tetany, among other names. Whatever we call it, symptoms occur when blood magnesium levels drop below a trigger threshold, which causes the animal to go into seizures. We usually see this problem in the early spring when forage is lush and young. The rule of thumb about magnesium tetany is rather simple: low risk for forage magnesium levels above 0.2%, moderate risk for levels between 0.15% and 0.19%, and high risk for levels below 0.15%.

Except that … (drum roll, please) … high potassium levels in a forage can reduce magnesium absorption from the intestinal tract. How high? Another rule of thumb: Forage potassium levels above 3% can cause problems, particularly if forage magnesium levels are low or marginal. Compared to legumes, grasses are particularly greedy about potassium; they will absorb extra potassium from high-potassium soils, even above their own requirements for growth. I’ve seen grass test higher than 4% potassium – and in the early spring before the soil really warms up, the primary forage growth is grass rather than legumes. Knowing the levels of both magnesium and potassium helps me evaluate the metabolic risks of magnesium tetany and the option of adding extra magnesium to the mineral mix during the risk period.
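The magnesium rule of thumb, plus the potassium complication, amounts to a small classification rule. Here is one way to sketch it; the thresholds (0.2% and 0.15% magnesium, 3% potassium) are the article's, but bumping the risk up one level when potassium is high is my own simplification of "can cause problems, particularly if forage magnesium levels are low or marginal":

```python
def tetany_risk(mg_pct, k_pct):
    """Classify magnesium tetany risk from forage magnesium (% of DM):
    low risk above 0.2%, moderate between 0.15% and 0.19%, high below 0.15%.
    Potassium above 3% reduces magnesium absorption, so (as an assumed
    simplification) it raises the risk one level."""
    if mg_pct >= 0.20:
        risk = "low"
    elif mg_pct >= 0.15:
        risk = "moderate"
    else:
        risk = "high"
    if k_pct > 3.0 and risk != "high":
        risk = {"low": "moderate", "moderate": "high"}[risk]
    return risk

print(tetany_risk(0.22, 2.5))  # low
print(tetany_risk(0.22, 4.0))  # moderate: high potassium raises the risk
print(tetany_risk(0.14, 2.0))  # high
```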

Sodium 

Some folks ignore the level of sodium, but I don’t. Most trace mineral mixtures contain white salt (sodium chloride), which can also be the main palatability factor that drives animals to consume the mixture. But high sodium levels in forages make me a little nervous, because those high levels may satisfy an animal’s desire for salt and thus reduce its intake of the free-choice trace mineral mixture. Forages usually contain less than 0.2% sodium, but I’ve seen levels higher than 0.4%, especially in forages grown near the ocean or on saline ground, and also in byproduct feedstuffs and other supplemental feeds.

Copper, molybdenum, sulfur and iron

And then there is copper. This is a big bugaboo, especially among sheep producers who rightly worry about chronic copper toxicity. But I’m not only interested in copper; I’m also interested in three other minerals that affect copper absorption: molybdenum, sulfur and possibly iron.

Cattle and sheep have a nutritional requirement for copper at approximately 8 to 11 parts per million (ppm) in their total diet when the dietary molybdenum level is low. But sheep are particularly sensitive to chronic copper toxicity, and even slightly higher copper levels over a long period could cause problems. I’m generally happy to see 8 to 11 ppm copper in a forage test, and this seems to be a common range in forages. But what about forages grown in old orchards, where farmers periodically sprayed trees with Bordeaux mixture (copper sulfate + hydrated lime + water), or in fields where hog manure or chicken litter was applied as fertilizer? What about feeds composed of copper-containing ingredients or feeds mixed incorrectly? I always want to know about elevated levels of copper, and copper values greater than 15 to 18 ppm are red flags that I look at very carefully.

But copper absorption is profoundly influenced by molybdenum, sulfur and, to some extent, iron. High dietary levels of these minerals will reduce copper absorption across the gut wall and, if they are high enough, may even cause a copper deficiency. Forage molybdenum levels can range from less than 1 ppm up through 4 ppm or higher. Sulfur levels are generally 0.1% to 0.3%. I would consider sulfur above 0.35% to be high.

In an ideal world, I would like to see a ratio of copper to molybdenum (in the total diet) of between 6 to 1 and 10 to 1. Higher ratios may increase the risk of copper toxicity, especially if the sulfur levels are also low, while lower ratios suggest the possibility of copper deficiency, especially if sulfur levels are high. Iron can also tie up copper, so I am wary of high iron levels – say above 400 ppm. On the other hand, high iron can also be due to soil contamination of the sample, which is not necessarily a nutritional issue, so I take these iron levels with a grain of salt.
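The copper-to-molybdenum window can be expressed the same way. This sketch uses only the 6:1 to 10:1 ratio from the text and deliberately leaves out the sulfur and iron modifiers, which shift how you'd interpret a borderline ratio:

```python
def cu_mo_ratio(cu_ppm, mo_ppm):
    """Copper-to-molybdenum ratio in the total diet, both in ppm."""
    return cu_ppm / mo_ppm

def cu_mo_flag(cu_ppm, mo_ppm):
    """Flag diets outside the 6:1 to 10:1 Cu:Mo window described in the
    text. Higher ratios lean toward copper toxicity risk (a special worry
    for sheep); lower ratios lean toward copper deficiency."""
    r = cu_mo_ratio(cu_ppm, mo_ppm)
    if r > 10:
        return "toxicity risk"
    if r < 6:
        return "deficiency risk"
    return "ok"

print(cu_mo_flag(9, 1))   # ok
print(cu_mo_flag(18, 1))  # toxicity risk
print(cu_mo_flag(8, 4))   # deficiency risk (2:1 ratio)
```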

Forages contain other required minerals, of course, such as zinc and manganese. I always scan these numbers for uncommonly high or low values, looking for obvious problems. Selenium, iodine and cobalt values would also be useful, but most laboratories don’t test for them.

Someday, however, I would like to see laboratories provide information about some unusual minerals, like radium and uranium. Because if a forage contained high levels of these minerals, I could feed that forage knowing I could always find my animals at night.