Progress in the past 50 years is reviewed with reference to the major scientific disciplines involved in forage conservation. Hybrid cultivars of maize (Zea mays L.), the forage harvester, the large baler, polyethylene covering for silos, stretch-wrap for bales, and additives designed to improve the preservation of moist hay and the fermentation of silage all contributed to improved technological efficiency. The major biochemical pathways involved in silage fermentation have been described, together with the effects of bacterial inoculation on fermentation and on aerobic stability during the feed-out phase. Total mixed rations have improved the efficiency of feed use by livestock. The value of covering bunker and clamp silos has been established, with recent progress in the development of co-extruded oxygen barrier film. Challenges for the future include improving the hygienic quality of silage to reduce risks to animal health, optimising crop and silage composition for biogas digesters, reducing loss of feed nitrogen to the soil and atmosphere, improving silo safety and developing edible sealants for silos.
The five decades from the 1960s to the 2010s have seen huge changes in agriculture with increased mechanisation and larger livestock farm units. In addition, output per head of livestock increased, partly through improved animal nutrition. The development of techniques for the preservation of forage crops of good quality and sufficient quantity has allowed high levels of animal output to be sustained during periods of the year when pasture growth was inadequate to support the nutritional requirement of the grazing animal.
The process of growing, harvesting and conserving forage crops is an applied science involving five distinct disciplines: crop production, engineering, chemistry and biochemistry, microbiology, and animal nutrition. The successful production of silage and hay therefore involves an understanding of the important physical, chemical and biological factors affecting the conservation process of which the most significant are oxygen and water.
I was introduced to silage and hay whilst working on farms in the 1950s and have been involved in forage conservation research from 1965 to the present day. This review is a personal selection of some highlights of progress in forage conservation and some challenges for future improvements in technological efficiency.
Evolution of hay and silage production
Estimates of silage and hay production in Western Europe are shown in Figure 1 for the period 1962 to 2012. There was a five-fold increase in silage DM production between 1962 and 1992, associated with the technological developments described below. Production of hay declined throughout the period so that by 2012 total output was about half that in 1962. Total production of conserved forage increased to reach a peak around 2002, with a small decrease thereafter. Production of maize silage has increased substantially since the 1960s, especially in Western Europe, so that by 2000 maize silage accounted for 48 million tonnes DM, 50% of total silage DM production. Since then, production has probably increased further, to around 55 million tonnes DM in 2012.
A similar trend occurred in the USA with total silage production increasing from about 16 million tonnes DM in 1959 to 44 million tonnes DM in 2013. Maize silage currently accounts for 75% of total USA silage DM production. Of the total silage produced in the USA, it is estimated that about 85% is stored in bunkers and unwalled clamps (drive-over piles) with the remainder stored in towers, bales and bags.
Figure 1. Evolution of silage and hay production (million tonnes DM) in Western Europe: 1962 to 2012.
The changes in forage conservation in Europe between 1962 and 2012 were accompanied by large reductions in the population of ruminant livestock during the period; from 201 million cattle in 1962 to 122 million cattle in 2012, and from 272 million sheep in 1962 to 129 million in 2012. In Great Britain, inputs of fertiliser N to grassland increased between 1962 and 1982, but thereafter the average annual quantity of fertiliser N applied to grassland decreased from 126 kg N ha-1 in 1983 to 55 kg N ha-1 in 2012. It is likely therefore that grassland yields increased in the period 1962 to 1982 and then decreased. There was also an increase in the production of concentrate feed for cattle by animal feed mills in the European Union, from 32 million tonnes in 1989 to 41 million tonnes in 2012. There has been a trend in recent decades away from grazing towards more intensive feeding of cattle on diets comprised of silage and concentrates to support higher levels of animal output which, in the case of average milk production per cow, increased in Europe from 2295 kg annum-1 in 1962 to 5580 kg annum-1 in 2012.
Much of the progress in forage conservation in the past 50 years, described in the following sections, has been made with silage rather than with hay. An important disadvantage of haymaking is the need for up to seven days of dry weather, during which time the crop is dried by turning and tedding (shaking) it once or twice daily, with consequential loss of valuable leaf tissue. Water in leaves and in the outer layers of the stem is lost rapidly in the initial phase of field drying, but more energy is needed from solar radiation and convection towards the end of the field-drying period to remove water from within flowering stem tissue. Consequently the rate at which the crop dries decreases as drying proceeds (Jones, 1979; Jones and Harris, 1980; Wilkinson and Wilkins, 1980). The vulnerability of haymaking to loss of nutrients due to poor weather has been a major factor in the move from hay to silage in the past 50 years.
Silage in the 1950s and early 1960s was made without the use of forage harvesters. Unwalled clamps were made at the edge of the field by raking unchopped material into piles. Although plastic sheeting was first introduced as a method of covering clamps in the 1950s (Shukking, 1976), lack of covering resulted in oxygen ingress during storage with extensive loss of nutrients due to aerobic spoilage (composting or rotting). Nevertheless the probability of success was much greater with silage making than with haymaking. The risk of wet weather after cutting the crop meant that in the 1960s high quality hay was likely to be made in one out of four years in northwest Europe. Today, with only a 24 to 36-hour interval between cutting and harvesting, high quality silage is likely to be made in three out of four years.
In the following sections, progress in the major scientific disciplines involved in forage conservation is highlighted and some challenges for the future are discussed.
The most important advance in crop production was the development of high-yielding hybrid cultivars of maize (Zea mays L.), some specifically destined for conservation as ensiled whole-crop forage, with emphasis in northern regions on earliness of maturity. In areas where summer drought and early frosts are uncommon the risk of crop failure is relatively low. Research on maize in the 1960s was mainly concerned with establishing the factors affecting the composition and nutritional value of the plant at harvest (Demarquilly, 1969; Bunting and Gunn, 1973) and after ensiling (Harris, 1965; Cummins, 1970; Danley and Vetter, 1973; Andrieu and Demarquilly, 1974ab; Deinum, 1976). This work laid the foundation for work on the role of maize silage in the nutrition of beef and dairy cattle (Thomas et al., 1975; Wilkinson and Penning, 1976; Wangsness and Muller, 1981; Phipps, 1990; Phipps et al., 1995; Bal et al., 1997; Cherney et al., 2004). Recent work has focussed on evaluating new low-ferulate sfe and bm3 mutants with potentially higher intake and digestibility (Jung et al., 2011).
Table 1. Typical composition of perennial ryegrass, whole-crop maize and whole-crop wheat silages (Chamberlain and Wilkinson, 1996; Thomas, 2004).
The ability of the maize plant to yield relatively high quantities of low-cost starch per hectare, coupled with a relatively high metabolisable energy (ME) concentration (Table 1), has made the crop a popular choice for farmers in areas where the land is suitable for its cultivation. In terms of composition, maize silage is complementary to grass and grass/clover silages (Phipps, 1990). In marginal regions, whole-crop wheat forage, of similar starch concentration but lower ME than either maize silage or good quality grass silage, is often preferred to forage maize as the annual forage crop.
Other notable highlights of progress in crop production for forage conservation include the development of cultivars of perennial ryegrass (Lolium perenne L.) with higher concentrations of water-soluble carbohydrates (Wilkins and Lovatt, 2004; Davies and Merry, 2004; Moorby et al., 2005; Marley et al., 2007) and hybrids of perennial ryegrass and tall fescue (Festuca arundinacea; Humphreys et al., 2012), which are more resistant to summer drought than ryegrass. Tall fescue is particularly valuable as a perennial grass species for the production of high DM silage and hay because of its ability to lose water more rapidly during wilting than other grasses.
The most important engineering development in forage conservation in the 20th century was the forage harvester. Early flail forage harvesters cut and lifted the crop into a trailer in a single operation – direct cutting – with little or no chopping to reduce particle length. Current recommendations are to chop drier crops relatively short (25 to 50 mm) to aid consolidation, and to chop wetter crops to a longer average particle length (80 to 100 mm) to stimulate rumination and to reduce the production of liquid effluent.
The introduction of more complex chopping cylinders on forage harvesters resulted in the separation of the cutting and chopping operations for grass crops, but not for maize and whole-crop wheat, for which specialist pick-up attachments were developed. Grain-processing rollers were incorporated into the chopping equipment of forage harvesters in the 1990s to ensure more complete digestion of grain by the animal (Shinners, 2003). The benefits of grain processing are most likely to be seen with mature crops of maize and wheat (Allen et al., 2003).
The invention of the big baler in the 1970s was an important milestone in both hay and silage making. The baler allowed silage to be made for the first time in smaller, transportable packages that, like hay, could be traded easily between farms. Big bale silage gained popularity on smaller farms with limited financial resources to invest in silos and also in upland areas where the terrain was unsuitable for larger machinery. Balers could be used for harvesting straw after grain harvest in arable areas and at other times of the year for silage or haymaking. The design of both balers and self-loading forage wagons meant that they were well-suited to harvesting grass crops that had been field-wilted to between 450 and 550 g DM kg-1 fresh weight. In the 1980s automatic wrapping equipment followed the introduction of the large baler. Bales are wrapped in stretch-film of 25 μm thickness with a 50% overlap, either in-line as they are formed, or subsequently in the field or at the place of storage. Baled silage probably accounts for about 25% of total silage production in Europe.
Tower silos, the predominant way of storing silage in Europe and North America since the end of the 19th century, represented the ultimate in efficiency in terms of low losses and mechanisation of filling, removal and delivery of silage to livestock (Wilkinson et al., 2003; Savoie and Jofriet, 2003). But towers were expensive and had restricted capacity. As livestock units increased in size, bunkers and large unwalled clamps increased in popularity along with mechanical loading and unloading equipment.
The introduction of total mixed rations in the mid 1960s was the result of adding mixing augers to a mobile forage wagon or truck so that silage and other feeds could be mixed together before the mixture was transported to the livestock building and discharged into a feed trough. Colman et al. (2011) demonstrated an improvement in feed efficiency and animal health in 273 dairy herds in France and the United Kingdom following the adoption of a total mixed ration feeding system linked to an internet-based nutrition support service.
Chemistry and biochemistry
In the past 50 years the main biochemical pathways in the silage fermentation have been described, and the processes involved have been reviewed elsewhere (McDonald et al., 1991; Rooke and Hatfield, 2003). Research in the early years of the 20th century was aimed at preventing undesirable fermentations that had adverse effects on the quality of cheese made from milk from cows given poorly fermented silage. The emphasis was on direct acidification (reviewed by Watson and Nash, 1960). The preservation of moist hay by acidification with mould inhibitors such as propionic acid, added at 10 g kg-1 fresh weight, was a significant development in the 1970s (Knapp et al., 1976; Benham and Redman, 1980), since the traditional method of preserving moist hay by the addition of sodium chloride had proved to be of variable efficacy (Stuart and James, 1931; Watson and Nash, 1960). Uneven application and loss of additive can result in zones where tolerant moulds are able to develop and degrade the preservative, allowing subsequent growth of other species and deterioration of the hay (Lacey et al., 1978). Despite these drawbacks, organic acids such as propionic or acetic and their salts are still in use as hay preservatives (Bagg, 2012).
Sulphuric acid, the main acid additive for silage for several decades, was superseded by the introduction in the 1960s of formic acid, a by-product of the refining of crude oil. Concern over adverse effects of sulphuric acid on the animal led to considerable research in the 1960s and 1970s into formic acid, but its relatively high cost limited its initial use (Shukking, 1976). The development of the gravity-feed applicator, which allowed the acid to be added uniformly at a relatively low level (20 to 30 kg tonne-1 fresh crop weight), accelerated the adoption of formic acid (Saue and Breirem, 1969). However, it was not until the late 1970s that the scientific basis of its use was established by the important work of Woolford (1975, 1978), who showed that whilst the mineral acids hydrochloric, sulphuric and orthophosphoric had no specific antimicrobial properties against a range of silage bacteria, yeasts and moulds, other than via acidification, some straight-chain organic acids (formic, propionic and acrylic) had the dual function of both acidification and inhibitory activity against undesirable spore-bearing bacteria such as clostridia.
Animal feeding experiments confirmed the efficacy of formic acid, especially with wetter crops (Wilson and Wilkins, 1973; Waldo, 1977; Dulphy and Demarquilly, 1977) and by the early 1980s there was sufficient evidence to recommend its use by farmers as the most efficacious silage additive (Wilkinson, 1984) and also to justify its use as the positive control chemical against which other potential additives were assessed (Steen, 1991; Pflaum et al., 1996; European Food Safety Authority, 2006, 2012).
Bunkers are today the preferred way of storing silage, but in the 1960s and for several decades thereafter there was debate about the cost-effectiveness of covering them. Early research to demonstrate the effects of covering on losses showed clear advantages to covering bunkers with neoprene-nylon sheeting (synthetic rubber) over no covering (Gordon et al., 1961). Surprisingly, some farm silos in the USA are not covered today (K.K. Bolsen, personal communication, 2014) in the mistaken belief that the value of the material lost is less than the cost of the covering film, despite detailed research in the early 1990s confirming significantly higher losses from uncovered farm-scale silos than from covered silos (Bolsen et al., 1993). Loss of DM from uncovered silos was very high in the uppermost 0.5 m. This finding was substantiated by a study of 127 commercial farm silos that revealed average losses of organic matter (OM) in the uppermost 0.5 m of 470 g kg-1 of crop ensiled for uncovered silos compared to 203 g kg-1 for covered silos (Bolsen, 1997).
The development of covering materials for bunkers and clamps was most rapid in northwest Europe, where these types of storage structure predominated. By the mid-1970s the conventional method of covering bunkers and clamps was with a double layer of polyethylene film, each of 125 or 150 μm thickness. This technology remained essentially unchanged for 30 years until the development of co-extruded oxygen barrier film (Degano, 1999).
The introduction of high oxygen barrier (HOB) film for covering silos was a step-change in technology that was probably as large as the initial introduction of polyethylene film itself. Silage under HOB film showed less development of moulds and undesirable bacteria, including butyric acid bacterial spores, in the peripheral areas of the silo or bale during the storage period (Borreani and Tabacco, 2008; Orosz et al., 2012). The results of a meta-analysis of 51 comparisons (41 with bunker and clamp silos, 10 with baled silage) between standard polyethylene film and HOB film are in Table 2. The HOB film reduced losses from the outer layers of the silo during the storage period and increased the aerobic stability of maize silage. The practical implications of these findings are that less labour is needed to discard inedible material and the risk of accidentally including spoiled silage in the animals’ diet is reduced. With bales, fewer layers of wrapping and less weight of film may be needed with HOB stretch-wrap than with standard wrap, and the process of wrapping bales may be speeded up.
Table 2. Losses, inedible silage and aerobic stability of silage in the top surface layer stored under standard or high oxygen barrier (HOB) films.
A significant development in the early 1970s was a better understanding of the key chemical components of the crop that affect the extent and pattern of the silage fermentation process. Weissbach and colleagues in Germany demonstrated the significance of DM, water-soluble carbohydrates (WSC) and buffering capacity (BC) in determining the pattern of fermentation and the minimum DM required to achieve a stable lactic acid-dominant fermentation (Figure 2). At the same time, McDonald and co-workers were describing the energy changes in ensiling and showing that losses of energy were lower than those of DM, especially in heterolactic and yeast fermentations of glucose and in the clostridial fermentation of glucose and lactate. As a result the gross energy of extensively fermented silage is about 10% higher than that of the original crop at harvest, because the end products are chemically reduced compared with the substrates (McDonald et al., 1973).
Figure 2. Minimum recommended dry matter concentrations for stable silage fermentation
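The relationship demonstrated by Weissbach and colleagues is often condensed in the silage literature into a fermentability coefficient (FC). The sketch below assumes the commonly quoted form FC = DM% + 8 × WSC/BC, a threshold of about 45 for a stable lactic fermentation, and illustrative crop values; treat the coefficient, the threshold and the units of BC as assumptions here rather than the authors' own figures.

```python
def fermentability_coefficient(dm_pct: float, wsc_g_per_kg_dm: float,
                               bc_g_per_kg_dm: float) -> float:
    """FC = DM(%) + 8 * (WSC / buffering capacity).

    WSC and BC are assumed here to be expressed in the same units
    (g per kg DM) so that their ratio is dimensionless.
    """
    return dm_pct + 8.0 * wsc_g_per_kg_dm / bc_g_per_kg_dm

def likely_stable(fc: float, threshold: float = 45.0) -> bool:
    """True if FC suggests a stable lactic acid-dominant fermentation."""
    return fc >= threshold

# Wet, low-sugar grass (18% DM, WSC 40, BC 60): FC ~23, so wilting or an
# additive would be needed; a wilted, high-sugar crop passes the threshold.
print(likely_stable(fermentability_coefficient(18, 40, 60)))
print(likely_stable(fermentability_coefficient(30, 120, 50)))
```

The coefficient captures the same logic as Figure 2: the wetter and better-buffered the crop, the more sugar (or the more wilting) is needed for a stable fermentation.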
The significance of the growth of undesirable microorganisms such as clostridia in wet silage had been known since the early part of the 20th century (Watson and Nash, 1960) and to this day the production of silage is either prohibited or discouraged in some regions of Europe because of the risk of “late-blowing” of cheese made from milk contaminated with clostridial spores (Wilkinson and Toivonen, 2003). Research in the 1960s into the factors affecting the pattern of silage fermentation demonstrated the important loss in nutritional value associated with clostridial growth in silage, reflected in reduced proportions of total fermentation acids as lactic acid and increased proportions of acetic acid and ammonia-N in total N (Wilkins et al., 1971). Suppression of clostridial growth in crops of low DM and low WSC may be achieved by a range of technological interventions including wilting, chopping and addition of either acid (see above) or homofermentative lactic acid bacteria (reviewed by Woolford, 1984; McDonald et al., 1991; Pahlow et al., 2003 and Kung et al., 2003).
The probability of effectiveness of microbial inoculation depends on the acid tolerance of the species or strain and the number of bacterial colony-forming units added per gram of crop (Pitt and Leibensperger, 1987). Heron et al. (1988) found that addition of 10^4 organisms g-1 was insufficient to improve the fermentation quality of ryegrass whereas 10^6 g-1 was adequate. Strain of lactic acid bacteria is also likely to influence the probability of a beneficial effect on the fermentation (Woolford and Sawczyc, 1984; Weinberg and Muck, 1996), especially if the bacterial strains have the ability to produce cell wall-degrading enzymes such as ferulate esterase (Dupon et al., 2012).
Work in Germany established that silages with concentrations of undissociated acetic acid of more than 8 g kg-1 FW were stable in air while silages with concentrations lower than 3 g kg-1 FW were unstable in air (Wolthusen et al., 1989). Thus, enhancement of acetic acid concentration should reduce problems of heating and moulding of silage in the feed-out period. Driehuis et al. (1999) and Driehuis and Oude Elferink (1999) reported improved aerobic stability following inoculation of maize silage with L. buchneri, which produces acetic acid and 1,2-propanediol from water-soluble carbohydrates and lactic acid. The same team later identified a new species, L. diolivorans, which degraded 1,2-propanediol to 1-propanol and propionic acid, potentially further enhancing silage aerobic stability if present in the silage (Krooneman et al., 2002). Acetobacter pasteurianus may also have the potential to improve the aerobic stability of silage (Nishino et al., 1999), although the same species was implicated in the initiation of aerobic deterioration in later work (Dolci et al., 2011). Recent developments have centred on identifying microbial strains and species, some hitherto unknown, qualitatively by analysis of DNA (reviewed by Muck, 2012), which promises further elucidation of the relationships between strains, species, communities, chemical composition and nutritional value of conserved forages.
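The thresholds of Wolthusen et al. (1989) refer to undissociated acetic acid, which depends on silage pH as well as on total acetic acid concentration. A minimal sketch of the calculation, assuming the standard Henderson-Hasselbalch relationship with a pKa of 4.76 for acetic acid and ignoring activity effects in silage juice:

```python
# Approximate the undissociated fraction of acetic acid from silage pH.
# Assumptions: pKa of acetic acid = 4.76 (dilute solution, 25 C); ionic
# strength and other weak acids in the silage are ignored.

PKA_ACETIC = 4.76

def undissociated_acetic(total_g_per_kg_fw: float, ph: float) -> float:
    """Undissociated acetic acid (g per kg fresh weight) at a given pH."""
    fraction = 1.0 / (1.0 + 10 ** (ph - PKA_ACETIC))
    return total_g_per_kg_fw * fraction

def stability_class(undissociated_g_per_kg_fw: float) -> str:
    """Classify against the thresholds of Wolthusen et al. (1989)."""
    if undissociated_g_per_kg_fw > 8.0:
        return "likely stable in air"
    if undissociated_g_per_kg_fw < 3.0:
        return "likely unstable in air"
    return "intermediate"

if __name__ == "__main__":
    for total, ph in [(20.0, 4.0), (20.0, 5.0), (5.0, 4.2)]:
        u = undissociated_acetic(total, ph)
        print(f"total {total} g/kg FW at pH {ph}: "
              f"{u:.1f} g/kg undissociated -> {stability_class(u)}")
```

The same total acetic acid concentration can therefore fall on either side of the stability thresholds depending on silage pH, which is one reason total acid analyses alone can mislead.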
Inoculants have also been developed for hay, some based on Lactobacillus buchneri (Baah et al., 2005) and others based on Bacillus pumilus, an organism that is capable of growing at relatively low water activity (aw) and is able to compete with spoilage microorganisms (Mahanna, 1994). However, there is relatively little independent research information to support the efficacy of inoculant products when used on hay baled below 800 g DM kg-1 fresh weight (Department of Environment and Primary Industries, 2009; Kung, 2014).
Mould development is a particular hazard because airborne spores can cause allergies and respiratory distress in both livestock, especially horses, and humans (Robinson et al., 1996). Mycotoxins can adversely affect the performance, susceptibility to infectious diseases and fertility of dairy cows (Fink-Gremmels, 2008; Hofve, 2014), and some, especially aflatoxin B1, pose a risk to food safety (Driehuis, 2012). Adverse effects of mycotoxins in rations on rumen function, immune status and milk yield may be mitigated by the inclusion of a mycotoxin deactivator product in the diet of dairy cows (Kiothong et al., 2012).
The ultimate test of successful preservation of nutrients in hay or silage is the animal. Demarquilly and Dulphy (1977) comprehensively reviewed earlier work involving comparisons between silage and the corresponding fresh forage, or between silage and field-dried hay made from the same original crop. They stated that both the intensity (extent) of fermentation and its pattern influenced the intake and utilisation of silage; for silage to be consumed in amounts similar to the corresponding fresh forage it must have the following characteristics: NH3-N ≤ 50 g kg-1 total N, acetic acid ≤ 25 g kg-1 DM and other volatile acids approximately zero. They concluded that the degradation of protein during ensilage could limit the performance of animals with high protein requirements, as Clancy et al. (1977) also found in a comparison of different conservation methods.
Thomas et al. (1968) reported that although intake of DM as hay was higher than that of silage made from the same crops, the digestible energy concentration of silage was 1.24 times that of hay made from the same crop, which explained the higher efficiency of conversion of conserved forage to weight gain or milk production. Wilkinson (1980) showed that the yield of ME per unit of fresh crop was higher for silage than for hay as a result of lower loss of digestible energy during conservation. The substitution (or concentrate-sparing) effect of silage increases with silage intake potential (Wilkins, 1974; Huhtanen et al., 2008), reflecting increased digestibility and ME concentration (Flynn and Wilson, 1978; Steen and Agnew, 1995; Keady and Hanrahan, 2013) and improved fermentation quality (Demarquilly and Dulphy, 1977; Steen, 1991; Moran and Owen, 1994; Patterson et al., 1996), enhanced by good silage making technique (Aston et al., 1994).
Degradation of protein to water-soluble nitrogenous compounds, including amines and ammonia, during the silage fermentation is considered to be a factor responsible for the reduced efficiency of nitrogen utilisation in animals given silage diets compared to the fresh crop or dried material (Wilkins, 1974), often attributed to lower rumen microbial protein synthesis (reviewed by McDonald et al., 1991). Some possible reasons, including asynchrony between the release of energy and the solubilisation of N, were suggested by Huhtanen et al. (2012). Supplementation of silage with additional protein has given production responses, and there is evidence to support histidine as a limiting amino acid for milk production, along with lysine and methionine (Shingfield et al., 2003; Hristov et al., 2012). Davies et al. (1997) found that high initial WSC concentration in herbage, together with inoculation, resulted in silage with improved protein quality, indicated by a higher proportion of the leaf protein ribulose-1,5-bisphosphate carboxylase (RUBISCO) than in silage made from herbage of low WSC, without additive or with formic acid. Herbage with low WSC underwent a heterofermentative fermentation, implicating this pathway in enhanced proteolysis.
Inoculation of crops with homofermentative strains of lactic acid bacteria at the time of harvest has been the predominant type of additive for ensilage since the early 1980s in North America (Bolsen and Heidker, 1985) and since the late 1990s in Europe (Wilkinson and Toivonen, 2003). But meta-analyses of animal responses to silage inoculation have produced equivocal conclusions (Kung and Muck, 1997). Possible reasons for failures include competition from the epiphytic microflora, insufficient fermentable substrate, low water activity and excessive oxygen (Kung et al., 2003). It is also important to add a greater number of homofermentative lactic acid bacteria than the natural (epiphytic) population of bacteria, to increase the probability that the inoculant will dominate the fermentation. Modelling work indicated that a 10-fold increase was necessary (Pitt and Leibensperger, 1987). The target level of addition should be 10^6 colony-forming units per gram of fresh crop (Heron, 1996; Wilkinson, 2005).
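The dose arithmetic above can be sketched as follows. The 10^6 CFU g-1 target and the 10-fold dominance rule come from the text; the product cell concentration used in the example is purely hypothetical, for illustration only.

```python
TARGET_CFU_PER_G = 1e6  # target addition rate (Heron, 1996; Wilkinson, 2005)

def product_per_tonne(product_cfu_per_g: float,
                      target_cfu_per_g: float = TARGET_CFU_PER_G) -> float:
    """Grams of inoculant product needed per tonne (1e6 g) of fresh crop."""
    return target_cfu_per_g * 1e6 / product_cfu_per_g

def likely_to_dominate(applied_cfu_per_g: float,
                       epiphytic_cfu_per_g: float) -> bool:
    """Pitt and Leibensperger (1987): roughly a 10-fold excess over the
    epiphytic population is needed for the inoculant to dominate."""
    return applied_cfu_per_g >= 10 * epiphytic_cfu_per_g

# A hypothetical product of 1e11 CFU/g would be applied at 10 g per tonne,
# and 1e6 CFU/g applied would dominate an epiphytic count of 1e5 CFU/g.
print(product_per_tonne(1e11))
print(likely_to_dominate(1e6, 1e5))
```

The same arithmetic explains why inoculation can fail on crops carrying a large natural lactic acid bacteria population: the standard application rate may then fall short of the 10-fold excess.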
Some challenges for the future
Reduced gaseous emissions from silage
The need to reduce greenhouse gas (GHG) emissions from livestock and ancillary operations is well established (Gill et al., 2009). Although the principal source of GHG emissions is methane from enteric fermentation (Opio et al., 2013), inputs of primary energy to livestock production systems that rely heavily on mechanisation (e.g. silage) should be reduced wherever possible. The energy balance of silage making is shown in Table 3 for three different crops. Primary energy, mainly non-renewable energy used in the manufacture of fertiliser, accounts for almost 60% of the total energy consumed in grass silage. In this analysis maize silage grown with reduced fertiliser and animal manure gave the highest output of ME per unit of energy input, with lucerne and grass having similar ratios of ME output to energy input. GHG emissions may be reduced by reducing the quantity of fertiliser N applied to grass, by growing alternative forages such as lucerne, and also by reducing the global warming potential of other inputs such as polyethylene film. For example, Wheelton et al. (2014) showed that use of HOB film was associated with an 82% reduction in global warming potential compared to standard polyethylene film.
The issue of the low nitrogen use efficiency (NUE, N in animal product as a proportion of total N intake) of livestock given forage-based diets is receiving attention worldwide. N excretion is directly related to total N intake (Dewhurst, 2006; Mills et al., 2008), and work to reduce excretion of N in manure is concentrated on the extent to which total N intake may be reduced without at the same time reducing animal performance. A detailed study with dairy cows into the scope for reducing both methane and N excretion by manipulating forage source and diet CP concentration showed that methane emission per kg milk was lower for a maize silage-based diet than for a grass silage-based diet. Reducing diet CP concentration from 180 to 140 g kg-1 DM increased NUE, though milk yield was also slightly reduced (Reynolds et al., 2010). Greater capture of N from animal manures and residual crop N in soil will be a focus for future research into crops grown in association with maize and whole-crop cereals, either as winter cover crops or as second crops on double-cropped land in warmer regions.
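The definition of NUE given above lends itself to a simple illustration. The 6.25 factor converting crude protein to nitrogen is standard; the intake and milk-N figures below are hypothetical round numbers chosen for the sketch, not data from Reynolds et al. (2010).

```python
CP_TO_N = 6.25  # kg crude protein per kg nitrogen (standard conversion)

def n_intake_kg(dm_intake_kg: float, diet_cp_g_per_kg_dm: float) -> float:
    """Daily N intake (kg) from DM intake and diet CP concentration."""
    return dm_intake_kg * diet_cp_g_per_kg_dm / 1000 / CP_TO_N

def nue(n_in_product_kg: float, n_intake: float) -> float:
    """Nitrogen use efficiency: N in product as a proportion of N intake."""
    return n_in_product_kg / n_intake

# Hypothetical cow: 22 kg DM/day intake, milk N output held at 0.16 kg/day,
# while diet CP drops from 180 to 140 g/kg DM.
for cp in (180, 140):
    ni = n_intake_kg(22, cp)
    print(f"CP {cp} g/kg DM: N intake {ni:.2f} kg/d, NUE {nue(0.16, ni):.2f}")
```

Because N excretion is roughly the difference between N intake and N in product, the lower-CP diet in this sketch cuts daily N excretion by about 0.14 kg while raising NUE from about 0.25 to 0.32, which mirrors the direction of the response reported above.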
Table 3. Energy input and output in silage making
Improved hygienic quality of conserved forages
Some silage is very unstable when exposed to air and can deteriorate in less than 24 hours of exposure to the atmosphere (Danner et al., 2003). Inoculation of crops with homofermentative lactic acid bacteria can reduce aerobic stability (Weinberg et al., 1993; Danner et al., 2003). Wilkinson and Davies (2012) highlighted the significance of the aerobic deterioration of silage in terms of hazards to animal health through, for example, the development of mycotoxins and bacterial endotoxins in spoiled silage. Muller (2012) stressed the importance of good hygienic quality in the production of silage for horses and highlighted the challenge of describing hygienic quality with improved accuracy.
The significance of aerobic deterioration of conserved forages in terms of the effects on the animal is now being established, though the specific anti-nutritional factors remain to be fully described. Research in Germany showed an average 57% reduction in DM intake of eight maize silages, differing in DM, chop length and density, that were exposed to air for 8 days prior to being offered to goats in a preference trial (Gerlach et al., 2013). In this trial the temperature of the silages was stable for the first 48 hours of exposure to air. The mean composition and intakes of the silages exposed to air for 0, 4 and 8 days are shown in Table 4.
Table 4. Composition and intake by goats of maize silage after 0, 4 or 8 days exposure to air
Dry matter concentration, pH, and counts of yeasts, moulds and aerobic mesophilic bacteria increased during exposure to air whilst concentrations of fermentation products decreased, with the largest changes occurring between 4 and 8 days of exposure. The accumulated increase in silage temperature above ambient during exposure to air was the best predictor of intake. Comparable work with Lolium multiflorum silages also revealed a mean reduction in DM intake of 50% after 8 days of exposure to air. However, in contrast to the work with ensiled maize, there was little change in temperature, chemical composition or microbial composition during the 8-day aerobic exposure period, suggesting that small and hitherto undetected changes in chemical or microbial composition during exposure of silage to air affect animal preference and DM intake (Gerlach et al., 2014).
The use of L. buchneri as a silage inoculant to improve aerobic stability has been criticised on the grounds that its fermentation is inefficient compared with that of homofermentative bacteria such as L. plantarum (Wilkinson and Davies, 2012). Kleinschmitt et al. (2013) reported that although increasing the level of addition of L. buchneri to maize silage improved its aerobic stability, intake and milk yield were not improved (Table 5). The effects were attributed to increased concentrations of acetic acid in the silage, in agreement with Eisner et al. (2006), though Gerlach et al. (2014) found a positive relationship between acetic acid concentration and intake of grass silage by goats. Possibly there is a critical concentration of acetic acid in silage that determines whether or not the animal discriminates against the material on the basis of smell or taste; further work is needed on this topic.
Table 5. Effect of level of addition of L. buchneri CNCM 1-4323 to maize silage on acetic acid concentration in maize silage and performance of dairy cows
Increased silo safety
The trend towards more continuous housing of livestock and increased production of biogas from silage will increase the demand for year-round provision of high-quality conserved forage. As the number of livestock per farm increases, greater silo capacity is required. Many silos on livestock units were constructed several decades ago and are now too small; instead of investing in new structures, old silos are over-filled. The issue of silo safety has been highlighted with particular emphasis on the risk of human injury and death from avalanche collapses of the feed face in silos greater than 3 metres in settled height (Bolsen and Bolsen, 2013).
Optimised yield of biogas methane from silage
Increasing quantities of silage are being used in the production of biogas. Weissbach (2009) found that gas yield was related to digestible (i.e. fermentable) organic matter (FOM), which in turn could be predicted from concentrations of ash and acid detergent fibre. For most crops, potential biogas yield was 800 litres per kg FOM and potential methane yield was 420 litres per kg FOM. Amon et al. (2004, 2007) found that ensiling increased the specific methane yield (litres of CH4 per kg volatile solids) of whole-crop maize by 25% compared with unensiled (i.e. green) material, presumably because the products of the silage fermentation were chemically reduced and thus more suitable substrates for the methanogenic archaea. They also found that several phenotypic characters of the maize plant, namely crude protein, crude fat, cellulose and hemicellulose, had a significant influence on methane yield. A challenge for the future is to improve methane yield via in-line analysis of feedstock composition, by making detailed assessments of the factors in the ensiling process that impact significantly on methanogenesis, and by breeding cultivars with higher concentrations of fermentable substrate.
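Weissbach's figures of 800 litres of biogas and 420 litres of methane per kg FOM lend themselves to a simple back-of-envelope prediction once the FOM concentration of a silage is known. The sketch below illustrates the arithmetic; the function name and the example FOM concentration of 700 g per kg DM are hypothetical, chosen only for illustration.

```python
# Potential yields per kg fermentable organic matter (Weissbach, 2009)
BIOGAS_L_PER_KG_FOM = 800
METHANE_L_PER_KG_FOM = 420

def predicted_yields(dm_tonnes, fom_g_per_kg_dm):
    """Predict biogas and methane yield (m3) for a batch of silage.

    dm_tonnes       -- dry matter fed to the digester, tonnes
    fom_g_per_kg_dm -- fermentable organic matter, g per kg DM
    """
    fom_kg = dm_tonnes * 1000 * fom_g_per_kg_dm / 1000  # kg FOM in the batch
    biogas_m3 = fom_kg * BIOGAS_L_PER_KG_FOM / 1000     # litres -> m3
    methane_m3 = fom_kg * METHANE_L_PER_KG_FOM / 1000
    return biogas_m3, methane_m3

# 10 t DM of silage with an assumed 700 g FOM per kg DM:
biogas, methane = predicted_yields(10, 700)
print(round(biogas), round(methane))  # 5600 2940
```

So a batch of 10 t DM at this FOM concentration would be expected to yield roughly 5 600 m3 of biogas, of which about 2 940 m3 is methane, consistent with methane making up a little over half of the biogas.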
New edible sealants for silos
The development of an effective material for covering silos that is also edible by livestock remains to be achieved. Berger and Bolsen (2006) described criteria for an edible sealant and experiments with a gelatinised starch/salt matrix; however, the material was costly to produce and required a protective waxy film to prevent water ingress through cracks in the matrix. Similar results were obtained in German work with a complex sprayable blend of potentially edible compounds (Uhl et al., 2011). Sodium acrylate may have potential as an ingredient of an edible silo sealant since it i) is antimicrobial (Woolford, 1978), probably because it can absorb some 300 times its weight of water and hence reduce water activity (Gelfand, 2014); ii) restricts fermentation; and iii) improves the aerobic stability of maize silage (Wilson et al., 1979). When added at 2 kg active ingredient per tonne of fresh crop, sodium acrylate produced silages of similar nutritional value to those made with formic acid (Wilkinson et al., 1979). The potential of sodium acrylate as an edible sealant for silos should be evaluated.
Considerable technological progress has been made in forage conservation over the past 50 years, but more remains to be done to reduce environmental impact and to increase the efficiency of feed use by both the live animal and the mechanical fermenter. Research on improving the efficiency of forage conservation from field to faeces is expensive. Given the importance of silage worldwide, the increasing size of livestock units, the rising world population and the need to secure global food security, it is disappointing that the capacity for multidisciplinary research on forage crops, which do not compete with the human population as food sources, has decreased with the closure in the past two decades of several major agricultural research establishments in Europe. Investment in forage conservation research brings long-term benefits to many sectors of society, especially when sponsored by both public and private sector organisations. Perhaps the biggest challenge for the future is to create sufficient political will to secure sustained investment in forage conservation research and development.