Location and Climate
Climatic factors can have an important impact on disease risk (Coakley, 1988). For most of the important foliar diseases in the northern Great Plains region, the risk of disease outbreak is greatest in areas where long-term mean rainfall levels and production potential are highest. Intermittent wet–dry periods often limit progress of foliar disease in this region. Climate also influences the prevalence of root pathogens. Continuous wheat crops grown with no tillage develop severe, yield-limiting root rots (Fusarium spp., take-all, and Pythium spp.) in the Pacific Northwest (Cook, 1982). These problems are not as severe in western Canada because drier soils restrict the development and survival of these pathogens (Bailey et al., 1989; Nilsson, 1969).
Each species of microorganism (both pathogens and beneficials) has an optimum temperature and moisture regime for growth and survival. Small changes in environment can favor one species and inhibit another. For example, the fungal pathogens that cause leaf spot diseases on barley in western Canada have slightly different temperature optima: Scald (R. secalis) does best at 15 to 20°C, net blotch (Pyr. teres) at 20 to 25°C, and spot blotch (Bi. sorokiniana) at 25 to 30°C (Mathre, 1997). As a result, the relative prevalence of these species differs among regions and seasons, with scald more common in northern regions and in years when mean temperatures are low and spot blotch more common in southern regions and in years of above-normal temperature.
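To make the partitioning by temperature concrete, the optima cited above (Mathre, 1997) can be expressed as a toy lookup. This is purely illustrative; the range boundaries are simplifying assumptions, since real optima overlap and other factors (moisture, inoculum) also matter.

```python
# Toy illustration: which barley leaf-spot pathogen is most favored
# at a given mean temperature, using the optima cited in the text
# (Mathre, 1997). The hard boundaries are a simplifying assumption.
def favored_pathogen(mean_temp_c):
    if 15 <= mean_temp_c < 20:
        return "scald (R. secalis)"
    elif 20 <= mean_temp_c < 25:
        return "net blotch (Pyr. teres)"
    elif 25 <= mean_temp_c <= 30:
        return "spot blotch (Bi. sorokiniana)"
    return "no single species favored"

print(favored_pathogen(17))  # scald (R. secalis)
print(favored_pathogen(27))  # spot blotch (Bi. sorokiniana)
```

A lookup like this mirrors the field observation in the text: cooler northern seasons fall in the scald range, warmer southern seasons in the spot blotch range.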
Location influences this interaction through factors such as climate and soils. Crop management practices such as type of tillage, seeding rate, and even seeding date can affect the environment within a field, which can in turn affect pathogen populations. As a result, the risk of disease and the optimum combination of management techniques for a particular pathogen can differ from site to site or can be similar across vast regions, depending on a complex interaction among the pest, the crop, and the ecology of the region.
Most crops are particularly vulnerable to diseases during germination and establishment and again during crop senescence at harvest. The risk of seed establishment problems is higher in diversified rotations than in cereal-based systems because dicot crops are more susceptible to damping-off (decay or death of seeds or seedlings) than are cereals. As the use of diverse susceptible crops increases, greater use of seed treatments may be required to reduce the risk of poor stand establishment under cool, wet conditions in the spring.
Disease levels usually increase within the crop during the growing season because the crop canopy closes, providing optimum conditions for the spread of microorganisms. By harvest, pathogen inoculum in the crop is often at high levels, and many crops are vulnerable to attack by foliar pathogens and even saprophytes. Periods of rainfall can delay harvest and result in yield and quality losses due to disease (Gossen and Morrall, 1984). Early maturing cultivars and planting dates that are staggered to permit rapid harvest can be used to reduce these risks. Varied planting dates also reduce the risk of all fields reaching a particular growth stage at the same time, which might coincide with weather conditions having a high risk of diseases like FHB.
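The within-season increase in disease described above is commonly approximated in plant epidemiology by a logistic disease-progress curve. The sketch below is a generic illustration of that shape, not a model from the sources cited; the initial disease level y0 and apparent infection rate r are illustrative assumptions.

```python
import math

# Generic logistic disease-progress sketch:
#   y(t) = 1 / (1 + ((1 - y0)/y0) * exp(-r * t))
# y0 (initial proportion of diseased tissue) and r (apparent infection
# rate per day) are illustrative assumptions, not values from the text.
def disease_progress(t_days, y0=0.001, r=0.15):
    return 1.0 / (1.0 + ((1.0 - y0) / y0) * math.exp(-r * t_days))

for t in (0, 30, 60, 90):
    print(f"day {t:3d}: diseased proportion = {disease_progress(t):.3f}")
```

The curve is slow early in the season, accelerates as the canopy closes, and approaches saturation near maturity, which is why delayed harvests expose crops to the steepest part of the epidemic.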
Maintaining a diverse and balanced population of biologically active microbial species creates a healthy soil and contributes to improved crop production. Many microbial species are natural enemies of soil-borne plant pathogens, affecting pathogens through competition, antagonism, or parasitism (Cook and Veseth, 1991). Although ecosystem function is mainly governed by soil microbial dynamics (Kennedy and Smith, 1995), the microbial component of the soil is largely unknown, with only a small fraction of existing species having been identified. Only 8% of the total soil C exists as microbial biomass, but microbes contribute 60 to 80% of the total metabolic activity in soil. The number of microbial propagules per gram of soil from the top 15 cm is approximately 10⁹ bacteria, 10⁸ actinomycetes, 10⁶ fungi, and 10⁵ algae (Brady, 1974, p. 115–116). Their roles in the soil include decomposition of organic matter, mineralization and immobilization of nutrients for plant growth, N fixation, environmental remediation, plant growth promotion, disease initiation by plant pathogens, and biological control for crop protection. In cereal–fallow cropping systems, microbial diversity is reduced by low organic matter and by the limited plant diversity that results from monoculture and weed eradication. Incorporating management practices that increase and stabilize microbial populations, such as manipulating the source (i.e., crop residue) and placement of microbial food substrates, will help to maintain a diverse and active population of microorganisms that antagonize root pathogens (Cook and Veseth, 1991).
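The order-of-magnitude propagule counts above (Brady, 1974) imply that bacteria dominate numerically by a wide margin. A back-of-envelope calculation makes the proportions explicit; the counts are the rough estimates cited in the text, not measurements.

```python
# Relative abundance of microbial propagules per gram of surface soil
# (top 15 cm), using the order-of-magnitude counts cited in the text
# (Brady, 1974). These are rough estimates, not measured values.
propagules_per_g = {
    "bacteria": 1e9,
    "actinomycetes": 1e8,
    "fungi": 1e6,
    "algae": 1e5,
}

total = sum(propagules_per_g.values())
for group, n in propagules_per_g.items():
    print(f"{group:13s}: {n:.0e} per g ({100 * n / total:5.2f}% of propagules)")
```

By count, bacteria make up roughly nine-tenths of the propagules, which is consistent with the text's point that a numerically small biomass fraction (8% of soil C) can still dominate metabolic activity.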
Reduced tillage practices enhance species diversity and support larger microbial populations in the upper layers of soil, whereas cultivation redistributes the microorganisms throughout the upper and lower soil layers (Doran, 1980; Kennedy and Smith, 1995; Lupwayi et al., 1998). In the wheat phase of different crop rotations, soil microbial biomass and bacterial diversity were greater in reduced tillage systems (Lupwayi et al., 1998, 1999). Ergosterol content (an indicator of fungal biomass) is greater with no tillage (Monreal et al., 2000). However, increases in microbial biomass may include increases in both beneficial and pathogenic microorganisms.
The crop residues that are available for microbial breakdown are determined by the crop rotation as well as by the crop species and cultivar selected in each phase of the rotation. This in turn determines the structure, nature, and size of the microbial community (Rothrock et al., 1995). Soil microbial biomass was greater in soils sown to wheat when the preceding crop was red clover (Trifolium pratense L.) and lowest after summer fallow (Lupwayi et al., 1999). Microbial diversity was higher under wheat preceded by leguminous crops (red clover and field pea) than following wheat or summer fallow (Lupwayi et al., 1998). Diversity in crop rotation provides a heterogeneous food base for microorganisms that offers more ecological niches and encourages microbial diversity. Reduced tillage contributes to this diversity because heterogeneous residues accumulate on the soil surface over time.
Under conventional tillage and traditional crop production systems, the soil's natural ability for biological control of plant diseases tends to decline. An exception is take-all disease of wheat grown in monoculture: continuous monocropping of wheat leads to increased suppressiveness of the soil to the take-all fungus due to microbial activity (Cook and Baker, 1983). Vilich and Sikora (1998) proposed the term biological system management to describe the concept of managing relationships between the biological elements of the agroecosystem and those of the crop production system to develop new or altered crop production techniques that contribute to a self-regulating system. Manipulation of agronomic factors relating to residue management, such as tillage, crop rotation, choice of crop species, and cultivar selection, provides the framework by which natural biocontrol may be enhanced. Cook and Veseth (1991) stated that "biological control is the flywheel of nature," and practices that upset the activities of natural agents should be avoided.
Agricultural systems use tillage to bury crop residues; level, consolidate, and warm the seedbed in the spring; reduce surface compaction; break up a hardpan; incorporate pesticides and fertilizers; and reduce weed and disease problems. After one or two tillage operations, little residue from the previous crop is left on the soil surface, which contributes to soil erosion by wind and water. Tillage increases water loss due to surface runoff and increases evaporation by leaving the soil exposed to heat and wind. Tillage also contributes to the loss of organic matter. In contrast, surface residues from reduced tillage operations act as an insulator, reflecting solar radiation and conserving soil moisture. Under reduced tillage, soils warm more slowly in the spring and are cooler during the growing season than soils under conventional tillage. Moisture retention is also improved under reduced tillage because of reduced evaporation, increased snow trapping, and reduced surface runoff due to better water infiltration (Cook and Veseth, 1991).
Reduced tillage increases the risk of foliar disease epidemics compared with conventional tillage because increased levels of primary inoculum are present on crop residue at the soil surface. Many pathogens continue to sporulate longer on soil surface residue than on residue buried by tillage. For example, Mycosphaerella pinodes (Berk. & Bloxam) Vestergren, the cause of mycosphaerella blight of pea, survives much longer in pea residue at the soil surface than when buried (Sheridan, 1973). However, the survival of other pathogens such as Col. truncatum, the cause of anthracnose of lentil (Lens culinaris Medik.), is higher on buried residue than at the soil surface (Buchwaldt et al., 1996).
Once a pathogen is established in the crop, local weather conditions determine how quickly the disease progresses. Therefore, levels of primary inoculum on crop residue are likely to have a more consistent and predictable impact on disease risk where conditions conducive to pathogen dissemination and infection (e.g., rainfall events) occur frequently than in regions like the northern Great Plains where conducive conditions occur sporadically. Changes in the prevalence of individual pathogens (and the associated risk of crop loss) in response to changes in tillage practice have been difficult to predict. For example, reduced tillage is associated with increased root diseases of wheat in parts of North America, but levels of common root rot were lower under reduced tillage than conventional tillage in a 12-yr trial in southern Saskatchewan (Bailey et al., 1992; Bailey and Gossen, unpublished data, 1999). Similarly, reduced tillage often results in increased foliar disease severity, but these differences are not always economically important in the more arid regions of the northern Great Plains (Anderson et al., 1999; Bailey and Duczek, 1996; Bailey et al., 1992, 2000).
With reduced tillage, other management practices may have an impact on diseases. For example, the application of glyphosate [N-(phosphonomethyl)glycine] herbicide reduces sporulation of Lep. maculans, the cause of blackleg on canola and oilseed rape (Br. napus L.) (Humpherson-Jones and Burchill, 1982; Petrie, 1995). This herbicide is commonly used in reduced tillage management systems where it may reduce levels of inoculum for blackleg. Good crop rotations should also be integrated with reduced tillage (Bockus and Shroyer, 1998).
Tillage operations may influence the soil microbial community through the placement of crop residues and the modification of soil moisture and soil temperature. Changes in tillage system can affect the risk of root rot epidemics because the higher soil moisture levels under reduced tillage affect the prevalence of fungal species. For example, the prevalence of Coc. sativus on cereal roots declined under reduced tillage, but the prevalence of Fusarium spp. increased (Bailey, 1996; Bailey and Duczek, 1996). Reduced tillage is occasionally associated with reduced survival of fungal overwintering structures, such as sclerotia of S. sclerotiorum, because it favors the bacteria that break down these structures (Nasser et al., 1995). Reduced tillage systems may increase the sustainability of production systems by taking advantage of natural biological processes for pest management.
Stand density has an important impact on disease risk because higher plant densities and denser plant canopies create conditions more conducive to disease increase and spread. Row spacing and seeding rate affect disease risk by changing the proximity of individual plants and plant parts, which influences the movement of pathogens from plant to plant. Seeding date can also affect canopy structure and stand density. Stand density influences air movement, shading, and moisture retention within the canopy. For example, narrower rows and denser canopies of grain legumes result in higher levels of damage by sclerotinia disease than wider rows and thinner stands (Blad et al., 1978; Grau and Radke, 1984; Haas and Bolwyn, 1972; Steadman et al., 1973). Similarly, soybean [Glycine max (L.) Merr.] cultivars with a compact, dense, vine-like canopy tend to have more severe symptoms of sclerotinia disease than those with an upright, open, bushy canopy (Schwartz et al., 1978). High levels of N fertilizer and frequent irrigation also promote the development of dense plant stands and increase the risk of sclerotinia disease on bean (Blad et al., 1978). Similarly, botrytis stem and pod rot (Botrytis cinerea Pers.) is often more severe in dense stands of lentil than in thinner stands (Gossen, unpublished data, 1999). Turkington and Morrall (1993) found that the incidence of sclerotinia disease in commercial canola crops was significantly related to canopy density. As noted with sclerotinia disease in soybean, plants in dense stands must compete for light, moisture, and nutrients and so may not fully express their disease resistance potential compared with plants in thinner stands (Pennypacker and Risius, 1999). Lower disease resistance of individual plants in dense stands would further increase disease risk.
Effects of management practices can be subtle. For example, increasing seeding rates may have the unintended effect of increasing primary disease inoculum in the field if the causal agent is carried on seed. Dense populations of weeds in a field may contribute to disease development by increasing the density of the plant canopy or serving as an inoculum reservoir in the absence of a susceptible host crop, or they may reduce disease incidence by trapping spores that might otherwise have landed on the susceptible crop (Duczek et al., 1996).
Balanced and adequate fertility for any crop reduces plant stress, improves physiological resistance, and decreases disease risk. For example, wheat fields with low levels of soil N often have higher levels of tan spot disease than adequately fertilized fields (Fernandez et al., 1998; Krupinsky et al., 1997a). Micronutrients may also play an important role in plant health. Chloride, applied as KCl, has been shown to reduce foliar and root rot diseases of small grains where soil Cl levels are low (Fixen et al., 1986). Copper deficiency was associated with increased leaf and head diseases in wheat at some sites (Evans et al., 1990; Franzen and McMullen, 1999). Research in Scotland on oilseed rape has shown that adequate S fertilization may help to decrease the severity of alternaria blackspot (Alternaria spp.) (Walker and Booth, 1994). Balanced fertility may be better maintained with a diverse crop rotation because each crop species has different nutritional requirements for optimum growth and development and so draws individual nutrients from the soil at different rates.