Elias Fereres, María Auxiliadora Soriano, Deficit irrigation for reducing agricultural water use, Journal of Experimental Botany, Volume 58, Issue 2, January 2007, Pages 147–159, https://doi.org/10.1093/jxb/erl165
Abstract
At present and more so in the future, irrigated agriculture will take place under water scarcity. Insufficient water supply for irrigation will be the norm rather than the exception, and irrigation management will shift from emphasizing production per unit area towards maximizing the production per unit of water consumed, the water productivity. To cope with scarce supplies, deficit irrigation, defined as the application of water below full crop-water requirements (evapotranspiration), is an important tool to achieve the goal of reducing irrigation water use. While deficit irrigation is widely practised over millions of hectares for a number of reasons—from inadequate network design to excessive irrigation expansion relative to catchment supplies—it has not received sufficient attention in research. Its use in reducing water consumption for biomass production, and for irrigation of annual and perennial crops is reviewed here. There is potential for improving water productivity in many field crops and there is sufficient information for defining the best deficit irrigation strategy for many situations. One conclusion is that the level of irrigation supply under deficit irrigation should be relatively high in most cases, one that permits achieving 60–100% of full evapotranspiration. Several cases on the successful use of regulated deficit irrigation (RDI) in fruit trees and vines are reviewed, showing that RDI not only increases water productivity, but also farmers’ profits. Research linking the physiological basis of these responses to the design of RDI strategies is likely to have a significant impact in increasing its adoption in water-limited areas.
Introduction
Forecasts of water withdrawals on a global scale predict sharp increases in future demand to meet the needs of the urban, industrial, and environmental sectors. This is because more than one billion people still lack access to running water or sanitary facilities, and because insufficient attention has been paid, until now, to meeting the water requirements of natural ecosystems. Given that the single biggest water problem worldwide is scarcity (Jury and Vaux, 2005), there is significant uncertainty about what the level of water supply will be for future generations.
Irrigated agriculture is the primary user of diverted water globally, reaching a proportion that exceeds 70–80% of the total in the arid and semi-arid zones. It is therefore not surprising that irrigated agriculture is perceived in those areas as the primary source of water, especially in emergency drought situations. Currently, irrigated agriculture is caught between two perceptions that are contradictory; some perceive that agriculture is highly inefficient by growing ‘water-guzzling crops’ (Postel et al., 1996), while others emphasize that irrigation is essential for the production of sufficient food in the future, given the anticipated increases in food demand due to world population growth and changes in diets (Dyson, 1999). Globally, food production from irrigation represents >40% of the total and uses only about 17% of the land area devoted to food production (Fereres and Connor, 2004). Nevertheless, irrigated agriculture is still practised in many areas in the world with complete disregard to basic principles of resource conservation and sustainability. Therefore, irrigation water management in an era of water scarcity will have to be carried out most efficiently, aiming at saving water and at maximizing its productivity.
Irrigation is applied to avoid water deficits that reduce crop production. The process of crop water use has two main components: one due to evaporation losses from the soil and the crop, usually called evapotranspiration (ET), and the other that includes all the losses resulting from the distribution of water to the land. All irrigation waters contain salts and, as water evaporates, salts concentrate in the soil profile and must be displaced below the root zone before they reach a concentration that limits crop production. Salt leaching is achieved by the movement of water applied in excess of ET. Thus, some of the water losses are unavoidable and are needed to maintain the salt balance; however, they can be minimized with efficient irrigation methods and by appropriate management. Reducing ET without a penalty in crop production is much more difficult, however, because evaporation from crop canopies is tightly coupled with the assimilation of carbon (Tanner and Sinclair, 1983; Monteith, 1990; Steduto et al., 2006). A water supply constraint that decreases transpiration below the rate dictated by the evaporative demand of the environment is paralleled by a reduction in biomass production. Given the high costs of irrigation development, until now the paradigmatic irrigation strategy has been to supply irrigated areas with sufficient water so that the crops transpire at their maximum potential and the full ET requirements are met throughout the season. This approach is increasingly challenged by segments of society in regions where water is scarce, because of both the large amounts of water required by irrigation and the negative effects that such diversions and use have on nature. Thus, a strategic change in irrigation management is taking place, one that limits the supply available for irrigation to what is left after all other sectors of higher priority satisfy their needs. Under such situations, farmers often receive water allocations below the maximum ET needs, and either have to concentrate the supply over a smaller land area or have to irrigate the total area with levels below full ET.
The application of water below the ET requirements is termed deficit irrigation (DI). Irrigation supply under DI is reduced relative to that needed to meet maximum ET (English, 1990). Therefore, water demand for irrigation can be reduced and the water saved can be diverted for alternative uses. Even though DI is simply a technique aimed at the optimization of economic output when water is limited, the reduction in the supply for irrigation to an area imposes many adjustments in the agricultural system. Thus, DI practices are multifaceted, inducing changes at the technical, socio-economical, and institutional levels. Nevertheless, the focus of this paper is on providing further understanding of the DI concept for biological scientists interested in the relationships between plants and water, leading to the broader issues that govern the optimization of a limited supply of water in crop production. Loveys et al. (2004) have proposed a number of physiological approaches to enhance irrigation practices under limited water conditions. Hopefully, the application of research conducted at the various levels of biological organization, that is, from molecular to whole plant physiology (Sinclair and Purcell, 2005), will offer new avenues for involving plant biologists in the improvement of DI practices in the future.
Features of deficit irrigation
In the humid and sub-humid zones, irrigation has been used for some time to supplement rainfall as a tactical measure during drought spells to stabilize production. This practice has been called supplemental irrigation (Cabelguenne et al., 1995; Debaeke and Aboudrare, 2004) and, although it uses limited amounts of water due to the relatively high rainfall levels, the goal is to achieve maximum yields and to eliminate yield fluctuations caused by water deficits. Furthermore, supplemental irrigation in humid climates has often been advocated as more efficient than irrigation in the arid zones, because the lower vapour pressure deficits of the humid zones lead to higher transpiration efficiency (Tanner and Sinclair, 1983). More recently, the term supplemental irrigation has been used in arid zones to describe the practice of applying small amounts of irrigation water to winter crops that are normally grown under rain-fed conditions (Oweis et al., 1998). In that case, supplemental irrigation is a form of DI, as maximum yields are not sought. Thus, the terms deficit and supplemental irrigation are not interchangeable, and each DI situation should be defined in terms of the level of water supply in relation to maximum crop ET. One consequence of reducing irrigation water use by DI is the greater risk of increased soil salinity due to reduced leaching, and its impact on the sustainability of irrigation (Schoups et al., 2005).
To quantify the level of DI it is first necessary to define the full crop ET requirements. Fortunately, since Penman (1948) developed the combination approach to calculate ET, research on crop water requirements has produced several reliable methods for its calculation. At present, the Penman–Monteith equation (Monteith and Unsworth, 1990; Allen et al., 1998) is the established method for determining the ET of the major herbaceous crops with sufficient precision for management purposes. There is, however, more uncertainty when using the same approach to determine the ET requirements of tree crops and vines (Fereres and Goldhamer, 1990; Dragoni et al., 2004; Testi et al., 2006).
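As a point of reference for what full ET means quantitatively, the sketch below implements the daily FAO-56 form of the Penman–Monteith reference ET equation described by Allen et al. (1998); the weather inputs and the crop coefficient in the example are hypothetical and serve only to illustrate the calculation.

```python
import math

def et0_penman_monteith(t_mean, rn, g, u2, ea, altitude):
    """Daily reference evapotranspiration ET0 (mm per day), FAO-56 form.

    t_mean   : mean air temperature (deg C)
    rn       : net radiation (MJ m-2 d-1)
    g        : soil heat flux density (MJ m-2 d-1), ~0 for daily time steps
    u2       : wind speed at 2 m (m s-1)
    ea       : actual vapour pressure (kPa)
    altitude : station elevation (m), used for the psychrometric constant
    """
    # Saturation vapour pressure (kPa) and slope of its curve (kPa per deg C).
    # Using es at the mean temperature is a simplification; FAO-56 averages
    # es(Tmax) and es(Tmin) for daily calculations.
    es = 0.6108 * math.exp(17.27 * t_mean / (t_mean + 237.3))
    delta = 4098.0 * es / (t_mean + 237.3) ** 2
    # Psychrometric constant from atmospheric pressure at the given altitude
    p = 101.3 * ((293.0 - 0.0065 * altitude) / 293.0) ** 5.26
    gamma = 0.000665 * p
    num = 0.408 * delta * (rn - g) + gamma * (900.0 / (t_mean + 273.0)) * u2 * (es - ea)
    den = delta + gamma * (1.0 + 0.34 * u2)
    return num / den

# Hypothetical mid-summer day; full crop ET is ET0 scaled by a crop coefficient Kc.
et0 = et0_penman_monteith(t_mean=28.0, rn=22.0, g=0.0, u2=2.0, ea=1.8, altitude=100.0)
etc = 1.15 * et0  # Kc = 1.15 is an illustrative mid-season value
print(f"ET0 = {et0:.2f} mm/day, full crop ET = {etc:.2f} mm/day")
```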
When irrigation is applied at rates below the ET, the crop extracts water from the soil reservoir to compensate for the deficit. Two situations may then develop. In one case, if sufficient water is stored in the soil and transpiration is not limited by soil water, even though the volume of irrigation water is reduced, the consumptive use (ET) is unaffected. However, if the soil water supply is insufficient to meet the crop demand, growth and transpiration are reduced, and DI induces an ET reduction below its maximum potential. The difference between the two situations has important implications at the basin scale (Fereres et al., 2003). In the first case, DI does not induce net water savings and yields should not be affected. If the stored soil water that was extracted is replenished by seasonal rainfall, the DI practice is sustainable and has the advantage of reducing irrigation water use. In the second case, both water use and consumption (ET) are reduced by DI but yields may be negatively affected. The challenge of quantifying the ET reduction effected by DI (net water savings) remains, as direct measurements are complex (Burba and Verma, 2005), and the models used to estimate the actual ET of stressed canopies are still quite empirical (Burba and Verma, 2006).
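The two situations described above can be illustrated with a minimal single-reservoir ('bucket') water balance in which ET proceeds at its maximum rate until a depletion threshold is crossed and is then reduced linearly, a simplification in the spirit of the FAO-56 stress coefficient; all parameter values below are hypothetical.

```python
def run_season(taw, p_threshold, et_max_daily, irrigations, rain, n_days=120):
    """Single-bucket root-zone water balance (all depths in mm).

    taw          : total available water in the root zone
    p_threshold  : fraction of TAW that can be depleted before ET is reduced
    et_max_daily : sequence of daily maximum crop ET
    irrigations  : dict {day: depth} of irrigation events
    rain         : dict {day: depth} of rainfall events
    Returns cumulative actual ET and cumulative applied irrigation.
    """
    depletion = 0.0           # root-zone depletion below field capacity
    raw = p_threshold * taw   # readily available water
    et_actual_sum, irrig_sum = 0.0, 0.0
    for day in range(n_days):
        # Linear ET reduction once depletion exceeds the readily available water
        if depletion <= raw:
            ks = 1.0
        else:
            ks = max(0.0, (taw - depletion) / (taw - raw))
        et_act = ks * et_max_daily[day]
        water_in = irrigations.get(day, 0.0) + rain.get(day, 0.0)
        irrig_sum += irrigations.get(day, 0.0)
        # Clamp between 0 (excess drains below the root zone) and TAW
        depletion = min(taw, max(0.0, depletion + et_act - water_in))
        et_actual_sum += et_act
    return et_actual_sum, irrig_sum

# Hypothetical comparison: full irrigation vs. 70% deficit irrigation, weekly events
et_max = [6.0] * 120
full = {d: 42.0 for d in range(7, 120, 7)}
deficit = {d: 0.7 * 42.0 for d in range(7, 120, 7)}
for label, schedule in (("full", full), ("deficit", deficit)):
    et_sum, i_sum = run_season(taw=180.0, p_threshold=0.5,
                               et_max_daily=et_max, irrigations=schedule, rain={})
    print(f"{label}: applied {i_sum:.0f} mm, actual ET {et_sum:.0f} mm")
```

Because the soil reservoir buffers part of the deficit, the reduction in seasonal ET in the deficit run is smaller than the reduction in applied water, which is the distinction drawn above between reduced irrigation water use and true net water savings.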
In many world areas, irrigation delivery at the farm outlet is less than what is required. The high costs of irrigation and its benefits offer a justification to expand the networks beyond reasonable limits in order to reach the highest possible number of farmers. This approach has been used in many countries and has led to chronic DI (Trimmer, 1990). Sometimes, the cropping intensity used in the original design becomes obsolete due to marketing reasons, and another of higher intensity and thus of greater water demand is adopted. Inadequate estimation of the crop water requirements in project design is another reason for insufficient network capacity. Finally, in drought periods, irrigated agriculture has the lowest priority and the delivery from irrigation networks may be drastically curtailed. In most of the cases described above, the farmers are at the mercy of the delivery agencies and there is very little margin for them to manage the limited supply efficiently (Tyagi et al., 2005). In particular, drought periods represent a threat to the sustainability of irrigation, not only because water supply is restricted, but also because of the uncertainty in determining when it will be available. Because of chronic water scarcity, in some areas inadequate irrigation supply is becoming the norm rather than the exception, as in Andalusia, Spain, where during the period between 1980 and 1995 in the Guadalquivir Valley, only in four years was there a normal irrigation supply (Fereres and Ceña, 1997). When the supply is restricted, farmers are often faced with having to use DI to achieve the highest possible returns. Even though the economics of DI are relatively straightforward (English, 1990), the reality is that there are many engineering, social, institutional, and cultural issues that determine the distribution and the management of irrigation water. Furthermore, in any attempt to optimize water use for irrigation, there is significant uncertainty in the anticipated results and, often, the alternatives that anticipate higher net returns also have higher risks (English et al., 2002). To reduce uncertainty and risk, computer models that simulate irrigation performance (Lorite et al., 2005), together with social research, can aid in assisting water managers to optimize a limited supply of irrigation water. Nevertheless, until now there has been little or no flexibility in most collective networks to manage irrigation with the degree of precision needed in optimal DI programmes, where controlling the timing of application is essential for avoiding the detrimental effects of stress.
Contrary to the rigid delivery schedules experienced by farmers in many collective networks, those who have access to water supply on demand, or who can irrigate directly from groundwater sources, can manage water with much more flexibility. The ability to adjust the timing and amount of irrigation makes it possible to design, and then to manage and control, the best possible DI programme when supply is restricted. Permanent, pressurized irrigation systems also make it possible to apply small amounts at frequent intervals, providing an additional tool for stress management. It is therefore possible in water-limited situations, if sufficient knowledge exists, to manage DI optimally with the objective of maintaining or even increasing farmers’ profits while reducing irrigation water use.
Deficit irrigation and water productivity
When water supplies are limiting, the farmer's goal should be to maximize net income per unit of water used rather than per unit of land. Recently, emphasis has been placed on the concept of water productivity (WP), defined here as either the yield or the net income per unit of water used in ET (Kijne et al., 2003). WP increases under DI relative to its value under full irrigation, as shown experimentally for many crops (Zwart and Bastiaanssen, 2004; Fan et al., 2005).
There are several reasons for the increase in WP under DI. Figure 1 presents the generalized relationship between yield and irrigation water for an annual crop. Small irrigation amounts increase crop ET more or less linearly up to a point where the relationship becomes curvilinear, because part of the water applied is not used in ET and is lost (Fig. 1). At a certain point (IM; Fig. 1), yield reaches its maximum value and additional amounts of irrigation do not increase it any further. The location of that point is not easily defined and thus, when water is not limited or is cheap, irrigation is applied in excess to avoid the risk of a yield penalty. The amount of water needed to ensure maximum yields depends on the uniformity of irrigation. In the simulation of Fereres et al. (1993), the seasonal irrigation depth required for maximum yield increased from 1.3 IM to 2.0 IM when the coefficient of uniformity decreased from 90% to 70%. Under low uniformity, irrigation efficiency decreases and water losses are high. By contrast, under DI the level of water application is less than IM and the losses are of much smaller magnitude (Fig. 1). Thus, under the situation depicted in Fig. 1, the WP of irrigation water under DI must be higher than that under full irrigation. Another, more realistic way of illustrating that WP is higher under DI is to display the distribution of irrigation water over a field in two dimensions (Fig. 2). Because water cannot be applied with perfect uniformity, variations in applied water over the field are ranked and plotted against the fraction of the area (Fig. 2). The depth of water is normalized against the required depth, XR, needed to refill the soil water deficit (Losada et al., 1990; Mantovani et al., 1995). Under full irrigation, the linear distribution of applied water intercepts XR at the 0.5 fraction of the total area. Thus, half of the field is over-irrigated and the other half has a deficit, the slope of the line being indicative of the distribution uniformity of the application method. Under DI, the depth of application is less than XR and, in the case of Fig. 2, all of the applied water remains in the root zone and may be used in ET. Evidently, in the case of DI in Fig. 2, the whole field has some soil water deficit after irrigation and there will be areas with a level of deficit that may be detrimental for production. The DI line of Fig. 2 emphasizes the need for irrigation systems of high application uniformity under DI (the lowest possible slope) to limit the level of deficit in the areas of the field that receive the lowest depths. It is also evident from Fig. 2 that the WP of irrigation water under DI must be higher than that under full irrigation.
Fig. 1. Generalized relationships between applied irrigation water, ET, and crop grain yield. IW indicates the point beyond which the productivity of irrigation water starts to decrease, and IM indicates the point beyond which yield does not increase any further with additional water application.
Fig. 2. Distribution of irrigation depth, X, as a function of fractional irrigated area: hypothesized relationships, resulting from the spatial distribution of irrigation water over a field, between the depth of water applied (X, normalized with respect to the required depth to refill the soil water deficit) and the fraction of the area irrigated, for full and deficit irrigation. Note that under full irrigation, 50% of the area receives water in excess of the required depth, XR, needed to refill the root zone.
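The argument of Fig. 2 can be reproduced numerically. The sketch below assumes a linear, ranked distribution of applied depth (normalized by the required depth XR) over the area fraction and splits the application into water stored in the root zone and water percolating below it; the spread parameter standing in for application uniformity is an illustrative assumption and is not taken from Losada et al. (1990) or Mantovani et al. (1995).

```python
import numpy as np

def split_application(mean_depth_ratio, spread, n=1000):
    """Split applied water into root-zone storage and deep losses.

    mean_depth_ratio : mean applied depth / required depth XR
                       (1.0 = full irrigation, <1.0 = deficit irrigation)
    spread           : half-range of the linear depth distribution, as a
                       fraction of the mean (larger = less uniform application)
    Depths are ranked from highest to lowest over the area fraction 0..1.
    """
    area_fraction = np.linspace(0.0, 1.0, n)
    # Linear, ranked distribution of the normalized depth X/XR over the field
    depth = mean_depth_ratio * (1.0 + 2.0 * spread * (0.5 - area_fraction))
    stored = np.minimum(depth, 1.0)   # the root zone cannot store more than XR
    lost = depth - stored             # the excess percolates below the root zone
    return depth.mean(), stored.mean(), lost.mean()

for label, ratio in (("full (mean = XR)", 1.0), ("deficit (mean = 0.7 XR)", 0.7)):
    applied, stored, lost = split_application(ratio, spread=0.3)
    print(f"{label}: applied {applied:.2f} XR, stored {stored:.2f} XR, "
          f"lost {lost:.2f} XR, beneficial fraction {stored / applied:.2f}")
```

With these illustrative numbers, the full-irrigation case loses part of the applied water below the root zone on the over-irrigated half of the field, whereas under the deficit target the whole application stays in the root zone, which is the graphical point made by Fig. 2.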
In addition to the factors associated with the disposition of irrigation water, WP is also affected by the yield response to irrigation. Yield responses to irrigation and to ET deficits have been studied empirically for decades (Stewart and Hagan, 1973; Vaux and Pruitt, 1983; Stewart and Nielsen, 1990; Howell, 2001). It turns out that not only is biomass production linearly related to transpiration, but the yield of many crops is also linearly related to ET, as shown in Fig. 1. The design of a DI programme must be based on knowledge of this response, but the exact characteristics of the response function are not known in advance. Also, the response varies with location, stress patterns, cultivar, planting dates, and other factors. In particular, many crops have different sensitivities to water stress at various stages of development, and the DI programme must be designed to manage the stress so that the yield decline is minimized. When the yield decline, in relative terms, is less than the ET decrease, WP under DI increases relative to that under full irrigation. Nevertheless, from the standpoint of the farmer the objective is not WP per se; net income, low risk, and other issues related to the sustainability of irrigation are more important. Knowledge of the crop response to DI is essential to achieve such objectives when water is limited.
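A common way of formalizing this linear yield–ET relation is the yield response factor Ky of Doorenbos and Kassam (1979), cited later in this review: 1 − Ya/Ym = Ky(1 − ETa/ETm). The sketch below uses that form to show that, whenever Ky < 1 (the relative yield decline is smaller than the relative ET decrease), WP rises as the ET deficit grows; the values of Ky, maximum yield, and maximum ET are purely illustrative.

```python
def relative_yield(et_ratio, ky):
    """FAO-33 linear yield response: 1 - Ya/Ym = Ky * (1 - ETa/ETm)."""
    return max(0.0, 1.0 - ky * (1.0 - et_ratio))

# Illustrative numbers only: Ym = 10 t/ha at ETm = 600 mm, Ky = 0.8
ym, etm, ky = 10.0, 600.0, 0.8
for et_ratio in (1.0, 0.9, 0.8, 0.7, 0.6):
    ya = ym * relative_yield(et_ratio, ky)
    eta = etm * et_ratio
    wp = (ya * 1000.0) / (eta * 10.0)  # kg m-3: 1 mm over 1 ha = 10 m3 of water
    print(f"ETa/ETm = {et_ratio:.1f}: yield {ya:.1f} t/ha, WP {wp:.2f} kg m-3")
```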
Deficit irrigation for biomass production
The close link between biomass production and water use makes it difficult to use DI when the objective is the production of total biomass. Nevertheless, one major irrigated crop in the arid zones that is grown for its biomass, alfalfa, has been the subject of many studies aimed at reducing its water consumption. Alfalfa WP is relatively low and its ET is quite high; in the western states of the USA, ET normally ranges from 900 mm to 1200 mm, but it can reach 1800 mm in desert areas (Grismer, 2001). A reduction in alfalfa transpiration due to water deficits is associated with a decrease in biomass production and there seems to be little opportunity to reduce its consumptive use (Sammis, 1981). However, because evaporative demand changes throughout the season, it may be possible to limit or eliminate irrigation in the months of high evaporative demand and to produce alfalfa in periods of low evaporative demand. Figure 3 presents the monthly ET requirements to produce 1 ton of alfalfa at Cordoba in Spain. In spring and autumn, the consumptive use is about half of what evaporates in the peak summer months. Thus, if irrigation is reduced during summer, the WP of alfalfa would increase (Tayfur et al., 1995). One limitation is that the longevity of the stand may be affected by summer water deficits (Ottman et al., 1996) and that may be related to the pattern of accumulation of root reserves (Rapoport and Travis, 1984) and to their role in regrowth following water deficits. This is one area where research in plant physiology can aid in determining the minimum irrigation levels during summer that would be required for optimal alfalfa DI.
Fig. 3. Average monthly consumptive use (ET) requirements for producing 1 metric ton of alfalfa at Cordoba, Spain. Biomass was calculated with a simple model that used transpiration efficiency values obtained by Asseng and Hsiao (2000) in Davis, CA, USA, the long-term average consumptive use of alfalfa at Cordoba, Spain, and a correction for root dry matter estimated from Rapoport and Travis (1984).
Deficit irrigation in annual crops
Harvestable yield of annual crops is normally a fraction of the biomass produced (Evans, 1993). Water deficits, by affecting growth, development, and carbon assimilation, reduce the yield of most annual crops (Hsiao and Bradford, 1983). The reduction in yield by water deficits is caused by a decrease in biomass production and/or by a decrease in the fraction of biomass that is harvested, termed the harvest index (HI). Note that reference is made here only to above-ground biomass production, because in most studies information on roots is scant, given the difficulties in quantifying root biomass under field conditions. Past research has shown that the response to water deficits depends very much on the pattern of stress imposed (Doorenbos and Kassam, 1979). In one frequently used pattern, the water deficit increases progressively as the season advances, due to the combination of uniform application of a reduced amount and the depletion of the soil water reserve. This pattern, hereafter called sustained deficit irrigation (SDI), allows water stress to develop slowly in soils with significant water storage capacity and the plants to adapt to the water deficits. Under an SDI regime, the differential sensitivity of expansive growth and photosynthesis to water deficits (Hsiao, 1973) leads to reduced biomass production under moderate water stress, owing to a reduction in canopy size and in radiation interception. However, dry matter partitioning is usually not affected and the HI is maintained. As the water stress increases in severity, though, there can be direct effects on the HI in many determinate crops, particularly when the post-anthesis fraction of total transpiration is too low (Fischer, 1979).
The response to SDI described above has been documented extensively in the major field crops and Fig. 4 exemplifies the response of maize, wheat, and sunflower. As biomass production (B) is reduced, the HI stays constant until it starts to decrease, in the case of Fig. 4, at about 60% of maximum biomass. That declining point varies in different reports and it can be less or more depending on the rate of development of water deficits, in turn determined by the root-zone water storage capacity and the evaporative demand. Deficit irrigation in this case should be designed within a domain where the HI is conserved at its maximum value; that is, at irrigation targets that produce at least 60% of maximum biomass. That SDI regimes should be designed at relatively high levels of irrigation supply has been verified in numerous experiments summarized in wheat by Musick et al. (1994), by many recent DI experiments in China, primarily with maize and wheat (Li et al., 2005; Zhang et al., 2005), and by Oweis and co-workers in the Middle East working with grain legumes (Oweis et al., 2004, 2005).
Fig. 4. Harvest index (HIR) as a function of biomass production (BR) in response to water deficits. Both are expressed relative to the values observed under full irrigation, and all data were obtained in experiments conducted under sustained DI. The maize data are from Farré and Faci (2006), the sunflower data from Soriano et al. (2002), and the wheat data from two 4-year experiments reported by Ilbeyi et al. (2006).
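For planning purposes, the behaviour shown in Fig. 4 can be encoded as a piecewise-linear function in which relative HI is held at 1 down to a threshold relative biomass (about 0.6 in Fig. 4) and declines linearly below it; the threshold and the shape of the decline are illustrative assumptions, since, as noted above, the breakpoint varies with soil water storage, evaporative demand, and crop.

```python
def relative_harvest_index(b_rel, threshold=0.6, hi_at_zero=0.0):
    """Relative HI as a function of relative biomass under SDI (piecewise linear).

    HI is conserved (HI_R = 1) while B_R >= threshold, then assumed to decline
    linearly towards hi_at_zero as biomass approaches zero.
    """
    if b_rel >= threshold:
        return 1.0
    return hi_at_zero + (1.0 - hi_at_zero) * b_rel / threshold

def relative_grain_yield(b_rel, threshold=0.6):
    """Relative grain yield = relative biomass x relative harvest index."""
    return b_rel * relative_harvest_index(b_rel, threshold)

# The yield penalty grows faster than the biomass penalty once HI starts to fall,
# which is why SDI targets should stay above the threshold biomass level.
for b_rel in (1.0, 0.8, 0.6, 0.5, 0.4):
    print(f"B_R = {b_rel:.1f} -> HI_R = {relative_harvest_index(b_rel):.2f}, "
          f"Y_R = {relative_grain_yield(b_rel):.2f}")
```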
There are a few major crops where the HI response differs from that of Fig. 4. Figure 5 presents the HI–B relationship for grain sorghum and for two cotton cultivars under SDI. When sorghum is subjected to mild-to-moderate stress, its HI increases above that of full irrigation, in particular in deep, open soils such as the Yolo loam in Davis, CA, USA (Hsiao et al., 1976). As stress increases in severity, HI is conserved until it starts to decline at levels below 0.4 BR (Fig. 5). In the case of cotton, an indeterminate crop, the HI of one cultivar (Coker-310; Fig. 5) increases significantly over a wide range of water deficits, while the HI of another cultivar (Jaen) does not vary much with water deficits. The two varieties differed in maturity date; Jaen in the environment where it was grown was able to complete the process of fruit development in all water treatments, while the maturation of Coker-310 fruits was enhanced by water deficits relative to full irrigation (Orgaz et al., 1992). Because cotton cultivars are chosen to maximize potential yield, SDI is an excellent tool to match the water supply available to the maturity date of a given cultivar. Thus, in the cases of crops such as sorghum and cotton (Fig. 5), DI can and should be used to achieve maximum WP and profits by growing the crop at ET levels below its maximum potential.
Fig. 5. Harvest index (HIR) as a function of biomass production (BR) in response to water deficits for sorghum (closed circles) and cotton (open and crossed squares) under SDI regimes. The sorghum data are from Farré and Faci (2006) and Faci and Fereres (1980). The open squares are for the cotton cv. Coker-310, and the crossed squares for cv. Jaen. The cotton data were originally reported by Orgaz et al. (1992).
The differential sensitivity of crop yield to water deficits at different developmental stages has been a classic topic of research (Taylor et al., 1983). Figure 6 shows, for maize and sunflower, the responses to pre- and post-anthesis deficits, relative to the response to SDI in the HI–B plot. The well-known response to post-anthesis stress is negative and should be avoided by appropriate irrigation scheduling. The increase in HI in response to pre-anthesis stress of Fig. 6, similar to that shown in cotton and sorghum under SDI (Fig. 5), offers an opportunity to use DI to achieve higher WP and profits at ET levels below the maximum. One practical limitation is that stress during the vegetative phase reduces leaf area, and that such a reduction can have an effect on the partitioning of ET into evaporation and transpiration, favouring evaporation and negating some of the potential improvement in WP. Perhaps a change in planting patterns could overcome this limitation by using smaller plants and increasing planting density, although the duration of the vegetative phase is quite short in intensive production systems and thus the potential ET reduction in this phase may be limited.
Fig. 6. Harvest index (HIR) as a function of biomass production (BR) in response to pre- and post-anthesis water deficits for maize (circles) and sunflower (triangles). The maize data are from Farré (1998) and from NeSmith and Ritchie (1992a, b), and the sunflower data are from Soriano et al. (2002). The dashed line is the same as that depicted in Fig. 4 for SDI regimes. (DPre, DPost, DFl: water deficits during pre-anthesis, post-anthesis, and at flowering, respectively.)
The basis for designing DI strategies lies in the response of the HI to the watering regime. Thus, it would be desirable to have a model that could predict the HI response to water supply. Sadras and Connor (1991) proposed a model for predicting the HI of sunflower and other determinate crops as a function of post-anthesis transpiration. The model calculates HI, corrected for biomass composition, as a function of the fraction of transpiration that occurs post-anthesis, normalized for the vapour pressure deficit of that period. In that model, HI is nearly constant as long as the normalized fraction of post-anthesis transpiration does not fall below about 0.2. Data from two sunflower experiments conducted at Cordoba were fitted to the model of Sadras and Connor (1991) and the resulting curve is plotted in Fig. 7. The results of Fereres and Soriano (Soriano, 2001), obtained under SDI, do not fit their relationship (Fig. 7), perhaps because the experiments of Fereres and Soriano were conducted in open, deep soils whereas those of Sadras and Connor (1991) were done in pots. It was also found that the model is very sensitive to small changes in the date of anthesis, as a shift of a few days could displace the curve significantly. Furthermore, when the model was tested with data from other treatments, including one with N limitation (Fig. 8), the curve obtained for the SDI treatment did not fit those data, as there were substantial differences in HI for the same fraction of post-anthesis transpiration (Fig. 8). It appears that there are no simple answers to modelling HI, at least when the imposed water stress patterns differ from those of SDI regimes. However, if the best DI strategy is to impose a sustained deficit throughout the season, the assumption of a constant HI over a range of mild-to-moderate water deficits is supported by strong experimental evidence in most of the major crop plants, as discussed above.
Fig. 7. Harvest index (HIPV) as a function of the post-anthesis transpiration fraction (fTVPD) for sunflower, standardized with respect to the production value (PV = amount of biomass produced per unit of hexose substrate; Penning de Vries et al., 1974) of biomass and to the vapour pressure deficit (VPD), respectively. The dashed line represents the Sadras and Connor (1991) model, HIPV = fTVPD/[1 − (a − b fTVPD)], with a = 0.91 and b = 1.63; a and b were obtained following the derivation of Sadras and Connor (1991) but using original data obtained in Cordoba in two sunflower experiments under an SDI regime. Closed circles and triangles are from an SDI regime measured in a summer experiment reported by Soriano (2001); squares are from an SDI regime under high N measured in a spring experiment in 1985 (Álvarez, 1987). The continuous line represents the best fit to all the Cordoba experimental data: HIPV = 0.116 ln(fTVPD) + 0.608; r2 = 0.89.
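Both curves in Fig. 7 can be evaluated directly from the expressions given in the caption; the sketch below implements the Sadras and Connor (1991) functional form with the Cordoba-fitted coefficients (a = 0.91, b = 1.63) and the logarithmic best fit, exactly as reported there.

```python
import math

def hi_pv_sadras_connor(f_tvpd, a=0.91, b=1.63):
    """Sadras and Connor (1991) form (dashed line in Fig. 7):
    normalized HI as a function of the post-anthesis transpiration fraction."""
    return f_tvpd / (1.0 - (a - b * f_tvpd))

def hi_pv_log_fit(f_tvpd):
    """Best fit to all the Cordoba data (continuous line in Fig. 7, r2 = 0.89)."""
    return 0.116 * math.log(f_tvpd) + 0.608

for f in (0.1, 0.2, 0.3, 0.4, 0.5):
    print(f"fTVPD = {f:.1f}: model {hi_pv_sadras_connor(f):.2f}, "
          f"log fit {hi_pv_log_fit(f):.2f}")
```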
Fig. 8. Harvest index (HIPV) as a function of the post-anthesis transpiration fraction (fTVPD). Closed triangles are from a DI regime with pre-anthesis deficits, while open triangles are from another DI regime in the same experiment with post-anthesis deficits, as reported by Soriano (2001). Squares are from an SDI regime under N limitation (no N fertilizer applied) measured in the 1985 experiment of Álvarez (1987). The two lines are the same as those depicted in Fig. 7.
Deficit irrigation in fruit trees and vines
Deficit irrigation so far has had significantly more success in tree crops and vines than in field crops for a number of reasons (Fereres et al., 2003). First, economic return in tree crops is often associated with factors such as crop quality, not directly related to biomass production and water use. The yield-determining processes in many fruit trees are not sensitive to water deprivation at some developmental stages (Uriu and Magness, 1967; Johnson and Handley, 2000). Because of their high WP, tree crops and vines can afford high-frequency, micro-irrigation systems that are ideally suited for controlling water application and thus for stress management (Fereres and Goldhamer, 1990). From the standpoint of water conservation, a given reduction in water supply to trees and vine canopies will be translated into a greater decrease in transpiration than in field crops, leading to more net water savings. This is because tall, rough canopies are better coupled to the atmosphere than the short, smooth canopies of field crops (Jarvis and McNaughton, 1986), and a reduction in stomatal conductance is scaled up to a greater extent in the canopies of tree crops and vines.
Traditionally, fruit tree irrigation recommendations allowed for some stress development (Veihmeyer, 1972) and there has been awareness of the benefits of water stress in some aspects of fruit production such as fruit quality for a long time (Uriu and Magness, 1967). The imposition of water stress at certain developmental periods could therefore benefit yield and quality in fruit tree and vine production. The concept of regulated deficit irrigation (RDI) was first proposed by Chalmers et al. (1981) and Mitchell and Chalmers (1982) to control vegetative growth in peach orchards, and they found that savings in irrigation water could be realized without reducing yield. Even though similar results were reported for pears (Mitchell et al., 1989), RDI was found not to be as successful in other environments (Girona et al., 1993). Nevertheless, experiments with RDI have been successful in many fruit and nut tree species such as almond (Goldhamer et al., 2000), pistachio (Goldhamer and Beede, 2004), citrus (Domingo et al., 1996; González-Altozano and Castel, 1999; Goldhamer and Salinas, 2000), apple (Ebel et al., 1995), apricot (Ruiz-Sánchez et al., 2000), wine grapes (Bravdo and Naor, 1996; McCarthy et al., 2002), and olive (Moriana et al., 2003), almost always with positive results. Thus, there is sufficient evidence at present that supplying the full ET requirements to tree crops and vines may not be the best irrigation strategy in many situations (Fereres and Evans, 2006).
Regardless of the type of irrigation programme used, there is a need to develop scientific irrigation scheduling procedures (Fereres, 1996). In particular, if DI is used, monitoring the soil or plant water status is even more critical for minimizing risk, given the uncertainties in determining the exact water requirements. In the case of fruit trees, because of the complexity of monitoring root-zone water status under localized irrigation, plant-based methods for detecting water deficits have important advantages (Fereres and Goldhamer, 1990). Jones (2004) has reviewed recent advances in plant-based methods for irrigation scheduling and has thoroughly described the many options currently available. One of the major limitations of currently established methods in fruit trees, such as the measurement of stem water potential (SWP), is the high labour requirement of the monitoring process. Alternative methods that can be automated, such as dendrometry (Goldhamer and Fereres, 2001), have relatively high variability (Intrigliolo and Castel, 2004; Naor et al., 2006) and thus require additional research to reduce the uncertainty in their use before they can be recommended for adoption. Infrared thermometry and thermal imaging may be a very promising option for stress monitoring in trees and vines (Jones, 2004), as shown in a recent report on stress detection in olive trees from infrared imagery (Sepulcre-Cantó et al., 2006).
The mechanisms responsible for the lack of yield decline under RDI have been explored (Chalmers et al., 1986; Girona et al., 1993). The obvious explanation is that high sensitivity of expansive growth of the aerial parts to water deficits must affect the partitioning of assimilated carbon, as photosynthesis is unaffected by mild water deficits. It has been shown that root growth is favoured under water deficits (Sharp and Davies, 1979; Hsiao and Xu, 2000), and partitioning to fruit growth must also be unaffected (Gucci and Minchin, 2002). More research is needed to elucidate the basis for observed responses, in view of the interactions between water stress and crop load (Naor et al., 1999). One feature of the yield response of tree crops to ET deficits is that, contrary to the linearity observed in annual crops (Fig. 1), the response appears to be curvilinear (Moriana et al., 2003). This means that WP is highest at low levels of water application and that DI is the appropriate irrigation strategy.
Another developmental period when water deficits may be applied safely is between harvest and leaf fall. Johnson et al. (1992) found that, in peach, relatively severe water deficits may be imposed during that period, although severe stress increased the number of double fruits and other fruit-shape disorders the following year. The RDI response is very dependent on the timing and degree of severity of the water deficits, as well as on crop load (Marsal and Girona, 1997). There are significant differences among species, however. For instance, Goldhamer et al. (2006) showed that, of the DI regimes they tested in almond trees, an SDI regime was the least detrimental to yield. The results of this study, summarized in Fig. 9, indicate that, for the same level of applied water, yields were less affected under SDI than under two RDI regimes that biased the water deficits either pre- or post-harvest (Fig. 9). The treatment with post-harvest stress had a significant decline in fruit number due to carry-over effects, with a reduction in the number of fruiting buds the following year (Goldhamer et al., 2006). By contrast, fruiting density was enhanced above the control in the pre-harvest RDI treatment, although tree canopy and nut sizes were reduced (Goldhamer et al., 2006).
Fig. 9. Response of almond yield to three deficit irrigation regimes: average results of a 4-year experiment conducted in California in which three different DI regimes were applied. Drawn from data of Goldhamer et al. (2006).
One limitation of many studies on RDI is that comparisons among treatments are often not fair, because the amounts of water applied in the different DI treatments are not the same. A long-term experiment has been conducted on a peach farm located on deep alluvial soil near Cordoba, where SDI was compared with RDI using the same seasonal amount of water. The RDI regime concentrated the application of water in the period of rapid expansion of fruit growth (Stage III), while SDI applied the water throughout the irrigation season. In both DI treatments, water application was about two-thirds that of the control. Figure 10 presents the evolution of SWP for the three treatments during the fourth experimental year (2005). In RDI, SWP declined in early summer to values about twice those of the control and the SDI treatment. Recovery of SWP in RDI following irrigation was rapid, reaching control values in <1 week, while the SWP of SDI declined during Stage III to values that were 0.3 MPa lower than in the other two treatments. Following harvest, irrigation was interrupted in RDI but continued in the other two treatments. The yield response, shown in Table 1, indicates that the RDI treatment had the same yield and fruit size as the control despite its lower water status, while SDI had a 10% decrease in yield and a 15% reduction in fruit size, both statistically significant. This was despite the fact that the value of SWP integrated over the season was more negative in RDI than in the other two treatments (Table 1). Nevertheless, during the period of rapid fruit growth, the absolute value of the SWP of RDI was less than that of SDI (Table 1). From this experiment it can be concluded that, for the same amount of applied water, RDI is advantageous over SDI in peach production.
Table 1. Stem water potential (integrated over the irrigation season and over the RDI irrigation period; see Fig. 10), yield (t ha−1), and fruit volume (cm3) in three irrigation treatments (RDI, SDI, and full irrigation, FI) in the fourth year (2005) of a peach experiment near Cordoba, Spain

| Treatment | Integrated SWP, season^a (MPa) | Integrated SWP, RDI irrigation period (MPa) | Yield (t ha−1) | Fruit volume (cm3) |
| --- | --- | --- | --- | --- |
| RDI | −125.6 | −34.7 | 48.1 a | 178 a |
| SDI | −108.1 | −39.2 | 43.8 b | 155 b |
| FI | −86.7 | −31.2 | 49.2 a | 171 a |

Means followed by a different letter (within a column) are significantly different at the 0.05 probability level according to LSD.
^a Irrigation season (1 May to mid-September).
Fig. 10. Seasonal patterns of stem water potential (SWP, MPa) of peach trees in response to the irrigation treatments (RDI, SDI, and full irrigation) during the fourth experimental year (2005). Fruit growth stages (I, II, and III) are shown and the arrow H indicates the harvest date. Error bars indicate ±standard error.
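The integrated SWP values of Table 1 summarize a seasonal SWP curve such as the one in Fig. 10 as a single stress indicator. The sketch below shows one straightforward way of computing such an integral from periodic measurements by trapezoidal integration; the measurement dates and SWP values are invented for illustration and are not the experimental data.

```python
# Trapezoidal integration of stem water potential (SWP) over time gives a single
# seasonal stress indicator, as reported in Table 1. The values below are
# hypothetical and do not reproduce the Cordoba peach data.
days = [0, 15, 30, 45, 60, 75, 90, 105, 120]                   # days from 1 May
swp = [-0.5, -0.6, -0.9, -1.4, -1.1, -0.7, -0.8, -1.0, -1.2]   # MPa

def integrate_swp(t, psi):
    """Trapezoidal integral of SWP over the measurement period (MPa x days)."""
    total = 0.0
    for i in range(1, len(t)):
        total += 0.5 * (psi[i] + psi[i - 1]) * (t[i] - t[i - 1])
    return total

print(f"Seasonal integrated SWP: {integrate_swp(days, swp):.1f} MPa days")
```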
Experience that full irrigation is not the best strategy abounds in many perennial horticultural crops, but in none is it more evident than in wine grapes. The quality of wine in semi-arid areas is strongly associated by enologists with water stress (Williams and Matthews, 1990) to the point that, as an example, irrigation of vineyards was forbidden by law in Spain until 1996. Nevertheless, the benefits of RDI to the yield and quality of wine grapes have been clearly demonstrated relative to rain-fed production (Girona et al., 2006). Among the techniques used for imposing RDI on wine grapes is one that alternates drip irrigation about every 2 weeks on either side of the vine row; this is called partial root drying (PRD) (Dry and Loveys, 1998). The PRD technique has its foundation in the root-to-shoot signalling that regulates the plant response to drying soil (Davies and Zhang, 1991; Dodd, 2005). Shoot physiological processes are affected by root signalling, including leaf expansion (Passioura, 1988). The control of vegetative growth is of paramount importance in the production of high-quality wine grapes (Loveys et al., 2004), and it has been shown that PRD controls canopy growth and is advantageous over full irrigation in wine production (McCarthy et al., 2002). There have been commercial applications of PRD and the system has already been tested in vineyards located in many environments (Dos Santos et al., 2003; Girona et al., 2006).
The PRD technique has also been tested in other crops, notably fruit trees. While positive results have been reported (Kang et al., 2000), it appears that, when meaningful comparisons have been carried out under field conditions that avoid the interaction between the amount of irrigation water and its mode of placement, PRD has not improved the crop response over an RDI regime applying the same amount of water, as shown in peach (Goldhamer et al., 2002), apple (Leib et al., 2006), and olive (Wahbi et al., 2005), among others. Nevertheless, PRD is a useful water application technique that, by reducing the number of emission points wetting the soil at any one time, alters the partitioning of ET between evaporation and transpiration. The reduction in evaporation under PRD, relative to an RDI regime with twice the number of emitters, increases the WP of a limited supply of water. The alternate wetting in PRD also reduces drainage losses relative to a regime that always wets the same side of the plant row (Kang et al., 2000). Another observation that needs exploration is that the alternate wetting and drying of PRD promotes root growth (Mingo et al., 2004). If this is confirmed in fruit trees, recovery following stress periods could be enhanced by PRD. For instance, the time for recovery of SWP in the RDI treatment in Fig. 10 was about 1 week, and any shortening of that period would have had a positive influence on fruit expansion rate, and probably on fruit size.
Conclusion
Today, irrigation is the largest single consumer of water on the planet. Competition for water from other sectors will force irrigation to operate under water scarcity. Deficit irrigation, by reducing irrigation water use, can aid in coping with situations where supply is restricted. In field crops, a well-designed DI regime can optimize WP over an area when full irrigation is not possible. In many horticultural crops, RDI has been shown to improve not only WP but also farmers’ net income. It would be important to investigate the basis for the positive responses to water deficits observed in the cases where RDI is beneficial. While DI can be used as a tactical measure to reduce irrigation water use when supplies are limited by droughts or other factors, it is not known whether it can be used over long time periods. It is therefore imperative to investigate the sustainability of DI via long-term experiments and modelling efforts, to determine to what extent it can contribute to a permanent reduction of irrigation water use.
Acknowledgements
We acknowledge the support of grants from INIA (RTA02-070) and the European Commission DIMAS project (INCO-CT-2004-509087), and the skilled technical assistance of C Ruz in the peach experiment.