Silvicultural strategies for Fraxinus excelsior in response to dieback caused by Hymenoscyphus fraxineus

Ash dieback caused by the invasive alien fungal pathogen Hymenoscyphus fraxineus often has devastating consequences for the survival, growth and wood quality of Fraxinus excelsior. We analyse the silvicultural implications of ash dieback in forest stands in Europe and review the advice on how to modify management accordingly. We draw on the literature as well as unpublished observations and personal experience. The relevant strategy depends on the management objective, the site type (moist or dry), the stand type (pure or mixed stands, even-aged or uneven-aged stands), the age and the degree of dieback. Generally, the strategy should be conservative and trees that are healthy or slightly damaged may be marked and retained. Where dieback is severe, the suggested approach is to harvest remaining commercial timber before value depreciation and to regenerate or replant the area with other tree species. In forests of high value for habitat conservation, it may be advisable to let natural succession proceed unhindered. In all situations, forestry practice plays a key role in implementing in situ and ex situ conservation strategies for ash by preserving trees with low damage levels in all phases of stand development. Wherever there are infected ash trees, risks for operational staff, forest visitors and infrastructure posed by damaged, destabilized ash trees must be minimized.


Introduction
European ash (Fraxinus excelsior L.) is a valuable hardwood species in forest ecosystems as well as in the open landscape, and contributes significantly to rural economies, landscape aesthetics, other ecosystem services and biodiversity (Jahn, 1991; Roloff and Pietzarka, 1997; FRAXIGEN, 2005; Dobrowolska et al., 2011; Thomas, 2016). The abundance and importance of ash varies in different European countries, as do management objectives.
Although ash is an important component of forests in Europe, it accounts for less than one per cent of the forest area (13 000 km²/1 665 500 km²; Hemery, 2008). Regionally or locally it can occupy large tracts of land, and occasionally occurs in pure stands or at high frequencies in mixed stands such as in the limestone valleys of England's Peak District, in many lowland and upland forests in northern and eastern France and in the floodplain forests along the Rhine in Germany and the Danube in Austria. In such regions, ash is very important and contributes significantly to local rural economies.
Ash is a flexible species and is found on many different site types. It particularly favours dry, shallow calcareous soils and moist, fertile alluvial soils with a pH greater than 5 (Wardle, 1961; Marigo et al., 2000; Thomas, 2016). Ash does not tolerate stagnant water and grows best on moist soils with a winter water table of 40-100 cm below the soil surface (Kerr and Cahalan, 2004). For good growth, soils rich in nutrients, particularly nitrogen, are necessary.
The disease has led to widespread dieback, tree mortality and grave concern for the future of ash in most of Europe. The immediate impacts as well as the long-term consequences of the disease can be serious and may lead to substantial reduction or even loss of ash at numerous locations (Pautasso et al., 2013; Lygis et al., 2014; McKinney et al., 2014). Due to the rapid and widespread occurrence and intensification of the disease, the management of ash has become increasingly challenging. Management modifications in established stands tend to be reactive to the damage levels caused by the disease, and the establishment of new stands is presently not encouraged.
In this paper, we analyse the silvicultural implications of crown dieback, collar lesions and collar rot in ash in forest stands in Europe and review the advice on how to modify management accordingly. We draw on the literature as well as unpublished observations and personal experience. Throughout, we consider any unknowns and best guesses as hypotheses for scientific testing.
Following an overview of the role of primary and secondary agents of ash decline and dieback, the review includes summaries of the genetically determined level of tolerance to H. fraxineus and of the influence of climate and site factors on dieback severity. Next, a simple method is outlined for the assessment of ash dieback severity. The remaining part of the review is devoted to silvicultural practices, their impact on ash dieback and the options for future management.
The review is not restricted to silvicultural implications caused by H. fraxineus, but considers a wider range of associated problems which forestry practice cannot separate completely from each other. The review refers mainly to ash in high forest systems where commercial timber production is an essential component of the management objective, and does not include coppice and pollarding practices. The special case of conservation forests of non-intervention status is mentioned only briefly.

Primary and secondary agents of ash decline and dieback
Ash dieback caused by H. fraxineus is a primary disease affecting foliage, shoots, bark and wood. Ascospores produced in apothecia on fallen leaf rachises in the litter from the previous year are wind-dispersed during the summer between June and September and are the main source of infection by H. fraxineus (Timmermann et al., 2011; Gross et al., 2012, 2014; Cleary et al., 2013).
Characteristic macroscopic symptoms of the disease include necrotic lesions on leaves, leaf rachises and in the bark of shoots, wilting and premature shedding of foliage and dieback of shoots, twigs and branches (Kowalski, 2006; Skovsgaard et al., 2010; Cleary et al., 2013; Gross et al., 2014; McKinney et al., 2014). On woody parts, elongated perennial dry cankers develop, accompanied by brownish to greyish wood discolouration.
Disease severity tends to be less on trees of greater stem diameter (Skovsgaard et al., 2010), or the disease takes longer to severely damage larger trees (Husson et al., 2012; Marçais et al., 2016; Queloz, 2016; Havrdová et al., 2017). Vigorous trees may respond with prolific regrowth on affected shoots and development of epicormic branches in the crown, leading to a characteristic, bushy appearance. Epicormic branches on the stem indicate high disease susceptibility in the crown (Jakobsen, 2011; Enderle et al., 2015) and may act as entry points of H. fraxineus into the stem, leading to wood discolouration.
Trees affected by ash dieback may develop necrotic lesions in the bark and wood discolouration on the lower part of the stem and on surface roots. The aetiology of this damage is still debated (Enderle et al., 2013; Hauptman et al., 2016), but there is evidence that it is caused by H. fraxineus and appears as a primary symptom of ash dieback (Husson et al., 2012; Langer et al., 2015; Chandelier et al., 2016; Marçais et al., 2016). Honey fungus (Armillaria spp.), most commonly A. gallica Marxm. & Romagn. and A. cepistipes Velen, is also often present (Skovsgaard et al., 2010), and is believed to be a secondary agent to H. fraxineus (Husson et al., 2012; Enderle et al., 2013; Rosenvald et al., 2015; Chandelier et al., 2016; Hauptman et al., 2016; Marçais et al., 2016). Honey fungus, recognizable by white mycelial fans and black rhizomorphs in necrotic bark tissue, kills cambium and bark and causes wood decay at the butt and in roots. This, in turn, destabilizes trees and aggravates their decline.
We refer to damage at the stem base caused by H. fraxineus as basal or collar lesions and, where it occurs in conjunction with Armillaria, as collar rot, indicating their location at the stem base in the root collar area, i.e. in the transition zone between stem and root system. Crown damage and the incidence or severity of collar lesions generally correlate well (Husson et al., 2012; Chandelier et al., 2016), although collar lesions also occur on trees with little or no crown dieback (Enderle et al., 2013).
Dying, defoliated and recently killed trees may all act as hosts for breeding ash bark beetles (Hylesinus fraxini Panzer, H. crenatus Fabricius, H. oleiperda Fabricius and H. wachtli orni Fuchs (Coleoptera: Scolytinae)). Where breeding material is unavailable or occurs in low quantities, ash bark beetles can be considered secondary pests because they rarely cause tree death (Escherich, 1923; Postner, 1974; Bejer, 1979; Boas, 1896–98, 1923) and are also not associated with dieback due to H. fraxineus (Skovsgaard et al., 2010). In contrast to breeding, maturation feeding and the excavation of hibernation cavities by ash bark beetles occur in healthy trees. Either or both may lead to the formation of bark proliferations, so-called ash roses (photographs in Ehnström and Axelsson, 2002; Zúbrik et al., 2013), and sometimes an associated deterioration of wood quality due to minor wounds or cracks in the bark (Boas, 1923; Bejer, 1979).

Forestry
With increasing levels of ash dieback and, consequently, increasing quantities of dying ash trees and deadwood that remain in the forest, ash bark beetles may multiply to unprecedented levels. In particular, H. fraxini sometimes proliferates in the wake of ash dieback but, so far, primary damage due to this species in stands affected by ash dieback has been reported only from Styria in Austria (Pfister, 2012).
The role of genetics in tolerance to ash dieback

Several studies have reported that a low proportion of genotypes, typically 1-5 per cent of the population, may possess a durable, high, but partial resistance to H. fraxineus (McKinney et al., 2011, 2014; Pliūra et al., 2011; Kjaer et al., 2012; Stener, 2013; Enderle et al., 2015; Havrdová et al., 2016). Most studies on resistance of ash to H. fraxineus assessed crown damage, but there is also a genetic component in susceptibility to collar lesions (Muñoz et al., 2016). More-or-less 'resistant' trees may show symptoms of ash dieback but are better able to tolerate the pathogen such that disease expression is relatively minor. The expression of tolerance in terms of ash dieback symptoms may vary among individuals as well as over time. In line with common terminology (Agrios, 2005), we consequently refer to such trees as tolerant of H. fraxineus.
There are studies of the Holocene migration lineages (cpDNA haplotypes) and the genetic variation (nuclear diversity) within populations of ash (Heuertz et al., 2004a, 2004b; FRAXIGEN, 2005), but so far none of these have been related directly to the genetically determined level of tolerance to H. fraxineus in natural populations. Some variation in susceptibility has been observed, but the within-provenance variation was much higher than that between provenances (Pliūra et al., 2011; Metzler et al., 2012; Enderle et al., 2013; Havrdová et al., 2016). Moreover, observations from several countries indicate that ash genotypes tolerant of H. fraxineus occur across Europe rather than in specific regions.
In situ and ex situ conservation strategies for ash have been proposed as a means to cope with ash dieback. While natural adaptation of ash to H. fraxineus may be hindered by population fragmentation and large distances between tolerant trees, ex situ conservation of tolerant genotypes and breeding for tolerance, assisted by contemporary molecular genetic tools (Harper et al., 2016; Sollars et al., 2017), could be a long-term management option for ash dieback (McKinney et al., 2014).
Forestry practice can play a key role in implementing in situ and ex situ conservation strategies by preserving trees with exceptionally low damage levels in all phases of stand development. Consequently, retaining such trees is one of the key recommendations in many countries (e.g. Skovsgaard et al., 2009; Thomsen and Skovsgaard, 2009, 2012; Kirisits and Cech, 2010; Metzler et al., 2013; Freinschlag, 2014, 2015).
In severely damaged stands, the retention of tolerant ash trees for in situ conservation may not be a useful measure for future stand development because the number of such trees is too low or because it may not be justified from an economic perspective. It is still, however, recommended to preserve the few best trees with minimal crown damage and no root collar lesions in order to facilitate the possible long-term adaptation of ash populations to H. fraxineus. Felling all ash trees regardless of their health condition carries the risk that potentially tolerant genotypes are eliminated.
Ash stands in areas where there is no intervention, such as forests for habitat conservation, are also important for in situ conservation of ash. Letting ash dieback 'run its course' without human intervention may illustrate which and how many trees exhibit durable tolerance after many years of repeated infection by H. fraxineus. This may result in the temporary or permanent reduction or total loss of ash from the forest, but is in line with the concept of natural selection and will select for trees with high disease tolerance.
For ex situ conservation of ash, tolerant trees may be secured by grafting and placed in clonal orchards and can form the basis for breeding populations (Clark, 2014; McKinney et al., 2014; Pliūra et al., 2014; Kirisits et al., 2016). Various research projects in breeding for tolerance against H. fraxineus have already been initiated in several countries. Moreover, molecular markers for tolerance of H. fraxineus in ash were recently developed (Harper et al., 2016) and are subject to further intensive research. There is a good possibility that tolerant plant material will be available in the future, enabling foresters to retain ash as a viable timber species in Europe.

The role of climate and site in ash dieback severity
Observations from forestry practice as well as scientific investigations identify climate and site as important factors influencing disease progression. There is a consensus that soil moisture, air humidity, air temperature, or factors that correlate well with these, influence the impact of H. fraxineus (Skovsgaard, 2008; Schumacher, 2011; Husson et al., 2012; Kessler et al., 2012; Enderle et al., 2013; Hauptman et al., 2013; Hietala et al., 2013; Havrdová, 2015; Dvorak et al., 2016; Marçais et al., 2016; Muñoz et al., 2016), but few investigations have quantified the effect of individual factors or any interactions.

Soil moisture
Crown dieback as well as collar rot correlate with soil moisture, with trees on moister sites being more severely affected (Husson et al., 2012; Enderle et al., 2013; Marçais et al., 2016; Muñoz et al., 2016; Thomsen et al., 2017a). This is due to higher inoculum quantities on sites with moister soils and thus a greater infection pressure by H. fraxineus and subsequently by Armillaria spp. Moreover, site-dependent fluctuations in groundwater and variation in local topography may directly influence the development of crown dieback through their impact on rooting depth and drainage (Skovsgaard, 2008; Jakobsen, 2011; Ahlberg, 2014).
Observations in a thinning experiment in Denmark may serve as an example of these effects. Here, the dry spring of 2008 led to more severe damage on a moist floodplain site with restricted root development than on a nearby site with well-drained glacial till in high terrain (Skovsgaard, 2008). The floodplain site had fluctuating groundwater that, due to drought, retreated below its usual minimum. Subsequent crown recovery was better on the well-drained site, with vigorous regrowth of shoots in summers with ample rainfall.
Another example can be found in floodplain forests of the Czech Republic. Ash growing in gleysol or fluvisol, with or without an admixture of black alder (Alnus glutinosa (L.) Gaertn.), was more affected than ash on wind-exposed sites (ridges, peaks and steep slopes) or on shallow soil over bedrock (Černý et al., 2016).

Air humidity and temperature
Air humidity has a fundamental influence on sporulation, spore release and the rate of infection, with stands on moist sites being at higher risk of infection by ash dieback (Schumacher, 2011; Hietala et al., 2013; Havrdová, 2015; Dvorak et al., 2016; Thomsen et al., 2017a). However, there is likely to be variation in how these factors influence the spread of the pathogen and emergence of the disease depending on precipitation pattern, soil type, local topography and ash genotype.
Consistent with these observations, laboratory experiments with nursery transplants have shown that H. fraxineus is sensitive to high temperatures and a dry climate, which may lead to lower levels of dieback in such conditions (Hauptman et al., 2013). The very hot and dry summer of 2015 in Central Europe hampered apothecia formation of H. fraxineus (unpublished observations by B. Metzler, R. Enderle, T. Kirisits and L. Havrdová; Lenz and Mayer, 2016) and evidently, few infections took place. Premature leaf fall due to infections of leaves, which is a common observation on ash in this region in August, did not occur.
Overall, it may be questionable whether one or a few individual years with weather conditions unfavourable for sporulation of and infection by H. fraxineus play any significant role in the long-term development of ash dieback severity. This is especially so because H. fraxineus can postpone sporulation and repeatedly form apothecia on rachises over several years, thereby forming an inoculum reservoir in the leaf litter (Gross and Holdenrieder, 2013; Gross et al., 2014; Kirisits, 2015).

The combined effect of multiple factors
In a large-scale survey of ash in operational stands in the Czech Republic (Havrdová et al., 2017), the frequency and severity of crown dieback increased with increasing temperature (i.e. decreasing altitude), site productivity, soil moisture and proximity to a water course (i.e. increasing air humidity and soil moisture). Moreover, crown damage levels increased with decreasing stand age and with increasing stocking density, both of which reflect canopy closure and, to some extent, management practices. Topographic heterogeneity modified the extent of dieback. Due to collinearity and interaction, the effect of specific variables could not be quantified completely, but essential explanatory variables all correlated with air temperature and soil moisture and, to some degree, soil fertility.

Disease escape
Finally, it should also be noted that climatic conditions unfavourable for H. fraxineus can lead to 'disease escape' in ash individuals. Disease escape occurs when susceptible plants do not become diseased, or not severely diseased, due to a lack of coincidence between the host and the pathogen or due to environmental conditions that are not conducive to disease development (Agrios, 2005). Apparently, disease escape is part of the reason why solitary and small groups of ash trees in the open landscape are often less affected than trees in the forest (Thomsen, 2014;Queloz, 2016).

Climate and site summary
In summary, ash dieback is more severe on year-round moist and humus-rich sites than on drier sites. Consequently, dry calcareous sites may be among the best sites for ash in the future, not only because of species fitness but also because of lower infection pressure and the low incidence of collar lesions and rot. Although ash is sometimes planted on dry nutrient-poor sites and on swampy sites, these site types will remain unsuitable for ash. Moreover, vigorous trees can better compensate for the effects of H. fraxineus (Enderle et al., 2015; Havrdová et al., 2017).
The observed response pattern may relate to two independent, and for ash dieback, unquantified mechanisms: (1) When climate and site conditions promote the well-being of ash this leads to better growth and an improved potential for recovery. (2) When climate and site conditions are less favourable for the pathogen, throughout the season or temporarily during phases of sporulation or spore dispersal, this may hamper conditions for infection.
When quantified, such mechanisms could possibly help explain observed seasonal or annual fluctuations in disease severity as well as site-specific variation in disease progression.

Silvicultural challenges due to ash dieback
Before the occurrence of ash dieback, when the production of high-quality timber was a main management objective, silvicultural prescriptions would generally aim to maximize the growth rate of individual trees rather than of the whole stand (Dobrowolska et al., 2011; Wilhelm and Rieger, 2013). Due to its inherent natural pruning ability and low frequency of epicormic branches, ash would be thinned heavily to maximize diameter growth, once natural pruning had progressed sufficiently. Because of the ring-porous wood structure of ash, this regime would result in improved wood quality, as wider annual rings lead to a larger proportion of latewood and, in turn, higher basic density, increased elasticity and strength (Kollmann, 1941; Oliver-Villanueva and Becker, 1993). Moreover, high growth rates of stem diameter would minimize the risk of brown or black heart, a discolouration of stem wood that is visually significant but unimportant for physical wood properties, and that tends to increase in incidence and extent at a given site with increasing age (Thill, 1970; Kerr, 1998).
The number of potential final crop trees would depend on the target diameter at breast height (dbh) and would typically correspond to a crown diameter slightly larger than 20 times the target dbh (Hein, 2003; Hemery et al., 2005). For example, for a target dbh of 60 cm, a final crown diameter of 13-14 m would be envisaged for ash. The planned number of potential final crop trees would consequently be 60 trees ha⁻¹ or fewer for even-aged pure stands of ash (Dobrowolska et al., 2011; Wilhelm and Rieger, 2013). For other stand types, this number would be modified according to age structure and species composition. Due to market demands for white timber and the risk of brown heart, the target diameter would sometimes be set as low as 40 cm, corresponding to a final crop of 155-180 trees ha⁻¹.
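As a quick numerical check on these figures, the rule of thumb can be sketched as follows (a minimal illustration; the square-spacing assumption and the function names are ours, not taken from the cited sources):

```python
def crown_diameter_m(target_dbh_cm, factor=20.0):
    """Final crown diameter (m) from target dbh (cm) via the ~20x rule of thumb."""
    return factor * target_dbh_cm / 100.0

def stems_per_ha(crown_m):
    """Final crop trees per hectare, assuming a square grid with one
    tree per crown-diameter cell (an illustrative simplification)."""
    return 10_000.0 / crown_m ** 2
```

With the 13-14 m final crown diameter quoted above, this gives roughly 51-59 stems ha⁻¹, consistent with the recommendation of 60 or fewer, while an 8 m crown (40 cm dbh) gives about 156 stems ha⁻¹, near the lower end of the 155-180 range.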
Although the thinning strategy outlined above was ideal for producing ash timber of high quality in the absence of dieback, there were numerous alternatives and variations to this regime (Dobrowolska et al., 2011), including thinning for individual trees with full crown release of sometimes less than 40 crop trees ha −1 (Wilhelm and Rieger, 2013). When specific targets are disregarded it becomes increasingly difficult or impossible to plan for future stem densities once the production cycle is disrupted by the occurrence of ash dieback, simply because the canopy breaks up and the future health and quality of individual trees become less predictable.
There are four interrelated immediate effects of ash dieback that may have direct implications for silviculture: (1) increased mortality, (2) reductions in growth rate, (3) reductions in wood quality, (4) reductions in the mechanical stability of individual trees.
Where these challenges occur collectively, some or all of the original stand management objectives may no longer be tenable.

Increased mortality
In silviculture, stand density control is key to achieving several management objectives, including wood production. Increased mortality in ash stands may lead to loss of stand density control. The immediate effect is that dead trees leave gaps in the canopy. This can be a serious problem for future stand management, because of reduction in growth at stand level and because re-stocking at gap level is often critical for practical as well as economic reasons. In turn, increasing levels of light at the forest floor may lead to excessive ground vegetation. This can be a significant challenge in ash stands, particularly if grasses dominate. First, if the stand is to be regenerated, either by planting or natural regeneration, a dense cover of ground vegetation may severely hamper seedling establishment (Willoughby and Jinks, 2009). Second, dominance of grass in the ground vegetation may reduce the growth of ash (Bedford and Pickering, 1919, p. 268), mainly by below-ground competition (Bloor et al., 2008).
In summary, increased mortality may lead to increased regeneration costs or even prevent successful regeneration. If the stand is to be regenerated prior to full rotation the problem is often more severe because more ground vegetation is usually present in younger stands. Re-stocking restricted to clumps or clusters at large spacing offers an interesting option under such conditions (Wilhelm and Rieger, 2013; Mettendorf and Vetter, 2016). Each cluster could for example include 10-25 trees planted either in natural gaps, at a regular distance of 15-25 m between clusters, or irregularly to promote spatial variability in the succeeding stand, depending on management objectives and local site and stand conditions. The choice of tree species should be carefully considered in relation to local light conditions as well as to soil and other site factors.
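The cluster figures above translate into rough planting-stock requirements as follows (an illustrative sketch; the function name and the square-grid assumption are ours, not from the cited sources):

```python
def cluster_planting_stock(trees_per_cluster, cluster_spacing_m):
    """Plants needed per hectare for cluster re-stocking.

    Assumes one cluster per square grid cell of side cluster_spacing_m,
    a simplification of the regular 15-25 m spacing described above.
    """
    clusters_per_ha = 10_000.0 / cluster_spacing_m ** 2
    return clusters_per_ha * trees_per_cluster
```

At 20 m spacing with 15 trees per cluster this gives 375 plants ha⁻¹, far fewer than full-area replanting would typically require.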

Reductions in growth performance and wood quality
Reduction in growth is an immediate and critical consequence of crown dieback (Thomsen and Jørgensen, 2011; Metzler et al., 2012), usually leading to an increase in rotation length at both the stand and the individual tree level. Obviously, growth reduction per se leads to reduction in economic revenue. The ultimate result of reduced growth may be loss of tree vigour and, for some trees, eventually tree death.
As ash is a ring-porous tree species, reductions in growth rate will lead to reductions in wood quality. This problem, however, is marginal compared to the risk of stem wood discolouration and decay. The increased risk of this damage in ash affected by dieback is due to emerging epicormic branches, oxygen influx through dead branches and rot developing in the root collar area (Schmidt, 2006).
Epicormic branches are uncommon in healthy ash, but often proliferate on trees affected by ash dieback. While this may help alleviate poor growth rates in infected trees due to greater photosynthetic capacity, once epicormic branches become infected, they may act as entry points of H. fraxineus to the stem causing wood discolouration. Consequently, epicormic branches are critical indicators of potentially reduced wood quality when they occur on the merchantable part of the stem.
Root collar infections lead to wood discolouration and subsequent rot at the basal part of the stem. This usually extends no more than 1 m in height above the ground, but the affected stem parts must be discarded in the process of preparing logs for sale.
As a side-effect, dying trees are commonly infested by ash bark beetles leading to further discrimination of logs on the timber market. Additionally, wood deterioration caused by saprobic decay fungi increases the longer dead trees with salvageable timber are left in the forest.

Reductions in tree stability
Ash trees affected by dieback may rapidly lose their structural integrity and anchorage in the soil because of collar and root rot. The reduction in individual tree stability will increase risks for forest personnel, forest visitors, traffic along roads and forest edges and damage to adjacent infrastructure (Metzler et al., 2013; Kirisits and Freinschlag, 2014).

Assessing trees and stands for ash dieback
To aid decision-making for silvicultural prescriptions of stands affected by ash dieback, damage appraisals, incorporating inspections of the tree crown and the stem base, are advisable. In forestry practice, crown dieback is most commonly identified 'at a glance', but should be assessed by careful observation based on a combination of macroscopic symptoms. Collar lesions are usually easy to detect on young trees with smooth bark. On older trees, however, rough bark or moss may hinder their detection considerably. For both crown dieback and collar lesions, there are numerous photo guides at national level to help ensure accurate disease diagnosis.
The extent and severity of ash dieback can be rated for individual trees as well as at stand level. An individual tree should be assessed on its survival potential as well as its potential for timber production. At stand level, assessments should rate the potential for the stand to close undesired canopy gaps. In some circumstances, survival potential and contribution to stand closure are sufficient criteria to keep a tree in the stand. In others, the commercial potential of the individual tree may be decisive.
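The two-part appraisal described above might be encoded along the following lines (a hypothetical sketch only: the thresholds and action categories are ours for illustration and are not taken from Appendix 1 or any national guideline):

```python
def rate_ash_tree(crown_dieback_pct, collar_lesion):
    """Suggest a silvicultural action for a single ash tree.

    Hypothetical thresholds illustrating the combined crown/stem-base
    appraisal described in the text; not from any published guideline.
    """
    if collar_lesion:
        return "remove"   # destabilized tree: safety and timber-value risk
    if crown_dieback_pct <= 25:
        return "retain"   # candidate tolerant tree for in situ conservation
    if crown_dieback_pct <= 75:
        return "monitor"  # reassess survival and timber potential later
    return "remove"       # survival potential low
```

At stand level, the share of 'retain' trees and their spatial distribution would then indicate whether canopy gaps can be closed without replanting.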

Silviculture for ash with dieback
The key problems in adjusting silviculture to alleviate or mitigate the effects of ash dieback are that no efficient measures for prevention or delay and no efficient treatment or cure are known. This is typical for many infectious diseases of forest trees caused by invasive alien pathogens. Some fungicides, e.g. prochloraz and those with active ingredients from the triazole, strobilurin, succinate dehydrogenase inhibitor (SDHI), quinone or organosulphur groups (Defra, 2016; Hrabětová et al., 2016), are effective in controlling mycelial growth and useful especially in forest nurseries. The number of applications necessary for efficient control is so high, however, that this is an inefficient and costly method, and it would not be recommended in forest situations due to environmental concerns.
For ash, breeding for disease tolerance is a medium- to long-term option, ultimately providing seed from orchards consisting of genotypes with heritable tolerance. Meanwhile, silviculture must act based on how the disease has developed for ash of different genetic origin, under different site conditions, in different forest types, and for a range of observed stand treatment practices.
Based on the specified circumstances, silvicultural practices should be modified and targeted mainly to alleviate the immediate consequences of ash dieback, but also towards ensuring potentially disease tolerant ash has a future role in our forests. The silvicultural strategy for dealing with infected stands should depend on the original management objective, site conditions, stand type, age and the extent of dieback (Thomsen and Skovsgaard, 2012; Metzler et al., 2013).
In this section, we discuss the effect that site conditions (moist or dry), stand age (young or old), forest type (mixed, pure or conservation forest) and different silvicultural practices (tending, thinning, harvesting of mature timber or regeneration) may have on the disease, and to what extent a modification in silviculture may possibly alleviate the progress and impact of dieback. We also comment on the possible use of other species of Fraxinus. Overall, we recommend a conservative approach that aims at maintaining ash in forest ecosystems while salvaging commercial timber values and reducing costs associated with the disease.

Site conditions
The potential influence of site conditions on the extent and severity of ash dieback indicates that the risk associated with a given site type should be considered carefully when modifying silvicultural practices. While ash on year-round moist and humus-rich sites generally suffers more from dieback than ash on drier sites, the development of dieback also depends on climate characteristics and weather conditions.
No reliable forecast can be given in operational terms for this combination of influential factors. The best advice is to observe the development at tree and stand level at regular intervals and act accordingly. Stand replacement options also depend on site conditions.

Stand age
The transition from tending to thinning (Figure 1) is probably the most critical stage of stand development when considering silvicultural prescriptions for stands with ash dieback. This includes the period when trees are too tall for formative pruning, but have not grown tall enough to develop a sufficiently long, merchantable bole. At this stage of stand development it is usually too late for replacement planting and too early for commercial timber sale. This means that the costs associated with either replacing or continuing the crop of ash are high. If the stand of declining ash is replaced, the investment in the already established crop will be lost. If the stand is preserved for continued management, a reduction in future economic revenue can be expected.

Figure 1 Ash dieback in the perspective of some main factors in silviculture. Silviculture may be defined as a planned sequence of forest operations to pursue original or modified management objectives. The three main activities in silviculture, tending, thinning and regeneration, always operate under the frame conditions set by site characteristics, tree genetics and forest type. We identify the transition from tending to thinning as the most critical phase. At this stage of stand development it is usually too late for replacement planting and too early for commercial timber sale.

Forestry
For young stands, the main aim is to identify putatively tolerant trees and promote their long-term survival and wood quality. For older stands, the aim is to delay the final harvest for as long as possible without jeopardizing wood quality. In either case, severely infected stands with ash as the only or dominant tree species should be felled and replanted. Prior to this decision, the extent and severity of dieback should be assessed objectively (Appendix 1). Note specifically that the intensity of premature leaf shedding (usually beginning in August) should never be used to decide on future management. Unprofitable removal of young ash may be of limited use or even disadvantageous. The risk is that any tolerant ash trees will be removed from the stand and that undesirable ground vegetation will further proliferate.
The decision to refrain from clearing a stand completely will often depend mainly on the number of healthy trees within it and their (potential) commercial value: can the remaining trees close canopy gaps created by dieback-related mortality, can their value be maintained and for how long? In addition to age, the decision may be influenced by site conditions (moist or dry), forest type (mixed or pure), thinning practice (heavy thinning, light thinning or thinning only for crop trees), management objective and management approach (e.g. plantation forestry vs close-to-nature forestry, commercial forestry vs nature conservation, management for recreation or management for aesthetics).
Forest type: mixed vs pure stands

Throughout Europe, the general impression is that mature ash and ash in mixed stands suffer less, with disease progression tending to be faster in pure stands of young ash. So far, this remains essentially unquantified, except for three studies with somewhat contrasting results.
In young naturally regenerated stands in Latvia, ash mixed with other tree species in various proportions had more crown dieback when mixed with Norway spruce (Picea abies (L.) Karst.) or Norway maple (Acer platanoides L.), less when mixed with small-leaved lime (Tilia cordata Mill.), grey alder (Alnus incana (L.) Moench) or black alder (A. glutinosa (L.) Gaertn.), and least when mixed with silver birch (Betula pendula Roth) or European aspen (Populus tremula L.); stands of (essentially) pure ash ranked intermediate (Pušpure et al., 2017). In a large-scale survey of ash in operational stands in the Czech Republic it was noted that in mixed stands the extent of crown dieback was highest in the presence of oak (Quercus spp.), beech (Fagus sylvatica L.) or birch (Betula spp.), while stands with an admixture of pine (Pinus spp.), fir (Abies spp.) or maple (Acer spp.) had lower levels of dieback (Havrdová et al., 2017). It was hypothesized that this could relate to biogeochemical litter characteristics having an effect on the sporulation of H. fraxineus. In the northeast of France, neither crown dieback level nor collar lesion severity related to the relative proportion of ash in forest stands.
Observations in Denmark indicate that an understorey of conifers, including western red cedar (Thuja plicata D. Don), tends to reduce the extent or severity of ash dieback, but only marginally as compared with the influence of tree genetics (H.C. Graversgaard and J.P. Skovsgaard pers. comm.). Observations in floodplain forests in Austria indicate that this may also be true for an understorey of broadleaved tree species (M. Bubna pers. comm.). In Poland (D. Dobrowolska pers. comm.), there appears to be less ash dieback in uneven-aged, multi-storied stands sparsely populated by ash (<10 per cent by wood volume) and with an admixture of black alder (Alnus glutinosa (L.) Gaertn.), silver birch (Betula pendula Roth), European silver fir (Abies alba Mill.) and Norway spruce (Picea abies (L.) Karst.).
Collectively, these observations are somewhat in contrast to evidence for the influence of climate and site conditions on ash dieback, namely that higher air humidity within the stand leads to more dieback: air humidity is usually higher in multi-storeyed stands or in stands with an understorey of conifers than in even-aged pure stands. One possible explanation is that the positive effect of the admixture overrides microclimate through the general mixed-forest effect, whereby some species apparently thrive better when growing together with other species because of the differentiation in resource utilization. Another possible explanation is that there is less inoculum in mixed stands simply because there are fewer ash trees. However, in the study in northeast France there was no statistical correlation between the frequency of F. excelsior in forest stands and the number of H. fraxineus apothecia on rachises in litter at the base of ash trees. The role of admixed tree species in the epidemiology of ash dieback consequently requires further study.
Overall, mixed stands are the most flexible in terms of management options for the future. If the proportion of ash is not too great, one may rely on natural regeneration of tolerant individuals and of other species to fill in the gaps and compensate for the loss of ash. If the dieback is severe, future management options approach those for pure stands of ash.

Forest type: pure stands
When plantation ash in pure stands is infected, management decisions soon become limited and generally more critical than for mixed stands. At a stand level, we differentiate between severely infected stands and stands with a high proportion of apparently healthy trees. There is no sharp demarcation between either of these. Management decisions, therefore, often rely on an assessment of canopy or gap closure potential and the risk of losing commercial value.
In severely infected commercial stands, the reasonable solution from a managerial perspective will usually be to fell only if this is profitable and to replant. Healthy or slightly damaged ash trees may serve temporarily as shelter trees for the next generation of forest, even if the remaining ash trees are young. Depending on the area and the number of trees involved, however, the operational costs of subsequent interventions to remove originally maintained trees, which subsequently decline in health, may be overly high, especially in the case of older trees.
In stands with a sufficiently high proportion of 'healthy' trees, disease progression should be observed over some years (Appendix 1). This will allow the forest manager time to plan for alternative management options.
For pure stands of ash, additional tree species should be introduced. For stands that are clearcut the species should be changed. If the number and health of trees remaining on the area allows for maintenance of overhead shelter, this may help ensure a gradual transition to the next generation of forest. Otherwise, the abrupt change by planting on clearcut land may be unavoidable. Moreover, the cost of regeneration and operational efficiency may influence this decision.
Due to species-specific expectations on rotation length, a change of tree species will influence the balance of age classes both within an individual stand and at the estate level. Important factors for the outcome of this process include light demands, potential growth rate, potential longevity and the commercial potential of the species involved.
On moist sites, black alder is an obvious candidate. In addition to tree species already mentioned, a number of potentially invasive species can be candidates for the next generation of forest, either because they are already present on or near the area and may spread, or because they could be planted. These include, for example, the non-native black locust (Robinia pseudoacacia L.), honey locust (Gleditsia triacanthos L.), maple species such as box elder or ashleaf maple (Acer negundo L.) and tree-of-heaven (Ailanthus altissima (Mill.) Swingle) (e.g. Richardson and Rejmánek, 2011 and their online supporting information), but even native European species such as sycamore may, outside of their natural range, act as invasive (Binggeli, 1992;Hein et al., 2009;Straigyte and Balickas, 2015). It varies considerably with local conditions and management philosophy whether invasive species are considered an asset or a threat, but in or near conservation forest they inevitably rank as undesired foreign elements in the ecosystem.

Forest type: non-intervention conservation forest
Various types of forest for habitat conservation are often managed without thinning or harvesting intervention. There is a strong case, therefore, for 'doing nothing and letting nature take its course' in such areas. This holds for mixed as well as monospecific stands of ash. Additional considerations of alternative options regarding the ecological role of ash can be found elsewhere (Pautasso et al., 2013;Mitchell et al., 2014, 2016).

Tending
In young stands there are four immediate management options which can be considered: (1) pruning of infected tissue, (2) formative pruning, (3) removal or treatment of autumn foliage, (4) reducing the shrub and herbaceous layers.
Pruning of branches can potentially remove infected tissue; however, since fungal spread extends well beyond the visible lesion in the bark, judging precisely where to cut in order to achieve complete exclusion of the pathogen is difficult in practice (Marčiulynienė et al., 2017). In contrast to pollarded landscape trees (e.g. Bengtsson et al., 2013) and urban ash trees (e.g. Roloff, 2016), ash in high forest is rarely pruned because natural pruning usually progresses sufficiently fast and costs are not justified. Moreover, pruning can promote adventitious shoot regeneration which in turn may become infected and promote disease development on the larger stem (Thomsen and Skovsgaard, 2009, 2012). On balance, the risks outweigh the possible benefits and pruning of ash in high forest stands is not recommended.
Formative pruning is sometimes used in forestry on young trees before canopy closure to encourage the development of a single straight stem (Bulfin and Radford, 1998a, 1998b, 2001). For ash, however, the effect on stem form is insignificant for any level of formative pruning (Kerr and Morgan, 2006). In line with this result, we discourage formative pruning.
Routine removal and destruction of autumn foliage to reduce the local source of fungal inoculum may be somewhat effective (Danquah and Costanzo, 2013) depending on the level of humidity and the inoculum quantity in the vicinity. This practice may be beneficial in decreasing infection rates in urban environments (Thomsen, 2014;Queloz, 2016) and nurseries, but is labour intensive, expensive and not feasible at the forest scale.
Fungicide treatment of pseudosclerotial leaf rachises has been shown to inhibit the development of apothecia of H. fraxineus in the laboratory (Hauptman et al., 2014), but its effectiveness in the field is questionable. Moreover, large-scale use of fungicides in the forest can pose risks for the environment and is not recommended. Treatment of leaf rachises with urea in the laboratory also inhibits the formation of apothecia. Urea accelerates the breakdown of leaf debris by the stimulation of saprobic fungi and has a positive influence on microorganisms acting as antagonists of the ash dieback pathogen (Hauptman et al., 2014).
Finally, reducing the shrub and herbaceous layers (especially grasses) may reduce air humidity in the immediate vicinity, thereby reducing potential spore production and dispersal. However, at the operational level such practices would be overly expensive based on currently available technologies.

Thinning: experimental evidence
The influence of thinning practice or stocking density on young ash infected by H. fraxineus has been studied in a thinning experiment in Denmark (Thomsen and Skovsgaard, 2006, 2007;Skovsgaard, 2008;Skovsgaard et al., 2010;Jakobsen, 2011;Bakys et al., 2013;Ahlberg, 2014). A single thinning intervention resulted in widely contrasting residual stem densities (100-2500 trees ha−1) replicated in six blocks at four different locations. The proportion of trees suffering from dieback tended to decrease with decreasing stem density or stand basal area, but only marginally as compared with the influence of site and genetic constitution (Jakobsen, 2011;Bakys et al., 2013). Moreover, trees in strictly unthinned control plots tended to suffer more from dieback as judged by reductions in the proportion of crown foliage. In some plots, all trees were killed. This clearly related to flooding events or fluctuations in groundwater (Jakobsen, 2011;Ahlberg, 2014), possibly triggered by the reduction in canopy foliage due to thinning.
The stem quality of pre-selected potential future crop trees was generally better in plots with 1500 residual trees ha−1 than in unthinned control plots (≥2500 trees ha−1) and in plots with 500 trees ha−1 (Ahlberg, 2014). This was due mainly to fewer epicormic branches and was reflected in a larger proportion of class A timber and a smaller proportion of class C timber. It was observed that 10-15 per cent of potential future crop trees selected for a combination of crown vitality and stem quality deteriorated on either or both of these criteria on an annual basis, regardless of thinning intensity (Ahlberg, 2014). This number is consistent with the observed annual transition probability of 83 per cent for remaining healthy among much older retention trees in Estonia (Rosenvald et al., 2015).

Thinning: operational advice
Thinning in stands affected by dieback is essentially a matter of keeping or regaining stand density control. As a general rule thinning should be avoided until control over density has been regained, but severely declining trees with residual value should be salvaged where possible, since these will die anyway.
In young single-species stands, it is advisable to mark and retain 'healthy' trees and thin only among unmarked trees (Thomsen and Skovsgaard, 2012). Although there is still no scientific evidence whether this strategy is beneficial long-term in the presence of the ash dieback epidemic, it is consistent with common perceptions of survival probability, and alternative strategies seem neither better nor worse. Marking ~200 trees ha−1 is consistent with production goals for commercial timber and at the same time allows for a decline and subsequent removal of 25-75 per cent of the pre-selected trees. In mixed stands, other species should be favoured when selecting trees to be retained.
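The margin built into this prescription can be illustrated with simple arithmetic. Assuming, purely as a sketch, a constant annual probability of 0.85-0.90 that a marked tree remains acceptable (the complement of the 10-15 per cent annual deterioration rate reported by Ahlberg, 2014), the expected number of acceptable trees among ~200 marked per hectare declines geometrically; the function names below are our own:

```python
import math

def expected_remaining(marked_per_ha, annual_retention, years):
    """Expected number of marked trees still acceptable after `years`,
    assuming a constant annual retention probability (a simplification)."""
    return marked_per_ha * annual_retention ** years

def years_to_fraction(target_fraction, annual_retention):
    """Years until the expected remaining fraction falls to `target_fraction`."""
    return math.log(target_fraction) / math.log(annual_retention)

# At 15 per cent annual deterioration, roughly 89 of 200 marked trees
# are expected to remain acceptable after five years:
five_year = expected_remaining(200, 0.85, 5)

# The full 75 per cent removal allowance is consumed after roughly
# 8.5 years at 15 per cent annual deterioration, or ~13 years at 10 per cent:
t_pessimistic = years_to_fraction(0.25, 0.85)
t_optimistic = years_to_fraction(0.25, 0.90)
```

Under these illustrative rates, the removal allowance is consumed within roughly a decade at the pessimistic rate, underlining why reassessment at regular intervals is advised.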
An alternative to stand-based ('classical') thinning is to apply crop tree management, in Europe initially developed mainly for beech (Schädelin, 1932, 1942;Bryndum, 1980) and oak (Løvengreen, 1949;Jobling and Pearce, 1977;Ståål, 1986;Kerr, 1996;Attocchi and Skovsgaard, 2015) and subsequently refined and explored for ash (Hein, 2003) and other tree species (Wilhelm and Rieger, 2013). Under this management strategy, vigorous trees of superior stem quality are selected and marked as future crop trees and subsequently subjected to crown release at regular intervals. By experience, the number of pre-selected future crop trees should be consistent with the expected number of final crop trees.
There are indications that under the ash dieback epidemic, dieback may override the possible influence of crop tree management on crown development (Metzler et al., 2013;Queloz, 2016;Heinze et al., 2017). This is similar to the above-mentioned results from experiments on regular thinning in young stands of even-aged ash and probably relates to the influence of site and genetics rather than to management practice per se. However, in the absence of experimental evidence, crop tree management may be advocated for the following reasons: (1) a tree with a larger crown will more easily recover (for ash, dieback tends to be less severe on trees of larger stem diameter (Skovsgaard et al., 2010) or takes longer to damage larger trees (Husson et al., 2012;Marçais et al., 2016;Queloz, 2016;Havrdová et al., 2017)); (2) a tree with a larger crown will have a higher growth rate (for ash, see for example Hein, 2003) and will consequently increase its commercial value at a faster rate; (3) a tree with a larger crown will usually produce more seed, which may be desirable in some forest types, in particular if the tree is tolerant of H. fraxineus; (4) a tree with a larger crown will leave a larger gap for regeneration if it dies, and this in turn will usually result in better (light) conditions for regeneration.
The application of crop tree management should be modified depending on the observed development of dieback and other local conditions. Note in particular that crown release of infected ash trees will most likely not result in crown expansion. Moreover, to conserve as many putatively tolerant trees as possible, ash trees free of dieback symptoms should not be cut in favour of selected crop trees.
Thinning: selection of potential future crop trees

To identify alternative candidates as potential future crop trees for timber production or for in situ or ex situ conservation purposes, spring and late season phenology may be used as the final selection criterion. Ash that flushes (Pliūra and Baliuckas, 2007;Bakys et al., 2013) or senesces (McKinney et al., 2011, 2014) early tends to suffer less from crown dieback. Consequently, such trees should be given preference to remain in the stand when there is a choice of which trees to remove. In some parts of Europe, leaf yellowing (as an indicator of leaf senescence) is rarely observed (Roloff and Pietzarka, 1997), whereas in other locations it is common (J. Clark pers. comm.). In Denmark, trees with early senescence of leaves appear to be more tolerant of H. fraxineus (McKinney et al., 2011, 2014). In Austria and Sweden, ash trees with little crown dieback, no collar lesion and dense foliage until late in the growing season are considered disease tolerant and are, therefore, recommended for retention (Kirisits, 2013;Stener, 2013;Freinschlag, 2014, 2015).

Harvesting in older stands
Harvesting should be delayed as long as possible without jeopardizing wood quality to maximize economic returns, but also to allow tolerant trees to become apparent. These may contribute to natural regeneration of the stand, with seedlings showing a range of tolerance. Due to the genetic component in susceptibility both to crown damage (McKinney et al., 2014) and collar lesions (Muñoz et al., 2016;Chandelier et al., 2016;Havrdová et al., 2017), both types of damage should be considered for selection purposes.
Assessment of mature trees can be difficult and the rate at which the disease will progress is uncertain, so harvesting recommendations for older stands are usually based on rather coarse classifications:
- A tree should be retained when less than 25 per cent of the crown is dead (i.e. more than 75 per cent of full crown foliage remains), as such individuals are likely to have better than average tolerance to dieback.
- A tree with 25-75 per cent crown mortality should be considered for retention or harvest.
- A tree should be harvested when more than 75 per cent of the crown is dead or the crown mainly comprises epicormic regrowth (which may make the tree appear to have a greater proportion of healthy crown than in previous years).
In all cases, trees should be checked for collar lesions and rot. There appears to be no consensus in the literature or among practitioners as to an acceptable degree of collar lesion. A pragmatic approach is to remove trees with any collar lesion, assuming that their condition will deteriorate in the future anyway. Alternatively, ash trees with little crown damage and negligible collar lesion, for example less than 5-10 per cent of the collar circumference, may be temporarily retained and regularly re-inspected. So far, ash bark beetles have not been observed changing behaviour to become a primary pest due to ash dieback or other 'weakening' of ash (Skovsgaard et al., 2010;Metzler et al., 2013;Hauptman et al., 2016). Consequently, sanitation measures for bark beetles are currently considered unimportant. Only live or recently dead trees should be harvested and timber should be removed from the forest as usual to avoid stain or decay. Apothecia of H. fraxineus form mainly on dead leaf rachises, rarely on twigs, and have not been observed on timber or on bark or wood affected by collar symptoms (Gross et al., 2012, 2014;Metzler et al., 2013). However, recent evidence claiming that the asexual spores (conidia) of H. fraxineus may be infectious (Fones et al., 2016) would, if confirmed, represent another pathway of dispersal with potentially far-reaching implications; so far, however, no advice against timber transport has emerged. Cutting out the basal part of a log affected by collar infections will help minimize this risk (Husson et al., 2012;Marçais et al., 2016).
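The coarse crown classification above, together with the pragmatic collar-lesion rule, can be condensed into a small decision sketch. This is an illustration only: the function name and three-way output are our own, and the 10 per cent collar threshold is just one of the options discussed, not a fixed rule.

```python
def harvest_advice(crown_dead_pct, collar_lesion_pct=0.0, mainly_epicormic=False):
    """Coarse per-tree advice for older ash stands, following the
    classification reviewed in the text (illustrative sketch only)."""
    if collar_lesion_pct >= 10:
        # Pragmatic option: remove trees whose collar lesion exceeds
        # ~10 per cent of the collar circumference.
        return "harvest"
    if crown_dead_pct > 75 or mainly_epicormic:
        # More than 75 per cent crown mortality, or a crown consisting
        # mainly of epicormic regrowth, indicates severe decline.
        return "harvest"
    if crown_dead_pct < 25:
        # Likely better than average tolerance to dieback: retain.
        return "retain"
    # 25-75 per cent crown mortality: consider retention or harvest,
    # and re-inspect regularly.
    return "monitor"
```

In practice, the borderline "monitor" class is where the re-inspection intervals suggested in the text matter most.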

Regeneration
Ash generally produces much seed and regenerates abundantly on many site types (Roloff and Pietzarka, 1997;Dobrowolska et al., 2011;Thomas, 2016). This may help sustain a genetically diverse fraction of ash in the forest through processes of natural adaptation. Clearcutting of diseased stands will generally result in scarce natural regeneration of ash, whereas high densities of saplings are often observed under a canopy of shelter trees (Pušpure et al., 2017). Due to reduced competitiveness and mortality of diseased ash saplings, regeneration consisting mainly of ash will gradually change into a mix of tree species or be at risk of failing. However, there are indications that moderately diseased saplings may stay competitive in regeneration (Enderle et al., 2017).
In order to promote natural adaptation to ash dieback there should be a selection for healthy seed trees, if possible. A regular distribution of these is preferred to ensure an even dispersal of seed and regeneration over the area (Matthews, 1989), but for ash this may be less important due to its wind-dispersed seeds. Previously exposed edge trees tend to suffer less from dieback and sudden exposure than retention trees in the former stand interior (Rosenvald et al., 2015). In forestry practice, it will often be difficult to consider this information when selecting seed or retention trees. A mix of species in the regeneration is recommended rather than pure ash. The admixture in the new stand may be enriched by planting tree species that are already present or by introducing new species.
Sowing of harvested ash seeds is not recommended for regeneration, since highly susceptible trees may still be dominant in natural pollination. Disease tolerant seed stock may be available after some years from seed orchards consisting of genotypes of proven tolerance. Based on our current understanding of ash dieback, nursery stock of tolerant genotypes could be planted for experimental or demonstration purposes. In such cases, the establishment should be recorded in a report to ensure the validity of future interpretations.
Autumn planting has traditionally been recommended for ash because the roots of ash keep growing during mild winters (Ladefoged, 1939), leading to better and faster establishment of the seedlings. Unfortunately, this is no longer advisable as seedlings may already be latently infected in the autumn, but still appear externally disease-free (Černý et al., 2016;Thomsen et al., 2017b). Therefore, planting should now be done in the spring when it is easiest to detect symptoms of ash dieback on the nursery stock.
Although fungicide treatment of nursery transplants may produce healthy trees in the short-term, a large proportion of these may turn out to be highly susceptible when transferred to the forest (Metzler et al., 2013). In years ahead, planting stock tolerant of H. fraxineus may become available, for example based on propagation by tissue culture or cuttings from tolerant ash trees in clonal seed orchards.
Finally, it is generally observed that coppice regrowth often becomes severely infected (see, for example, Lygis et al., 2014). Consequently, regeneration of ash should not rely on coppice practices.

Other ash species
There are two other naturally occurring species of Fraxinus in Europe, F. angustifolia Vahl (narrow-leaved ash) and F. ornus L. (manna ash). They both co-exist with F. excelsior in some of the southern part of its natural range. Fraxinus angustifolia is closely related to F. excelsior and has such a similar appearance that they are sometimes difficult to distinguish, especially when growing in mixed stands. Moreover, they easily hybridize (FRAXIGEN, 2005).
Although F. angustifolia is slightly less damaged by H. fraxineus than F. excelsior (Schwanda and Kirisits, 2016), it is still highly susceptible (Hauptman et al., 2016). Fraxinus ornus can show symptoms on leaves but not on woody parts and is a more tolerant host than F. excelsior (Kirisits and Schwanda, 2015). However, it is not as valuable as a timber species and at most sites outside of southern Europe cannot replace F. excelsior.
Asian Fraxinus species planted in Europe generally show no or markedly less visible crown damage compared with F. excelsior (Cleary et al., 2016;Nielsen et al., 2017). This holds true for example for F. mandshurica Rupr. (Manchurian ash), on which H. fraxineus is either a benign associate (Cleary et al., 2016) or causes relatively little damage (Drenkhan and Hanso, 2010;Drenkhan et al., 2014, 2017), but this species can be sensitive to winter as well as spring frost (Krüssmann, 1977;Wang and Dai, 1997) and occurs only as a small tree in Europe (Drenkhan et al., 2014).
The North American species F. pennsylvanica Marshall (green ash) and F. americana L. (white ash) are also susceptible to H. fraxineus to some degree, though less so than F. excelsior (Drenkhan and Hanso, 2010;Gross et al., 2014;McKinney et al., 2014;Nielsen et al., 2017), and this puts them at risk for use in forestry in Europe. Previous experience with F. pennsylvanica in floodplain areas did not fulfil expectations in terms of growth potential and wood quality (in comparison with F. excelsior), and this species is viewed as invasive in areas under nature conservation (Kirisits et al., 2009). Several other North American species, including F. nigra Marshall (black ash) and F. quadrangulata Michx. (blue ash), show high susceptibility to natural infection by H. fraxineus, while others may be susceptible to leaf infection, but not shoot infection (Nielsen et al., 2017).
In summary, exotic Fraxinus species are poor alternatives to replace F. excelsior in forest settings in Europe. More information documenting species performance of exotic Fraxinus species will help establish their potential to serve as a viable alternative option, for example, in urban landscapes.

Discussion
Ash dieback is an example of a serious disease caused by an introduced pathogen that reduces the commercial value of ash. It emphasizes the negative consequences of pathogen introductions for forestry and forest conservation. Such biological invasions have increased at an alarming rate in recent decades (Santini et al., 2013;Freer-Smith and Webber, 2015), and there is no indication that this trend will slow down in the future. Greater control efforts could obviously be made to prevent the introduction of novel pests and pathogens. However, the extent of global transport and anthropogenic movement of plants and plant products is such that the efficacy of any control measures is questionable.
Unfortunately, the statistics on actual volume losses attributable to ash dieback are unreliable. One reason is that tolerant or even completely healthy ash trees have been cut, based on either somewhat irrational criteria (overestimating the magnitude of the problem) or economic decision-making at stand level. Another reason is that no inventory data until now have differentiated the severity of ash dieback. Interestingly, however, there are clear indications in some countries of increases in salvage cuttings some years after the onset of the epidemic (Sweden: Cleary et al., 2017;Denmark: McKinney et al., 2014;Germany: Enderle et al., 2017). Obviously, this is due to significant health deterioration and tree mortality.
The ultimate effects of ash dieback will undoubtedly be far reaching, both ecologically (Pautasso et al., 2013;Mitchell et al., 2014) and economically (Petucco et al., 2015). However, the significance of these impacts may perhaps not be as great as originally feared. The high level of genetic diversity found within ash means that some individuals have a tolerance which could enable ash to 'hold on' while breeding programmes are instigated to ensure we retain ash as a viable timber and landscape species.
Silviculture will need to become increasingly responsive in order to alleviate the consequences of emerging infectious tree diseases. Based on our current state of understanding, very little can be done in silviculture to prevent the spread and reduce the impact of H. fraxineus. The advice compiled and reviewed in this paper may serve as a guide in alleviating some consequences of ash dieback.

Conclusion
In summary, the main advice for silviculture of F. excelsior in response to dieback caused by H. fraxineus is to retain apparently disease tolerant ash trees wherever this is reasonable within the context of management objectives and overall forest structure. Moreover, lessons learned so far clearly indicate that ash is least vulnerable within certain limits of site conditions. This emphasizes the overall importance of site-adapted tree species and site-specific silviculture.

The proposed procedure should be scaled and modified according to the quantity of ash on the estate and depending on the magnitude of the problem. On estates with much ash, it is advisable to employ the procedure for a random sample of stands to establish an objective basis for an estate-wide assessment. Alternatively, it may be used in stands of particular value (financial, ecological or otherwise). Reassessments should initially be conducted annually and their frequency should then be revised depending on disease progression.

Crown dieback and defoliation
In damage assessments, care must be taken not to mistake crown dieback for premature leaf shedding. The optimal time for rating crown symptoms is during summer or, more specifically, after trees have fully flushed (June) and before premature leaf shedding (which often occurs in August and subsequently increases cumulatively).
For operational purposes, the rating of crown symptoms can be based simply on the completeness of foliage in the primary crown or, alternatively, on crown transparency (see, for example, Innes, 1990;Müller and Stierlin, 1990). We recommend a scale in steps of 10 per cent, with 100 per cent for a crown with complete foliage (Figure 2). For faster assessment, or in young stands with small crowns, a rougher scale in steps of 25 per cent (i.e. 0, 25, 50, 75 and 100 per cent) may be used. The use of a continuous numeric scale is preferable to discrete classes (e.g. 0, 1, 2, 3 and 4) because quantitative calculations are more easily interpreted. During crown and foliage rating, one will be looking specifically for dead crown branches and branch tips as these indicate less than full crown development (depending somewhat on the competitive status of the individual tree and the proximity of neighbour trees).
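For recording, the visually estimated foliage percentage simply snaps to the chosen scale. A minimal helper sketch (hypothetical; the name and the clamping to 0-100 are our own) might be:

```python
def crown_rating(foliage_pct, step=10):
    """Round a visually estimated foliage percentage (0-100) to the
    recording scale: 10 per cent steps by default, or 25 per cent
    steps for faster assessments. Clamps out-of-range estimates."""
    rating = round(foliage_pct / step) * step
    return min(100, max(0, rating))
```

For example, an estimate of 87 per cent records as 90 on the default scale and as 75 on the coarse 25 per cent scale.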
Prior to each full assessment, the observer should assess and reassess some trees, for example 10, to ensure proper 'calibration'. 'Calibration' of different observers, by joint rating of a number of trees and aligning the scorings, is also important. Obviously, a rating of foliage completeness or crown transparency based on visual assessment will be only approximate and associated with large variation, as will any similar method of crown condition assessment.
Finally, the occurrence of many epicormic branches and replacement shoots almost always indicates severe dieback. Epicormic branches on the stem are a separate indicator of wood quality and of its deterioration. Their number may therefore be estimated by 'rough' counting (for example, 1, 2, 3, 4, 5, 10, 25 or >25) and noted separately, allowing the temporal development to be followed.
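The rough-counting classes above can be applied consistently with a small helper; the function name and class boundaries between the listed values are our interpretation, not a prescribed standard.

```python
# Illustrative sketch: snapping an exact count of epicormic branches to the
# 'rough' classes suggested in the text (1, 2, 3, 4, 5, 10, 25 or >25).

def rough_count(n: int) -> str:
    """Return the rough-count class for n epicormic branches."""
    if n <= 5:
        return str(n)  # small counts are recorded exactly
    if n <= 10:
        return "10"
    if n <= 25:
        return "25"
    return ">25"

print([rough_count(n) for n in (1, 3, 7, 25, 40)])
```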

Collar lesions
Collar symptoms can be assessed throughout the year. In most operational forestry contexts, it may be sufficient to record the presence or absence of collar lesions on a per-tree basis, so that the proportion of affected ash trees at stand level can be estimated. In addition, separate recording of individual Armillaria infections may be relevant.
Identification of collar lesions is relatively easy on young trees, where they appear as red-brown discolouration and, later, depressions in the still smooth and thin bark (Figure 3). On older trees with rough and thick bark, recognition of collar symptoms is more difficult. In the case of older lesions, the bark cracks and loosens from the stem.
During inspection for Armillaria one should look for the usual diagnostic features of white mycelial fans and black melanised rhizomorphs under the bark.
Attack by Armillaria is often confined to roots, where it may lead to patches of dead bark and decayed wood. Such lesions can be found on upper roots near the stem by scraping away the upper soil layer or moss covering surface roots and the root collar (this will also help in finding collar or surface root infections by H. fraxineus). For the purpose of an operational inventory, collar and root lesions may be noted as either absent or present. On mature trees, collar and root lesions and rot are often only discovered after felling. In these cases, wood discolouration or wood decay on stumps are conspicuous indicators of fungal attack (Figure 3).

Figure 2 Photograph showing different degrees of ash dieback. Numbers indicate the estimated proportion of foliage remaining for each tree (in this example, 1-100 per cent). Due to the nature of the visual assessment process, such ratings are approximate and associated with large variation, but they indicate the level of dieback.

Figure 3 Photographs showing different stages of collar lesions and bark and wood discolouration. Top row (symptoms visible on outer bark): (A) early-stage necrosis and red-brown discolouration of stem bark, (B) mid-stage collar lesion with a depression and one or few cracks in the bark, (C) late-stage collar lesion with a depression and many cracks in the bark. Middle row (symptoms visible below the bark): (D) early-stage collar necrosis caused mainly by Hymenoscyphus fraxineus, (E) mid-stage collar necrosis due to H. fraxineus and Armillaria sp., with white mycelium of Armillaria sp., (F) advanced collar rot and discolouration in the wood, with black zone lines due to Armillaria sp. Bottom row (symptoms on stump and stem cross-section): (G) early-stage brown discolouration of wood due to H. fraxineus (in this case entering through an epicormic branch), (H) mid-stage discolouration due to H. fraxineus (entering at the root collar) and Armillaria sp., (I) late-stage brown discolouration of the stump due to H. fraxineus and Armillaria sp., with black zone lines due to Armillaria sp. in progressively decaying parts of the wood.
If detailed information is needed, the extent of lesions at the stem base can be assessed numerically. We recommend a continuous scale reflecting the percentage (0-100 per cent) of stem circumference affected by rot fungi or H. fraxineus at the root collar (operationally defined as, for example, the lower 0.5 or 1 m of the stem). Additionally, the stage of development/severity of symptoms on each tree (e.g. no damage, discolouration only, depression in bark, depression with one or few cracks in bark, depression in bark with many cracks) and the possible occurrence of Armillaria (absent, present or advanced) may be noted.
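The three attributes recommended above can be combined into one per-tree record. The sketch below is a possible data structure only; the field names and validation are ours, not part of the assessment protocol.

```python
# Illustrative per-tree collar-lesion record combining the three attributes
# suggested in the text. Field names and labels are hypothetical.
from dataclasses import dataclass

STAGES = ("no damage", "discolouration only", "depression in bark",
          "depression with one or few cracks", "depression with many cracks")
ARMILLARIA = ("absent", "present", "advanced")

@dataclass
class CollarRecord:
    tree_id: int
    circumference_pct: float  # 0-100 per cent of stem circumference affected
    stage: str                # severity class, one of STAGES
    armillaria: str           # one of ARMILLARIA

    def __post_init__(self):
        if not 0 <= self.circumference_pct <= 100:
            raise ValueError("circumference_pct must be 0-100")
        if self.stage not in STAGES:
            raise ValueError(f"unknown stage: {self.stage}")
        if self.armillaria not in ARMILLARIA:
            raise ValueError(f"unknown Armillaria status: {self.armillaria}")

record = CollarRecord(504, 35.0, "depression in bark", "present")
```

Validating the class labels at entry keeps repeat assessments comparable over time.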

Stem quality
For stands of commercial value, stem quality is a critical indicator of the potential of each individual tree to contribute to the management objective. Potential future crop trees in such stands may be rated for the usual stem quality indicators (dbh, location of lower fork, stem straightness and lean) as well as for the number of epicormic branches on the lower stem (commercial log length).
Epicormic branches on the stem deserve special attention, as they are known entry points for wood discolouration caused by H. fraxineus. Collar lesions are also important indicators of stem quality, although stem discolouration or rot in ash rarely extends more than 1 m up into the commercial log.

Spatial location of 'good' trees
The best trees for timber production, gene conservation or other objectives should be clearly marked, as these are the potential future crop trees or phenotypical plus trees. This reduces both the risk of unintended felling and the costs of future reassessments. When marking potential crop trees, spatial distribution needs to be considered as well as tree health and stem quality. A rough sketch map can be very helpful (Figure 4).

Figure 4 Example of a rough map of the spatial location of potential future crop trees in a young stand of ash affected by dieback. The map was drafted in the field using only pacing. Each square is 10 m × 10 m. The thick line indicates stand borders. Numbers indicate the ID numbers of potential future crop trees. Tree number 504 was marked for special attention.