Journal articles on the topic 'Crop residue management Australia'

1

Blair, Graeme J., Les Chapman, A. M. Whitbread, B. Ball-Coelho, P. Larsen, and H. Tiessen. "Soil carbon changes resulting from sugarcane trash management at two locations in Queensland, Australia, and in North-East Brazil." Soil Research 36, no. 6 (1998): 873. http://dx.doi.org/10.1071/s98021.

Abstract:
Sugarcane cropping produces a large amount of crop residues, which offers considerable scope for residue management. Soil samples, collected from 2 long-term experiments in Australia and an experiment in Pernambuco State, Brazil, were analysed for total carbon (CT) and for labile carbon (CL) by oxidation with 333 mM KMnO4. At the 2 locations in Australia, CT and CL concentrations were lower in the surface layer (0-1 cm) of the cropped soil compared with a nearby uncropped reference soil. Burning resulted in a greater loss in CT and CL at a depth of 0-1 cm than green cane trash management. At one of the sites, sugarcane cropping resulted in a decline in CT relative to the reference in the green trash management treatment but an increase in CL. In Brazil, trash management from one cane crop did not change CT over a 12-month period but green cane trash return increased CL. Sustainable sugarcane cropping systems must include crop residue return without burning in order to maintain an active C cycle in the system to drive nutrient cycles.
2

Unkovich, Murray, Jeff Baldock, and Steve Marvanek. "Which crops should be included in a carbon accounting system for Australian agriculture?" Crop and Pasture Science 60, no. 7 (2009): 617. http://dx.doi.org/10.1071/cp08428.

Abstract:
Dryland agriculture is both a potential source and potential sink for CO2 and other greenhouse gases. Many carbon accounting systems apply simple emissions factors to production units to estimate greenhouse gas (GHG) fluxes. However, in Australia, substantial variation in climate, soils, and management across >20 Mha of field crop sowings and >30 Mha of sown pastures in the intensive land use zone, provides substantial challenges for a national carbon accounting system, and simple emission factors are unlikely to apply across the region. In Australia a model framework has been developed that requires estimates of crop dry matter production and harvested yield as the first step to obtain carbon (residue) inputs. We use Australian Bureau of Statistics data to identify which crops would need to be included in such a carbon accounting system. Wheat, barley, lupin, and canola accounted for >80% of field crop sowings in Australia in 2006, and a total of 22 crops account for >99% of the sowing area in all States. In some States, only four or six crops can account for 99% of the cropping area. We provide a ranking of these crops for Australia and for each Australian State as a focus for the establishment of a comprehensive carbon accounting framework. Horticultural crops, although diverse, are less important in terms of total area and thus C balances for generic viticulture, vegetables, and orchard fruit crops should suffice. The dataset of crop areas presented here is the most comprehensive account of crop sowings presented in the literature and provides a useful resource for those interested in Australian agriculture. The field crop rankings presented represent only the area of crop sowings and should not be taken as rankings of importance in terms of the magnitude of all GHG fluxes. 
This awaits a more detailed analysis of climate, soils, and management practices across each of the regions where the crops are grown and their relationships to CO2, nitrous oxide and methane fluxes. For pastures, there is a need for more detailed, up to date, spatially explicit information on the predominant sown pasture types across the Australian cropping belt before C balances for these can be more reliably modelled at the desired spatial scale.
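The ranking exercise described in this abstract — choose the fewest crops whose summed sowing area reaches a coverage threshold — is a simple cumulative-share computation. A minimal sketch, with hypothetical sowing areas rather than the ABS figures used in the paper:

```python
# Pick the smallest set of crops (largest first) whose cumulative sown
# area reaches a coverage threshold, as in the paper's 99%-of-area cut.

def crops_for_coverage(areas, threshold=0.99):
    """Return (crops, cumulative share) once share >= threshold."""
    total = sum(areas.values())
    chosen, covered = [], 0.0
    for crop, area in sorted(areas.items(), key=lambda kv: -kv[1]):
        chosen.append(crop)
        covered += area / total
        if covered >= threshold:
            break
    return chosen, covered

# Hypothetical sowing areas (Mha) -- illustrative values only.
areas = {"wheat": 12.0, "barley": 4.0, "lupin": 1.0, "canola": 1.2,
         "oats": 0.8, "sorghum": 0.7, "chickpea": 0.2, "lentil": 0.05}
crops, share = crops_for_coverage(areas, threshold=0.99)
```

With these toy numbers, seven of the eight crops are needed to pass 99% of the total area; the same logic applied State by State reproduces the paper's observation that a handful of crops can suffice in some States.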
3

Bailey, P., and J. Comery. "Management of Heliothis punctigera on field peas in south-eastern Australia." Australian Journal of Experimental Agriculture 27, no. 3 (1987): 439. http://dx.doi.org/10.1071/ea9870439.

Abstract:
Cypermethrin was found to be an effective substitute for DDT in controlling Heliothis punctigera in field peas. A single spray of cypermethrin prevented significant damage by larvae to field peas in trials in South Australia and Victoria over 3 seasons. Endosulfan was not as effective as cypermethrin. Bioassays of leaf discs dipped in cypermethrin showed that residues of 0.1 mg a.i. kg-1 caused 50% feeding inhibition, 0.43 mg a.i. kg-1 caused 90% feeding inhibition and concentrations above this caused increasing acute mortality to fourth instar larvae. Residues from field pea crops sprayed at 40 g a.i. fell to 0.43 mg a.i. kg-1 2-3 weeks after application. Two to 3 weeks protection is probably the maximum time for residual activity to be useful because the crop outgrows the sprayed foliage. To ensure that larvae are exposed to the maximum area of treated surface, the spray should be timed to coincide with the appearance of larvae in the crop, rather than spraying at a particular growth stage of the crop.
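The residual-protection window in this abstract is a threshold-decay problem: protection lasts while residues remain above the 90%-feeding-inhibition concentration (0.43 mg a.i. kg-1). A sketch assuming simple first-order decay; the initial deposit and half-life below are hypothetical, not values measured in the trials:

```python
import math

# Time for a first-order-decaying residue to fall to a threshold:
#   C(t) = C0 * exp(-k t),  k = ln(2) / half_life
#   t_threshold = ln(C0 / threshold) / k

def days_above_threshold(c0_mg_kg, threshold_mg_kg, half_life_days):
    """Days until residue concentration decays to the threshold."""
    k = math.log(2) / half_life_days
    return math.log(c0_mg_kg / threshold_mg_kg) / k

# Hypothetical 5 mg/kg initial deposit with a 5-day half-life, against
# the 0.43 mg/kg EC90 for feeding inhibition reported above.
t = days_above_threshold(5.0, 0.43, 5.0)   # ~17.7 days, i.e. 2-3 weeks
```

Under these assumed parameters the residue stays effective for roughly two and a half weeks, consistent with the 2-3 week window reported; in practice crop growth dilutes the treated surface as well, which this sketch ignores.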
4

Kirkegaard, J. A., S. J. Sprague, P. J. Hamblin, J. M. Graham, and J. M. Lilley. "Refining crop and livestock management for dual-purpose spring canola (Brassica napus)." Crop and Pasture Science 63, no. 5 (2012): 429. http://dx.doi.org/10.1071/cp12163.

Abstract:
Dual-purpose canola (Brassica napus) describes the use of a canola crop for grazed winter forage before seed production, a practice that has only recently been developed in southern Australia. Long-season winter canola has been grazed without yield penalty in higher rainfall zones of Australia (>650 mm) and the USA, but the potential areas are small. The feasibility to graze spring canola varieties across wider areas of the medium-rainfall (450–650 mm), mixed-farming zone in Australia is therefore of interest. We conducted a series of six field experiments involving a range of canola cultivars and grazing management and agronomy systems from 2007 to 2009 at Young in southern New South Wales, Australia, to determine the feasibility of and refine the principles for grazing dual-purpose spring canola without significant yield penalty. Mid-season, Australian spring canola cultivars including conventional and hybrid varieties representing a range of herbicide tolerance (triazine-tolerant, Clearfield®, and Roundup Ready®) were sown from 16 April to 12 May and grazed with sheep at a range of growth stages from early vegetative (June) to mid-flowering (September). In general, early-sown crops (sown mid-April) provided significant grazing (~800 dry sheep equivalent grazing days/ha) in winter before bud elongation, and recovered with no impact on grain yield or oil content. As previously reported, yield was significantly reduced (by up to 1 t/ha) when grazing occurred after buds had elongated (late July), due to the delayed flowering associated with bud removal by sheep and insufficient time for biomass and yield recovery. However, yield was also reduced in crops grazed before bud elongation if insufficient residual biomass remained (<1.0 t/ha for late July lock-up) to facilitate crop recovery even when there was little delay in crop development. 
We suggest that refinements to the existing ‘phenology-based’ grazing recommendations would assist to avoid yield loss in grazed spring varieties, and propose three grazing stages (safe, sensitive, and unsafe) that integrate the impacts of time, crop growth stage, residual biomass, and seasonal conditions to avoid yield loss under different circumstances. Such refinements to reduce the likelihood of grazing-induced yield loss would provide more confidence for mixed farmers to maximise the benefits from dual-purpose canola in different environments. Based on the outcomes of these experiments, dual-purpose spring canola is likely to have significant potential for wider application in other mixed farming zones, with similar region-specific refinements based on the principles reported here.
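The proposed safe/sensitive/unsafe grazing stages integrate growth stage and residual biomass. A toy decision rule built only from the two thresholds reported above (buds elongated; <1.0 t/ha residual biomass at lock-up); the actual recommendations also weigh time and seasonal conditions:

```python
# Toy classifier for grazing risk in dual-purpose spring canola, using
# only the two thresholds given in the abstract. Real recommendations
# also integrate timing and seasonal conditions.

def grazing_risk(buds_elongated, residual_biomass_t_ha):
    """Classify yield risk at lock-up: 'safe', 'sensitive', or 'unsafe'."""
    if buds_elongated:
        return "unsafe"      # delayed flowering; up to 1 t/ha yield loss
    if residual_biomass_t_ha < 1.0:
        return "sensitive"   # too little biomass left for full recovery
    return "safe"

r1 = grazing_risk(False, 1.5)   # ample biomass before bud elongation
r2 = grazing_risk(False, 0.6)   # early, but grazed too hard
r3 = grazing_risk(True, 1.5)    # grazed after buds elongated
```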
5

Hunt, J. R., C. Browne, T. M. McBeath, K. Verburg, S. Craig, and A. M. Whitbread. "Summer fallow weed control and residue management impacts on winter crop yield through soil water and N accumulation in a winter-dominant, low rainfall region of southern Australia." Crop and Pasture Science 64, no. 9 (2013): 922. http://dx.doi.org/10.1071/cp13237.

Abstract:
The majority of rain used by winter grain crops in the Mallee region of Victoria, Australia, falls during the cooler months of the year (April–October). However, rain falling during the summer fallow period (November–March) and stored as soil moisture contributes to grain yield. Strategies to better capture and store summer fallow rain include (i) retention of crop residues on the soil surface to improve water infiltration and reduce evaporation; and (ii) chemical or mechanical control of summer fallow weeds to reduce transpiration. Despite the widespread adoption of no-till farming systems in the region, few published studies have considered the benefits of residue management during the summer fallow relative to weed control, and none quantify the impacts or identify the mechanisms by which summer fallow weeds influence subsequent crop yield. Over 3 years (2009–11), identical experiments on adjacent sand and clay soil types at Hopetoun in the southern Mallee were conducted to quantify the effect of residue management (standing, removed, or slashed) and summer fallow weed control (± chemical control) compared with cultivation on soil water and nitrogen (N) accumulation and subsequent crop yield. The presence of residue (2.4–5.8 t/ha) had no effect on soil water accumulation and a small negative effect on grain yield on the clay soil in 2011. Controlling summer weeds (Heliotropium europaeum and volunteer crop species) increased soil water accumulation (mean 45 mm) and mineral N (mean 45 kg/ha) before sowing on both soil types in 2 years of the experiment with significant amounts of summer fallow rain (2010 and 2011). Control of summer weeds increased grain yield of canola by 0.6 t/ha in 2010 and wheat by 1.4 t/ha in 2011.
Using the data from these experiments to parameterise the APSIM model, simulation of selected treatments using historical climate data (1958–2011) showed that an extra 40 mm of stored soil water resulted in an average additional 0.4 t/ha yield, most of which was achieved in dry growing seasons. An additional 40 kg/ha N increased yield only in wetter growing seasons (mean 0.4 t/ha on both soil types). The combination of extra water and N that was found experimentally to result from control of summer fallow weeds increased subsequent crop yield in all season types (mean 0.7 t/ha on sand, 0.9 t/ha on clay). The co-limitation of yield by water and N in the Mallee environment means that yield increases due to summer weed control (and thus returns on investment) are very reliable.
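The water–N co-limitation described above — extra water pays off in dry seasons, extra N in wet seasons, and the combination in all season types — can be illustrated with a toy Liebig-style minimum model. The function and coefficients below are hypothetical illustrations, not the APSIM parameterisation used in the study:

```python
# Toy Liebig model: yield is the minimum of a water-limited and an
# N-limited potential. Coefficients are illustrative only.

def grain_yield(water_mm, n_kg, wue=0.010, nue=0.025):
    """Yield (t/ha) as the minimum of water- and N-limited potentials."""
    return min(wue * water_mm, nue * n_kg)

base_dry = grain_yield(150, 80)   # dry season: water limits (1.5 t/ha)
base_wet = grain_yield(300, 80)   # wet season: N limits (2.0 t/ha)

# Extra N alone pays off only in the wet season; extra water only in the dry.
gain_n_dry = grain_yield(150, 120) - base_dry   # ~0.0
gain_w_dry = grain_yield(190, 80) - base_dry    # ~0.4
gain_n_wet = grain_yield(300, 120) - base_wet   # positive
gain_w_wet = grain_yield(340, 80) - base_wet    # ~0.0

# Water + N together lift yield in both season types, as summer weed
# control (which supplied both) did in the reported simulations.
gain_both_dry = grain_yield(190, 120) - base_dry
gain_both_wet = grain_yield(340, 120) - base_wet
```

Because each resource is only yield-limiting in some seasons while their combination helps in all, a treatment supplying both (like summer weed control) gives the most reliable return, which is the paper's central point.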
6

Anderson, W. K., M. A. Hamza, D. L. Sharma, M. F. D'Antuono, F. C. Hoyle, N. Hill, B. J. Shackley, M. Amjad, and C. Zaicou-Kunesch. "The role of management in yield improvement of the wheat crop—a review with special emphasis on Western Australia." Australian Journal of Agricultural Research 56, no. 11 (2005): 1137. http://dx.doi.org/10.1071/ar05077.

Abstract:
Modern bread wheat (Triticum aestivum) has been well adapted for survival and production in water-limited environments since it was first domesticated in the Mediterranean basin at least 8000 years ago. Adaptation to various environments has been assisted through selection and cross-breeding for traits that contribute to high and stable yield since that time. Improvements in crop management aimed at improving yield and grain quality probably developed more slowly but the rate of change has accelerated in recent decades. Many studies have shown that the contribution to increased yield from improved management has been about double that from breeding. Both processes have proceeded in parallel, although possibly at different rates in some periods, and positive interactions between breeding and management have been responsible for greater improvements than by either process alone. In southern Australia, management of the wheat crop has focused on improvement of yield and grain quality over the last century. Adaptation has come to be equated with profitability and, recently, with long-term economic and biological viability of the production system. Early emphases on water conservation through the use of bare fallow, crop nutrition through the use of fertilisers, crop rotation with legumes, and mechanisation, have been replaced by, or supplemented with, extensive use of herbicides for weed management, reduced tillage, earlier sowing, retention of crop residues, and the use of ‘break’ crops, largely for management of root diseases. Yields from rainfed wheat crops in Western Australia have doubled since the late 1980s and water-use efficiency has also doubled. The percentage of the crop in Western Australia that qualifies for premium payments for quality has increased 3–4 fold since 1990. 
Both these trends have been underpinned by the gradual elimination or management of the factors that have been identified as limiting grain yield, grain quality, or long-term viability of the cropping system.
7

Jacob, Helen Spafford, David M. Minkey, Robert S. Gallagher, and Catherine P. Borger. "Variation in postdispersal weed seed predation in a crop field." Weed Science 54, no. 1 (February 2006): 148–55. http://dx.doi.org/10.1614/ws-05-075r.1.

Abstract:
Postdispersal weed seed predation by animals during the summer fallow period may lead to a reduction in the number of weeds that grow in the following winter cropping season. In this study, we investigated the patterns of weed seed removal, the influence of crop residue cover on seed removal, the types of granivores present and their seed preferences in a 16-ha postharvest cropping field in Western Australia during the summer months over 2 yr. Seed removal from caches was extremely variable (from 0 to 100%). Removal rates were generally highest along the edges of the field near bordering vegetation and lowest in the center of the field and within the bordering vegetation. However, there were many deviations from this general pattern. There was no change in rates of predation with different levels of residue cover. Ants or other small invertebrates were found to remove the most seeds. However, seed removal by other animals, such as rodents, was also evident. Annual ryegrass seeds were preferred over wild oat seeds, followed by wild radish pod segments. Seed harvesting was lowest in late January, peaked in February, and decreased in March. Results from this study suggest seed harvesters could reduce the number of surface seeds in the field, reducing the weed seed bank. Management options that increase the activity of the seed harvesters may lead to less variability in seed predation and could, therefore, be incorporated into an integrated weed management program.
8

Lyon, Drew J., David R. Huggins, and John F. Spring. "Windrow Burning Eliminates Italian Ryegrass (Lolium perenne ssp. multiflorum) Seed Viability." Weed Technology 30, no. 1 (March 2016): 279–83. http://dx.doi.org/10.1614/wt-d-15-00118.1.

Abstract:
Windrow burning is one of several harvest weed seed control strategies that have been developed and evaluated in Australia to address the widespread evolution of multiple herbicide resistance in annual weeds. Herbicide-resistant Italian ryegrass populations are common in the Palouse region of eastern Washington and northern Idaho. Field and greenhouse studies were conducted to evaluate the effects of burning standing stubble and narrow windrows on the survival of Italian ryegrass seed on the soil surface and to determine the amount of crop residue remaining after both practices. Italian ryegrass emergence was 63, 48, and 1% for the nonburned check, burned standing stubble, and burned windrow treatments, respectively. Crop-residue dry weights were 9.94, 5.69, and 5.79 Mg ha−1 for these same treatments. Windrow burning can be an effective tactic in an integrated weed management strategy for Italian ryegrass control in the Palouse region of eastern Washington and northern Idaho.
9

Mahajan, Gulshan, Amar Matloob, Michael Walsh, and Bhagirath S. Chauhan. "Germination Ecology of Two Australian Populations of African turnipweed (Sisymbrium thellungii)." Weed Science 66, no. 6 (September 14, 2018): 752–57. http://dx.doi.org/10.1017/wsc.2018.55.

Abstract:
African turnipweed (Sisymbrium thellungii O. E. Schulz) is an emerging problematic broadleaf weed of the northern grain region of Australia. Laboratory experiments were conducted to evaluate the effects of temperature, light, salinity, pH, seed burial depth, and the amount of wheat crop residue on germination and emergence of two Australian S. thellungii weed populations (population C, cropped area; population F, fence line). Both populations behaved similarly across different environmental conditions, except in the residue study. Although the seeds of both populations of S. thellungii could germinate under complete darkness, germination was best (~95%) under light/dark conditions at the 20/10 C temperature regime. Both populations of S. thellungii germinated over a wide range of day/night temperatures (15/5, 20/10, 25/15, and 30/20 C). Osmotic stress had negative effects on germination, with 54% of seeds (averaged over populations) able to germinate at −0.1 MPa. Complete germination inhibition for both populations was observed at −0.8 MPa osmotic potential. Both populations germinated at sodium chloride (NaCl) concentrations ranging from 50 to 100 mM, beyond which germination was completely inhibited. There was a substantial reduction in seed germination, 32% (averaged over populations), under highly acidic conditions (pH 4.0) compared with the control (water: pH 6.4). Seed germination of both populations on the soil surface was 77%, and no seedlings emerged from a burial depth of 1 cm. The addition of 6 Mg ha−1 of wheat (Triticum aestivum L.) residue reduced the emergence of the C and F populations of S. thellungii by 75% and 64%, respectively, compared with the control (no residue). Information gathered from this study provides a better understanding of the factors favorable for germination and emergence of S. thellungii, which will aid in developing management strategies in winter crops, especially wheat, barley (Hordeum vulgare L.), and chickpea (Cicer arietinum L.).
10

Robertson, Fiona, Roger Armstrong, Debra Partington, Roger Perris, Ivanah Oliver, Colin Aumann, Doug Crawford, and David Rees. "Effect of cropping practices on soil organic carbon: evidence from long-term field experiments in Victoria, Australia." Soil Research 53, no. 6 (2015): 636. http://dx.doi.org/10.1071/sr14227.

Abstract:
Despite considerable research, predicting how soil organic carbon (SOC) in grain production systems will respond to conservation management practices, such as reduced tillage, residue retention and alternative rotations, remains difficult because of the slowness of change and apparent site specificity of the effects. We compared SOC stocks (equivalent soil mass to ~0–0.3 m depth) under various tillage, residue management and rotation treatments in three long-term (12-, 28- and 94-year-old) field experiments in two contrasting environments (Mallee and Wimmera regions). Our hypotheses were that SOC stocks are increased by: (1) minimum tillage rather than traditional tillage; (2) continuous cropping, rather than crop–fallow rotations; and (3) phases of crop or pasture legumes in rotations, relative to continuous cropping with cereals. We found that zero tillage and stubble retention increased SOC in some circumstances (by up to 1.5 Mg C ha–1, or 8%) but not in others. Inclusion of bare fallow in rotations reduced SOC (by 1.4–2.4 Mg C ha–1, or 8–12%) compared with continuous cropping. Including a pulse crop (field pea, where the grain was harvested) in rotations also increased SOC in some instances (by ~6–8 Mg C ha–1, or 29–35%) but not in others. Similarly, leguminous pasture (medic or lucerne) phases in rotations either increased SOC (by 3.5 Mg C ha–1, or 21%) or had no significant effect compared with continuous wheat. Inclusion of a vetch green manure or unfertilised oat pasture in the rotation did not significantly increase SOC compared with continuous wheat. The responses in SOC to these management treatments were likely to be due, in part, to differences in nitrogen and water availability (and their effects on carbon inputs and decomposition) and, in part, to other, unidentified, interactions. 
We conclude that the management practices examined in the present study may not reliably increase SOC on their own, but that significant increases in SOC are possible under some circumstances through the long-term use of multiple practices, such as stubble retention + zero tillage + legume N input + elimination of fallow. The circumstances under which increases in SOC can be achieved require further investigation.
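The SOC stocks above are reported on an equivalent-soil-mass basis rather than to a fixed depth, so that tillage-driven changes in bulk density do not masquerade as carbon changes. A minimal sketch of both calculations, using hypothetical layer data rather than the study's measurements:

```python
# Fixed-depth vs equivalent-soil-mass SOC stock. Per layer:
#   stock (Mg C/ha) = C concentration (%) x bulk density (g/cm3)
#                     x thickness (cm)
#   soil mass (t/ha) = bulk density x thickness x 100

def stock_fixed_depth(layers):
    """SOC stock (Mg C/ha) over (c_pct, bd_g_cm3, thick_cm) layers."""
    return sum(c * bd * t for c, bd, t in layers)

def stock_equivalent_mass(layers, ref_mass_t_ha):
    """SOC stock in the topmost ref_mass_t_ha tonnes of soil per ha."""
    stock, mass = 0.0, 0.0
    for c, bd, t in layers:
        layer_mass = bd * t * 100.0            # t soil/ha in this layer
        take = min(layer_mass, ref_mass_t_ha - mass)
        stock += c * bd * t * (take / layer_mass)
        mass += take
        if mass >= ref_mass_t_ha:
            break
    return stock

# Hypothetical 0-30 cm profile: (C %, bulk density g/cm3, thickness cm)
profile = [(1.2, 1.30, 10), (0.9, 1.40, 10), (0.7, 1.45, 10)]
fixed = stock_fixed_depth(profile)                       # ~38.35 Mg C/ha
equiv = stock_equivalent_mass(profile, ref_mass_t_ha=4000.0)
```

On a denser (e.g. tilled) profile the fixed-depth figure counts more soil mass and so overstates the stock relative to the equivalent-mass figure; truncating every treatment at the same reference mass makes the comparison fair.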
11

Robertson, M. J., G. J. Rebetzke, and R. M. Norton. "Assessing the place and role of crop simulation modelling in Australia." Crop and Pasture Science 66, no. 9 (2015): 877. http://dx.doi.org/10.1071/cp14361.

Abstract:
Computer-based crop simulation models (CSMs) are well entrenched as tools for a wide variety of research, development and extension applications. Despite this, critics remain and there are perceptions that CSMs have not contributed to impacts on-farm or in the research community, particularly with plant breeding. This study reviewed the literature, interviewed 45 stakeholders (modellers, institutional representatives and clients of modelling), and analysed the industry-funded project portfolio to ascertain the current state of use of CSMs in the grains industry in Australia, including scientific progress, impacts and development needs. We found that CSMs in Australia are widely used, with ~100 active and independent users, ~15 model developers, and at any one time ~10 postgraduate students, chiefly across six public research institutions. The dominant platform used is APSIM (Agricultural Production Systems Simulator). It is widely used in the agronomic domain. Several cases were documented where CSM use had a demonstrable impact on farm and research practice. The updating of both plant and soil process routines in the models has slowed and even stalled in recent years, and scientific limitations to future use were identified: the soil–plant nitrogen cycle, root growth and function, soil surface water and residue dynamics, impact of temperature extremes on plant function, and up-to-date cultivar parameter sets. There was a widespread appreciation of and optimism for the potential of CSMs to assist with plant-breeding activities, such as environmental characterisation, trait assessment, and design of plant-breeding programs. However, we found little evidence of models or model output being used by plant breeders in Australia, despite significant impacts that have emerged recently in larger international breeding programs. 
Closer cooperation between geneticists, physiologists and breeders will allow gene-based approaches to characterise and parameterise cultivars in CSMs, demonstrated by recent progress with phenology in wheat. This will give models the ability to deal with a wider range of potential genotype × environment × management scenarios.
12

Angus, J. F., T. P. Bolger, J. A. Kirkegaard, and M. B. Peoples. "Nitrogen mineralisation in relation to previous crops and pastures." Soil Research 44, no. 4 (2006): 355. http://dx.doi.org/10.1071/sr05138.

Abstract:
Most of the nitrogen (N) used by Australian crops is mineralised from the residues of previous crops and pastures. Net N mineralisation was studied in 2 field experiments in southern NSW, one comparing different residue-management and tillage systems during continuous cropping and the other comparing residues of annual and perennial pastures in a pasture–crop system. After 14 years of continuous cropping, soil total N concentration had decreased by 50%. Neither stubble retention nor direct drilling affected potential N mineralisation or the decrease in total N. However, soil mineral N in the field was greater after direct drilling than cultivation and greater after stubble retention than stubble burning. There were 2 reasons for the discrepancy. One was that retained stubble conserved soil water, leading to periods of increased mineralisation. The other was that direct drilling and stubble retention reduced growth and N uptake by crops. In contrast to the similar rates of potential mineralisation under different tillage and stubble systems, there were significant differences following different pasture species. In a 5-year study of a pasture–crop system we measured net mineralisation following annual pasture based on subterranean clover and perennial pasture based on lucerne and/or the grasses phalaris and cocksfoot. Mineralisation generally decreased with the number of years after pasture removal. Previous lucerne pastures led to slow net mineralisation in the first year after removal, apparently because of immobilisation by high C : N residues. Mineralisation in soil containing perennial grass residues was the highest measured. This high rate may be due to redistribution of N to the topsoil by roots of perennial grasses.
The comparison of continuous crop and pasture–crop systems showed that the decline in soil N supply was not prevented by direct drilling and stubble conservation, but N mineralisation was increased by pastures, particularly those containing perennial grasses.
13

Silsbury, JH. "Grain yield of wheat in rotation with pea, vetch or medic grown with three systems of management." Australian Journal of Experimental Agriculture 30, no. 5 (1990): 645. http://dx.doi.org/10.1071/ea9900645.

Abstract:
Pea (Pisum sativum L. cv. Alma), vetch (Vicia sativa L. cv. Languedoc) and annual medic (Medicago truncatula Gaertn. cv. Paraggio) were grown at Brinkworth, South Australia, in 1987 in large (0.75 ha) plots and subjected to 3 systems of management: (i) ploughing in at flowering as a green manure crop, (ii) harvesting for grain and ploughing in the dry residues, and (iii) harvesting for grain and removing the residues. A wheat crop was sown over the whole area in the following season (1988) and the effects of type of legume and management on grain yield and grain protein content were measured. The management system imposed on the legume had a highly significant (P<0.01) effect on the grain yield of the following wheat crop, but there were no significant differences between the 3 legumes in their effects on wheat yield or on grain protein content. Ploughing in the legumes as a green manure crop at flowering added about 100 kg/ha more nitrogen (N) to the soil than allowing the legumes to mature, harvesting for seed, and removing residues. Incorporating the dry residues rather than removing them added about 26 kg N/ha. The green manure crop significantly increased subsequent wheat yield (by 49%; P<0.001) and protein content of the grain (by 13%; P<0.05) compared with the treatment in which the legumes were harvested for grain and all residues removed; incorporating the dry residues increased yield by 10%. It is concluded that the amount of N added during the legume phase in a rotation is more important than the kind of legume from which the N is derived. The occasional use of a dense legume crop as a green manure may rapidly add a large amount of N to a soil to be slowly exploited by following grain crops.
14

Bhathal, J. S., and R. Loughman. "Ability of retained stubble to carry-over leaf diseases of wheat in rotation crops." Australian Journal of Experimental Agriculture 41, no. 5 (2001): 649. http://dx.doi.org/10.1071/ea00134.

Abstract:
Increasingly, wheat rotations on sand-plain soils in Western Australia are being managed with stubble retention practices for reasons of moisture and soil conservation. A major concern in stubble retention practices is an associated increase in risk from septoria nodorum blotch (Phaeosphaeria nodorum) and yellow spot (Pyrenophora tritici-repentis). These pathogens frequently occur together in the region and survive in crop surface residues. The amount of disease carry-over on stubble is an important determinant of the severity of leaf diseases during the entire crop season. To provide a rationale for wheat leaf disease management in stubble retention rotation systems the extent to which retained wheat stubble induces disease in rotated crops was investigated. The frequency with which wheat stubble, which had been retained through a 1-year rotation, induced significant disease in seedling wheat was low (14%) over the 4-year period of study. While disease carry-over from wheat stubble retention in rotations is possible, it appears to be uncommon. The small proportion (1–8%) of retained wheat stubble that remained after germination of the return wheat crop in typical Western Australian farming systems further indicates that in general retained wheat stubble is not a significant source of disease carry-over in rotation wheat crops in this environment.
15

Abdallah, Ahmed M., Hanuman S. Jat, Madhu Choudhary, Emad F. Abdelaty, Parbodh C. Sharma, and Mangi L. Jat. "Conservation Agriculture Effects on Soil Water Holding Capacity and Water-Saving Varied with Management Practices and Agroecological Conditions: A Review." Agronomy 11, no. 9 (August 24, 2021): 1681. http://dx.doi.org/10.3390/agronomy11091681.

Abstract:
Improving soil water holding capacity (WHC) through conservation agriculture (CA) practices, i.e., minimum mechanical soil disturbance, crop diversification, and soil mulch cover/crop residue retention, could buffer soil resilience against climate change. CA practices could increase soil organic carbon (SOC) and alter pore size distribution (PSD); thus, they could improve soil WHC. This paper reviews to what extent CA practices can influence soil WHC and water availability through SOC build-up and change in the PSD. In general, the SOC sequestered under CA does not translate into a significant increase in soil WHC, because the increase in SOC is limited to the top 5–10 cm, which limits the capacity of SOC to increase the WHC of the whole soil profile. The change in PSD under CA practices had only a slight effect on soil WHC, because long-term adoption of CA increases macro- and bio-porosity at the expense of the water-holding pores. However, a positive effect of CA practices on water saving and availability has been widely reported. Researchers attributed this positive effect to increased water infiltration and reduced evaporation from the soil surface (due to mulching crop residue). In conclusion, assessing the benefits of CA for SOC and soil WHC requires considering the whole soil profile, not only the top soil layer. The positive effect of CA on water saving is attributed to increasing water infiltration and reducing evaporation from the soil surface. CA effects are more evident in arid and semi-arid regions; therefore, arable lands in Sub-Saharan Africa, Australia, and South Asia are expected to benefit more. This review enhances our understanding of the role of SOC and its quantitative effect in increasing water availability and soil resilience to climate change.
APA, Harvard, Vancouver, ISO, and other styles
16

Bell, M., N. Seymour, G. R. Stirling, A. M. Stirling, L. Van Zwieten, T. Vancov, G. Sutton, and P. Moody. "Impacts of management on soil biota in Vertosols supporting the broadacre grains industry in northern Australia." Soil Research 44, no. 4 (2006): 433. http://dx.doi.org/10.1071/sr05137.

Full text
Abstract:
The grain-producing regions of northern New South Wales and southern and central Queensland are characterised by cropping systems that are strongly dependent on stored soil moisture rather than in-crop rainfall, and tillage systems that are increasingly reliant on zero or minimum tillage. Crops are grown relatively infrequently and crop rotations are dominated by winter and summer grains (wheat [Triticum aestivum L.] and sorghum [Sorghum bicolor L. Moench], respectively), with smaller areas of grain legumes and cotton (Gossypium hirsutum L.). The grey, black, and brown Vertosols represent the more productive soils in the region under rainfed cropping, and are the focus of work reported in this study. Soil samples were collected from surface soils (0–0.30 m) across the region, utilising sites of long-term tillage and residue management studies, fertiliser trials, and commercial fields to enable an assessment of the impact of various management practices on soil biological properties. A number of biological and biochemical parameters were measured (microbial biomass C, total organic C and labile C fractions, total C and N, microbial activity using FDA, cellulase activity, free-living nematodes, total DNA and fatty acid profiles), and the response of wheat, sorghum, and chickpea (Cicer arietinum L.) to steam pasteurisation was assessed in glasshouse bioassays. The objective was to obtain an indication of the biological status of grain-growing soils and assess the impact of biological constraints in soils from different regions and management systems. Results showed that biological activity in cropped soils was consistently low relative to other land uses in northern Australia, with management practices like stubble retention and adoption of zero tillage producing relatively small benefits. In the case of zero tillage, many of these benefits were confined to the top 0.05 m of the soil profile.
Fallowing to recharge soil moisture reserves significantly reduced all soil biological parameters, while pasture leys produced consistent positive benefits. Breaking a long fallow with a short duration grain or brown manure crop significantly moderated the negative effects of a long bare fallow on soil biology. Use of inorganic N and P fertilisers produced minimal effects on soil biota, with the exception of one component of the free-living nematode community (the Dorylaimida). The glasshouse bioassays provided consistent evidence that soil biota were constraining growth of both grain crops (sorghum and wheat) but not the grain legume (chickpea). The biota associated with this constraint have not yet been identified, but effects were consistent across the region and were not associated with the presence of any known pathogen or correlated with any of the measured soil biological or biochemical properties. Further work to confirm the existence and significance of these constraints under field conditions is needed. None of the measured biological or biochemical parameters consistently changed in response to management practices, while conflicting conclusions could sometimes be drawn from different measurements on the same soil sample. This highlights the need for further work on diagnostic tools to quantify soil biological communities, and suggests there is no clear link between measured changes in soil biological communities and economically or ecologically important soil attributes.
APA, Harvard, Vancouver, ISO, and other styles
17

Kleemann, Samuel G. L., Christopher Preston, and Gurjeet S. Gill. "Influence of Management on Long-Term Seedbank Dynamics of Rigid Ryegrass (Lolium rigidum) in Cropping Systems of Southern Australia." Weed Science 64, no. 2 (June 2016): 303–11. http://dx.doi.org/10.1614/ws-d-15-00119.1.

Full text
Abstract:
A field study was undertaken to investigate the influence of different management strategies on rigid ryegrass plant density and seedbank dynamics over 4 yr. Even though the weed seedbank declined by 86% after oaten hay in year 1, the residual seedbank enabled rigid ryegrass to reinfest field peas the next year, and the population rebounded sharply when weed control relied solely on PPI trifluralin. However, use of POST clethodim followed by crop-topping for seed-set prevention of rigid ryegrass in field pea was highly effective and caused a further decline in the weed seedbank. Integration of effective management tactics over 3 yr significantly reduced rigid ryegrass weed and spike density (by 90% and 81%, respectively) in the final year of the 4-yr cropping sequence. Use of oaten hay in year 1, followed by effective weed control in field pea and wheat crops, depleted the high initial seedbank (4,820 seeds m−2) to moderate levels (< 200 seeds m−2) within 3 yr. Effective weed-management treatments depleted the rigid ryegrass seedbank, reduced in-crop weed infestation, and returned higher grain yields and profitability. The results of this study clearly show that large rigid ryegrass populations can be managed effectively without reducing crop productivity and profitability provided multiyear weed-management programs are implemented effectively.
APA, Harvard, Vancouver, ISO, and other styles
18

Shabbir, Asad, Bhagirath S. Chauhan, and Michael J. Walsh. "Biology and management of Echinochloa colona and E. crus-galli in the northern grain regions of Australia." Crop and Pasture Science 70, no. 11 (2019): 917. http://dx.doi.org/10.1071/cp19261.

Full text
Abstract:
Echinochloa colona and E. crus-galli are two important annual grass weeds distributed throughout the summer cropping regions of Australia. Both species are highly problematic weeds, responsible for yield losses of up to 50% in summer grain crops. The success of Echinochloa species as weeds is attributed to their rapid growth, prolific seed production, seed dormancy and adaptability to a wide range of environments. Importantly, E. colona has evolved resistance to glyphosate in Australia, with resistant populations now widespread across the summer cropping regions. Fallow management of E. colona with glyphosate alone is risky in terms of increasing the chance of resistance and highly unsustainable; other control strategies (residual herbicides, strategic tillage, etc.) should be considered to complement herbicides. This review provides a summary of current information on the biology, ecology and management of Echinochloa species. The knowledge gaps and research opportunities identified will have pragmatic implications for the management of these species in Australian grain cropping systems.
APA, Harvard, Vancouver, ISO, and other styles
19

Chen, W., R. W. Bell, R. F. Brennan, J. W. Bowden, A. Dobermann, Z. Rengel, and W. Porter. "Key crop nutrient management issues in the Western Australia grains industry: a review." Soil Research 47, no. 1 (2009): 1. http://dx.doi.org/10.1071/sr08097.

Full text
Abstract:
In the present paper, we review 4 topics that were identified through extensive consultation with stakeholders as issues of high impact and influence for the grains industry: improving soil testing and interpretation; role of fluid fertilisers in the Western Australian (WA) grains industry; using spatial and temporal information to improve crop nutrient management, particularly for nitrogen; and developing recommendations for managing emerging nutrient deficiencies. The key findings are summarised below. To further improve soil testing and interpretation, the review suggests that future research should focus on addressing soil sampling and interpretation questions, as they are important factors affecting the accuracy of fertiliser recommendations with changing cropping practices. There have been several studies to compare fluid with granular forms of P in WA, but the responses have not, so far, been consistent. More work may be needed to understand differing crop responses to fluid P fertiliser additions, particularly on low pH soils in WA. An understanding of the long-term performance of fluid P will also require an assessment of the residual value of fluid P compared with granular P under field conditions. Precision agriculture (PA) technology has potential to improve crop nutrient management and farm profitability in WA. The review indicates that understanding both spatial and temporal yield variation is critical for the successful adoption of PA technology by growers. The review also suggests that in WA, there is a need to explore the use of different layers of spatial information for determining management zones.
In response to wide adoption of no-till and stubble retention cropping systems, increased use of fluid fertilisers together with advanced application technologies, and increased interest in cropping in the high rainfall zone, there is need to better understand growers’ practices and attitudes to crop nitrogen (N) management, and thus to better position research and extension activities. The review also suggests the need to evaluate new fertiliser products and site-specific N management concepts and develop N management practices for waterlogging-prone soils for improved N use efficiency in cereal production systems in WA. The negative balance of magnesium (Mg) observed in WA cropping systems, together with the information reviewed on factors affecting soil Mg content and crop response to Mg application across Australia, suggests that there is a need to evaluate the risk and impact of Mg deficiency on acidic sandy soils of WA.
APA, Harvard, Vancouver, ISO, and other styles
20

McNeill, A. M., and C. M. Penfold. "Agronomic management options for phosphorus in Australian dryland organic and low-input cropping systems." Crop and Pasture Science 60, no. 2 (2009): 163. http://dx.doi.org/10.1071/cp07381.

Full text
Abstract:
Maintenance of available phosphorus (P) is a problem faced by both conventional and organic systems but it is exacerbated in the latter given that manufactured inorganic sources of P fertiliser are not permitted under the International Federation of Organic Agriculture Movements certification guidelines. The focus of this paper is a discussion of potential agronomic strategies to assist in sustainable management of the soil P resource in organic and low-input broadacre farming systems within the Australian rain-fed cereal–livestock belt. The paper considers three broad strategies for agronomic management of P in this context and draws on reported research from overseas and within Australia. An analysis of the current knowledge suggests that the option most likely to ensure that soluble P is not a limitation in the system is the importation of allowable inputs that contain P from off-farm, although for much of the Australian cereal–livestock belt the immediate issue may be access to economically viable sources. Research targeted at quantifying the economic and biological benefits to the whole-farm system associated with the adoption of these practices is required. Improving the P-use efficiency of the system by incorporating species into rotation or intercropping systems that are able to use P from less soluble sources has been a successful strategy in parts of the world with climate similar to much of the Australian cereal–sheep belt, and deserves further research effort in Australia. Agronomic management to maximise quantity and quality of pasture and crop plant residues undoubtedly builds labile soil organic matter and facilitates P cycling, but the strategy may be of limited benefit in low-rainfall areas that do not have the capacity to produce large biomass inputs. 
Evidence that organic or low-input systems naturally increase the numbers and diversity of soil organisms is sparse and published studies from Australian systems suggest that P nutrition is not enhanced. However, seed and soil microbial inoculants to facilitate improved P uptake have been developed and are currently being field tested in Australia. Progress in selection and breeding for cereal genotypes that are more P efficient and other plant genotypes that can use less labile P sources, is gaining momentum but still remains a long-term prospect, and may involve genetic modification which will not be acceptable for organic systems.
APA, Harvard, Vancouver, ISO, and other styles
21

Littleboy, M., DM Silburn, DM Freebairn, DR Woodruff, GL Hammer, and JK Leslie. "Impact of soil erosion on production in cropping systems .I. Development and validation of a simulation model." Soil Research 30, no. 5 (1992): 757. http://dx.doi.org/10.1071/sr9920757.

Full text
Abstract:
A computer simulation model to analyse risks of soil erosion to long-term crop production is described. The model, called PERFECT, simulates interactions between soil type, climate, fallow management strategy and crop sequence. It contains six main modules: data input, water balance, crop growth, crop residue, erosion and model output. Modules are arranged in a framework that allows alternative modules to be used as required for the potential range of applications. The model contains dynamic crop growth models for wheat, sorghum and sunflower. Validation of PERFECT against small catchment and contour bay data collected throughout Queensland showed that PERFECT explained up to 84% of the variation in total available soil water, 89% of the variation in daily runoff, and up to 75% of the variation in grain yield. Average annual soil erosion was accurately predicted but daily erosion totals were less accurate due to the exclusion of rainfall intensity in erosion prediction. Variability in climate dominates agricultural production in the subtropical region of Australia. The validated model can be coupled with long-term climate and soils databases to simulate probabilities of production and erosion risks due to climatic variability. It provides a method to determine the impact of soil erosion on long-term productivity.
APA, Harvard, Vancouver, ISO, and other styles
22

Loura, Deepak, Sahil, Singarayer Florentine, and Bhagirath Singh Chauhan. "Germination ecology of hairy fleabane (Conyza bonariensis) and its implications for weed management." Weed Science 68, no. 4 (April 16, 2020): 411–17. http://dx.doi.org/10.1017/wsc.2020.28.

Full text
Abstract:
Hairy fleabane [Conyza bonariensis (L.) Cronquist] is a problematic weed in Australian no-till cropping systems. Consequently, a study was conducted to examine the effect of temperature, light, salt stress, osmotic stress, burial depth, and sorghum crop residue on germination and emergence in two populations (C and W: collected from chick pea [Cicer arietinum L.] and wheat [Triticum aestivum L.] fields, respectively) of C. bonariensis. Both populations were able to germinate over a wide range of alternating day/night temperatures (15/5 to 35/25 C); however, the C population had optimum (and similar) germination over the range of 20/10 and 30/20 C, while the W population showed maximum germination at 25/15 C. A negative relationship was observed between osmotic potential and germination, with 31% and 14% germination of the C and W populations at −0.6 MPa, respectively. These observations suggest that population C was more tolerant to higher osmotic potentials than population W. Seeds of both populations germinated when exposed to a wide range of sodium chloride levels (NaCl, 0 to 200 mM); however, beyond 200 mM NaCl, no germination was observed in either population. Maximum germination of the C (70%) and W (41%) populations was observed on the soil surface with no emergence from a burial depth of 1 cm. The application of sorghum residue at an amount of 6,000 kg ha−1 reduced emergence of the C and W populations by 55% and 58%, respectively, compared with the no-residue treatment. Knowledge gained from this study suggests that the following strategies could be used for more efficacious management of C. bonariensis: (1) a shallow-tillage operation to bury weed seeds in conventional tillage systems, and (2) retention of sorghum residue on the soil surface in no-till systems.
APA, Harvard, Vancouver, ISO, and other styles
23

Armstrong, R. D., J. Fitzpatrick, M. A. Rab, M. Abuzar, P. D. Fisher, and G. J. O'Leary. "Advances in precision agriculture in south-eastern Australia. III. Interactions between soil properties and water use help explain spatial variability of crop production in the Victorian Mallee." Crop and Pasture Science 60, no. 9 (2009): 870. http://dx.doi.org/10.1071/cp08349.

Full text
Abstract:
A major barrier to the adoption of precision agriculture in dryland cropping systems is our current inability to reliably predict spatial patterns of grain yield for future crops for a specific paddock. An experiment was undertaken to develop a better understanding of how edaphic and climatic factors interact to influence the spatial variation in the growth, water use, and grain yield of different crops in a single paddock so as to improve predictions of the likely spatial pattern of grain yields in future crops. Changes in a range of crop and soil properties were monitored over 3 consecutive seasons (barley in 2005 and 2007 and lentils in 2006) in the southern section of a 167-ha paddock in the Mallee region of Victoria, which had been classified into 3 different yield (low, moderate, and high) and seasonal variability (stable and variable) zones using normalised difference vegetation index (NDVI) and historic yield maps. The different management zones reflected marked differences in a range of soil properties including both texture in the topsoil and potential chemical-physical constraints in the subsoil (SSCs) to root growth and water use. Dry matter production, grain yield, and quality differed significantly between the yield zones but the relative difference between zones was reduced when supplementary irrigation was applied to barley in 2005, suggesting that some other factor, e.g. nitrogen (N), may have become limiting in that year. There was a strong relationship between crop growth and the use of soil water and nitrate across the management zones, with most water use by the crop occurring in the pre-anthesis/flowering period, but the nature of this relationship appeared to vary with year and/or crop type. In 2006, lentil yield was strongly related to crop establishment, which varied with soil texture and differences in plant-available water. 
In 2007 the presence of soil water following a good break to the season permitted root growth into the subsoil where there was evidence that SSCs may have adversely affected crop growth. Because of potential residual effects of one crop on another, e.g. through differential N supply and use, we conclude that the utility of the NDVI methodology for developing zone management maps could be improved by using historical records and data for a range of crop types rather than pooling data from a range of seasons.
APA, Harvard, Vancouver, ISO, and other styles
24

Brennan, R. F., B. Penrose, and R. W. Bell. "Micronutrients limiting pasture production in Australia." Crop and Pasture Science 70, no. 12 (2019): 1053. http://dx.doi.org/10.1071/cp19087.

Full text
Abstract:
Low levels of plant-available micronutrients were an inherent feature of many agricultural soils in Australia, mostly due to the prevalence of highly weathered soil parent materials. The diagnosis and correction of the widespread deficiencies of micronutrients, especially copper (Cu), molybdenum (Mo) and zinc (Zn), were prerequisites for the development of productive, legume-based pastures in southern Australia. In subtropical and tropical regions, Mo deficiency commonly limited pasture-legume production. Soil treatments involving micronutrient fertiliser incorporated in soils, or applied as additives to superphosphate, were generally effective in alleviating micronutrient deficiencies. In the low-output dryland pasture systems, the annual removal of micronutrients in wool and meat is small compared with rates added in fertiliser. Hence, in general, the residues of soil-applied micronutrient fertilisers remain effective for many years, for example, up to 30 years for Cu. By contrast, shorter residual values occur for manganese (Mn) fertiliser on highly calcareous soils, and for Zn in high-output pasture systems such as intensive dairy production. In the last two decades since the recommendations for micronutrient management of pastures were developed, there have been many changes to farming systems, with likely implications for micronutrient status in pastures. First, increased cropping intensity and low prices for wool and meat have meant lower nutrient inputs to pastures or to the pasture phase of rotations with crops. However, when pastures have been rotated with crops, ongoing small additions of Cu, Zn and Mo have been common. In cropping phases of farming systems, lime application and no-till may have altered the chemical and positional availability of micronutrients in soils to pastures. However, there has been little study of the impacts of these farming-systems changes on micronutrient status of pastures or profitability of the production system. 
The intensification of dairy production systems may also have altered the demand for, and removal rates of, micronutrients. Soil tests are not very reliable for Mn or Mo deficiencies, and well-calibrated soil tests for boron, Cu and Zn have been developed only for limited areas of pasture production and for a limited range of species. There is limited use of plant tests for nutrient management of pastures. In conclusion, there is limited knowledge of the current micronutrient status of pastures and their effects on animal health. Pasture production would benefit from targeted investigation of micronutrients status of pasture soils, pasture plants and micronutrient-linked animal-health issues.
APA, Harvard, Vancouver, ISO, and other styles
25

Tozer, K. N., D. F. Chapman, P. E. Quigley, P. M. Dowling, R. D. Cousens, and G. A. Kearney. "Integrated management of vulpia in dryland perennial pastures of southern Australia." Crop and Pasture Science 60, no. 1 (2009): 32. http://dx.doi.org/10.1071/cp07445.

Full text
Abstract:
Vulpia (Vulpia species C.C. Gmel.) are annual grass weeds that can reduce pasture quality and stock-carrying capacity of perennial pastures throughout southern Australia. To develop more effective strategies to control vulpia, an experiment was established in western Victoria (average annual rainfall 565 mm) in phalaris (Phalaris aquatica L.) pastures comparing the effects of control methods [comprising combinations of fertiliser addition (Fert), a single herbicide (simazine) application (Sim), and pasture rest from grazing (Rest)] on vulpia populations. A further herbicide treatment [paraquat-diquat (SpraySeed®)] was imposed on some of these treatments. Measurements included botanical composition, phalaris and vulpia tiller density, seed production, and number of residual seeds in the soil. Vulpia content remained unchanged in the Sim-Rest treatment but increased in all other management treatments over the duration of the 3-year study and especially where paraquat-diquat was applied, despite paraquat-diquat causing an initial reduction in vulpia content. Vulpia content was lowest in the Fert-Sim-Rest treatment. The Fert-Sim treatment and in some cases paraquat-diquat application reduced vulpia tiller production. Vulpia seed production and the residual seed population were not influenced by any of the management treatments, while the single paraquat-diquat application increased vulpia seed production 18 months after application. Phalaris content was enhanced by the Sim-Rest and Fert-Sim-Rest treatments and initially by paraquat-diquat. No treatment affected phalaris tiller production and basal cover. The subterranean clover (Trifolium subterraneum L.) content declined during the experiment, but to a lesser extent where paraquat-diquat was applied. Volunteer species content was initially suppressed in the year following paraquat application, although populations recovered after this time. Of the two Vulpia spp. present (V. bromoides (L.) S.F. Gray and V. myuros (L.) C.C. Gmelin), V. bromoides was the most prevalent. Results show how a double herbicide application can increase vulpia fecundity and rate of re-infestation of herbicide-treated sites. Pasture rest shows some promise, but to a lesser extent than in the New South Wales tablelands, where summer rainfall may increase the growth of perennial species. In lower rainfall, summer dry areas, responses to pasture rest may be slower. Despite this, integrated management (which combines strategies such as pasture rest, herbicide application, and fertiliser application) increases the perennial content and reduces vulpia seed production, thus improving vulpia control.
APA, Harvard, Vancouver, ISO, and other styles
26

Higgins, Vaughan, Caroline Love, and Tony Dunn. "Flexible adoption of conservation agriculture principles: practices of care and the management of crop residue in Australian mixed farming systems." International Journal of Agricultural Sustainability 17, no. 1 (December 20, 2018): 49–59. http://dx.doi.org/10.1080/14735903.2018.1559526.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Page, K. L., R. C. Dalal, S. H. Reeves, W. J. Wang, Somasundaram Jayaraman, and Y. P. Dang. "Changes in soil organic carbon and nitrogen after 47 years with different tillage, stubble and fertiliser management in a Vertisol of north-eastern Australia." Soil Research 58, no. 4 (2020): 346. http://dx.doi.org/10.1071/sr19314.

Full text
Abstract:
No-till (NT) farming has been widely adopted to assist in reducing erosion, lowering fuel costs, conserving soil moisture and improving soil physical, chemical and biological characteristics. Improvements in soil characteristics are often driven by the greater soil organic matter accumulation (as measured by soil organic carbon (SOC)) in NT compared to conventional tillage (CT) farming systems. However, to fully understand the effect of NT it is important to understand temporal changes in SOC by monitoring over an extended period. We investigated the long-term effect of NT and stubble retention (SR) on changes in SOC and total soil nitrogen (STN) using results from an experiment that has been running for 50 years in a semi-arid subtropical region of north-eastern Australia. In this experiment, the effects of tillage (CT vs NT), residue management (stubble burning (SB) vs SR), and nitrogen (N) fertiliser (0 and 90 kg-N ha–1) were measured in a balanced factorial experiment on a Vertisol (Ustic Pellustert). The use of NT, SR and N fertiliser generally improved SOC (by up to 12.8%) and STN stocks (by up to 31.7%) in the 0–0.1 m layer relative to CT, SB and no N fertiliser, with the greatest stocks observed where all three treatments were used in combination. However, declines in SOC (up to 20%) and STN (up to 25%) occurred in all treatments over the course of the experiment, indicating that changes in management practices were unable to prevent a loss of soil organic matter over time in this farming system. Nevertheless, the NT and SR treatments did lose less SOC than CT and SB treatments, and SR also reduced STN loss. The δ13C analysis of samples collected in 2008 and 2015 highlighted that crop residues have significantly contributed to SOC stocks at the site and that their contribution is increasing over time.
APA, Harvard, Vancouver, ISO, and other styles
28

Pankhurst, C., C. Kirkby, B. Hawke, and B. Harch. "Impact of a change in tillage and crop residue management practice on soil chemical and microbiological properties in a cereal-producing red duplex soil in NSW, Australia." Biology and Fertility of Soils 35, no. 3 (May 1, 2002): 189–96. http://dx.doi.org/10.1007/s00374-002-0459-3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Taylor, J. A., A. Herr, and A. W. Siggins. "The influence of distance from landfill and population density on degree of wood residue recycling in Australia." Biomass and Bioenergy 33, no. 10 (October 2009): 1474–80. http://dx.doi.org/10.1016/j.biombioe.2009.07.003.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Li, Yong, Weijin Wang, Steven Reeves, and Ram C. Dalal. "Simulation of N2O emissions and mitigation options for rainfed wheat cropping on a Vertosol in the subtropics." Soil Research 51, no. 2 (2013): 152. http://dx.doi.org/10.1071/sr12274.

Full text
Abstract:
The Water and Nitrogen Management Model (WNMM) was applied to simulate nitrous oxide (N2O) emissions from a wheat-cropped Vertosol under long-term management of no-till, crop residue retention, and nitrogen (N) fertiliser application in southern Queensland, Australia, from July 2006 to June 2009. For the simulation study, eight treatments of combinations of conventional tillage (CT) or no-till (NT), stubble burning (SB) or stubble retention (SR), and N fertiliser application at nil (0N) or 90 (90N) kg N/ha.year were used. The results indicated that WNMM satisfactorily simulated the soil water content of the topsoil, mineral N content of the entire soil profile (0–1.5 m), and N2O emissions from the soil under the eight treatments, compared with the corresponding field measurements. For simulating daily N2O emissions from soil, WNMM performed best for the treatment CT-SB-90N (R2 = 0.48, P < 0.001; RMSE = 10.2 g N/ha.day) and worst for the treatment CT-SB-0N (R2 = 0.03, P = 0.174; RMSE = 1.2 g N/ha.day). WNMM predicted N2O emissions from the soil more accurately for the fertilised treatments (i.e. 90N v. 0N), and for the residue retained treatments (SR v. SB). To reduce N2O emissions from the no-till and fertilised treatments, three scenarios were examined: application of nitrification inhibitor, application of controlled-release fertiliser, and deep placement of liquid fertiliser (UAN32). Only the deep placement of UAN32 below the 35 cm depth was effective, and could reduce the N2O emissions from the soil by almost 40%.
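The goodness-of-fit statistics quoted above (e.g. "R2 = 0.48 … RMSE = 10.2 g N/ha.day") are standard for comparing simulated against observed daily series. A minimal sketch of how such figures are computed, using made-up illustrative numbers (the actual WNMM and field data are not reproduced here); R² is taken as the square of Pearson's correlation, a common convention when it is reported alongside a P-value:

```python
import math

# Hypothetical paired daily N2O fluxes (g N/ha.day) - illustrative only,
# not data from the WNMM study.
observed  = [2.1, 8.4, 15.0, 30.2, 5.5, 1.2, 0.8, 12.6]
simulated = [3.0, 7.1, 18.5, 25.9, 6.2, 0.9, 1.5, 10.4]

n = len(observed)
mean_obs = sum(observed) / n
mean_sim = sum(simulated) / n

# RMSE: typical magnitude of the daily simulation error, in the data's units.
rmse = math.sqrt(sum((s - o) ** 2 for s, o in zip(simulated, observed)) / n)

# R^2 as the squared Pearson correlation between simulated and observed values.
cov   = sum((s - mean_sim) * (o - mean_obs) for s, o in zip(simulated, observed))
var_s = sum((s - mean_sim) ** 2 for s in simulated)
var_o = sum((o - mean_obs) ** 2 for o in observed)
r2 = cov ** 2 / (var_s * var_o)

print(f"RMSE = {rmse:.2f} g N/ha.day, R2 = {r2:.3f}")
```

Reporting both statistics together, as the abstract does, is informative because R² measures how well the model tracks the pattern of emissions while RMSE measures the absolute size of the errors; a model can score well on one and poorly on the other.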
APA, Harvard, Vancouver, ISO, and other styles
31

Dimes, JP, RL McCown, and PG Saffigna. "Nitrogen supply to no-tillage crops, as influenced by mulch type, soil type and season, following pasture leys in the semi-arid tropics." Australian Journal of Experimental Agriculture 36, no. 8 (1996): 937. http://dx.doi.org/10.1071/ea9960937.

Full text
Abstract:
Past cropping research in the semi-arid tropics of northern Australia has shown that in this climate and on the predominantly sesquioxidic soils, recovery of fertiliser nitrogen (N) by crops is often low. Conceptually, no-tillage, legume ley farming offers features for coping better with the constraints of climate, soil and high fertiliser transport costs to this remote region. This paper summarises the N cycle in a system in which pastures provide N for successive crops, and mulch at the time of crop establishment is provided by the killing of new pasture growth. A further aim was to provide a sound foundation for managing N supply in relation to demand in a climate that causes high variation and uncertainty for pasture N2 fixation and sequestering, the amount of early season re-growth (mulch), rate of mulch decomposition, nitrate leaching losses, and crop growth and N demand. The research approach combined field studies with simulation modelling. A series of field studies that included bare fallow and grass and legume pasture leys on clay loam and sandy loam soils were conducted at Katherine over 4 wet seasons to study subsequent mineralisation of N. Experimental results were used to test the performance of a simulation model for predicting the observed variations consequent upon the various management options. Experimental results showed that the carbon (C) : N ratio of the residue and soil texture were important factors in determining N mineralisation, immobilisation, and nitrate leaching following chemical kill of pasture leys. However, the greatest variation was between seasons. A modified version of the CERES-Maize N model was able to simulate the accumulation of nitrate following a bare fallow and following pasture leys with high levels (above and below ground) of freshly killed residues with widely differing C:N ratio, the downward movement of nitrate-N in the soil and the interaction of these processes with seasonal rainfall.
Despite a capability for simulation of the soil N dynamics in a cropping phase following pasture leys, experimental results indicated how nitrate distribution following leys is influenced by pasture growth during the ley, and how this varied greatly with season and soil texture. The simulation capability reported here has been incorporated elsewhere into the development of a full system model, embracing both the ley phase and the crop phase.
APA, Harvard, Vancouver, ISO, and other styles
32

Walker, S. R., I. N. Taylor, G. Milne, V. A. Osten, Z. Hoque, and R. J. Farquharson. "A survey of management and economic impact of weeds in dryland cotton cropping systems of subtropical Australia." Australian Journal of Experimental Agriculture 45, no. 1 (2005): 79. http://dx.doi.org/10.1071/ea03189.

Full text
Abstract:
In dryland cotton cropping systems, the main weeds and effectiveness of management practices were identified, and the economic impact of weeds was estimated using information collected in a postal and a field survey of southern Queensland and northern New South Wales. Forty-eight completed questionnaires were returned, and 32 paddocks were monitored in early and late summer for weed species and density. The main problem weeds were bladder ketmia (Hibiscus trionum), common sowthistle (Sonchus oleraceus), barnyard grasses (Echinochloa spp.), liverseed grass (Urochloa panicoides) and black bindweed (Fallopia convolvulus), but the relative importance of these differed with crops, fallows and crop rotations. The weed flora was diverse with 54 genera identified in the field survey. Control of weed growth in rotational crops and fallows depended largely on herbicides, particularly glyphosate in fallow and atrazine in sorghum, although effective control was not consistently achieved. Weed control in dryland cotton involved numerous combinations of selective herbicides, several non-selective herbicides, inter-row cultivation and some manual chipping. Despite this, residual weeds were found at 38–59% of initial densities in about three-quarters of the survey paddocks. The on-farm financial costs of weeds ranged from $148 to $224/ha.year depending on the rotation, resulting in an estimated annual economic cost of $19.6 million. The approach of managing weed populations across the whole cropping system needs wider adoption to reduce the weed pressure in dryland cotton and the economic impact of weeds in the long term. Strategies that optimise herbicide performance and minimise return of weed seed to the soil are needed. Data from the surveys provide direction for research to improve weed management in this cropping system. The economic framework provides a valuable means of evaluating likely future returns from technologies or weed management improvements.
APA, Harvard, Vancouver, ISO, and other styles
33

Turner, Neil C., Nicholas Molyneux, Sen Yang, You-Cai Xiong, and Kadambot H. M. Siddique. "Climate change in south-west Australia and north-west China: challenges and opportunities for crop production." Crop and Pasture Science 62, no. 6 (2011): 445. http://dx.doi.org/10.1071/cp10372.

Full text
Abstract:
Predictions from climate simulation models suggest that by 2050 mean temperatures on the Loess Plateau of China will increase by 2.5 to 3.75°C, while those in the cropping region of south-west Australia will increase by 1.25 to 1.75°C. By 2050, rainfall is not expected to change on the Loess Plateau of China, while in south-west Australia rainfall is predicted to decrease by 20 to 60 mm. The frequency of heat waves and dry spells is predicted to increase in both regions. The implications of rising temperatures are an acceleration of crop phenology and a reduction in crop yields, greater risk of reproductive failure from extreme temperatures, and greater risk of crop failure. The reduction in yield from increased phenological development can be countered by selecting longer-season cultivars and taking advantage of warmer minimum temperatures and reduced frost risk to plant earlier than with current temperatures. Breeding for tolerance of extreme temperatures will be necessary to counter the increased frequency of extreme temperatures, while a greater emphasis on breeding for increased drought resistance and precipitation-use efficiency will lessen the impact of reduced rainfall. Management options likely to be adopted in south-west Australia include the introduction of drought-tolerant perennial fodder species and shifting cropping to higher-rainfall areas. On the Loess Plateau of China, food security is paramount so that an increased area of heat-tolerant and high-yielding maize, mulching with residues and plastic film, better weed and pest control and strategic use of supplemental irrigation to improve rainfall-use efficiency are likely to be adopted.
APA, Harvard, Vancouver, ISO, and other styles
34

Ghaffariyan, Mohammad Reza, Mauricio Acuna, and Mark Brown. "Analysing the effect of five operational factors on forest residue supply chain costs: A case study in Western Australia." Biomass and Bioenergy 59 (December 2013): 486–93. http://dx.doi.org/10.1016/j.biombioe.2013.08.029.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Chauhan, Bhagirath S., and Prashant Jha. "Glyphosate Resistance in Sonchus oleraceus and Alternative Herbicide Options for Its Control in Southeast Australia." Sustainability 12, no. 20 (October 9, 2020): 8311. http://dx.doi.org/10.3390/su12208311.

Full text
Abstract:
Sonchus oleraceus is becoming a hard-to-control weed in Australian cropping systems, especially in glyphosate-tolerant cotton and during summer fallows. Several biotypes of this weed have developed resistance to glyphosate as a result of common management practices under conservation agriculture systems in the country. A series of pot experiments were conducted to evaluate the effect of temperature on glyphosate efficacy and performance of several post-emergence and pre-emergence herbicides on a glyphosate-resistant (GR) and a glyphosate-susceptible (GS) biotype of S. oleraceus. At low temperatures (19–24 °C), no plants of the GS biotype survived glyphosate application at 570 g/ha; however, in the high-temperature regime (28–30 °C), 83% of the plants survived this rate of glyphosate. Similarly, for the GR biotype, up to 58% of the plants survived at 2280 g/ha of glyphosate when applied during the high-temperature regime and no plants survived this rate during the low-temperature regime. A number of post-emergence herbicides were found to be effective for S. oleraceus control. However, herbicide application delayed to the six-leaf stage compared with the four-leaf stage reduced control, especially for bromoxynil and saflufenacil herbicides. Glufosinate and paraquat were the most effective herbicides for S. oleraceus control, resulting in no seedling survival for both biotypes. Isoxaflutole, pendimethalin or s-metolachlor efficacy was not reduced by the presence of crop residue, suggesting that these herbicides could be used to control S. oleraceus in conservation agriculture systems. The results of this study suggest that growers will need to reduce over-reliance on glyphosate for weed control in summer fallows and use alternative post-emergence herbicides.
APA, Harvard, Vancouver, ISO, and other styles
36

Bedard-Haughn, A. "Managing excess water in Canadian prairie soils: A review." Canadian Journal of Soil Science 89, no. 2 (May 1, 2009): 157–68. http://dx.doi.org/10.4141/cjss07071.

Full text
Abstract:
"Excess water" conditions develop when a soil is unable to transmit water, leading to the onset of saturated conditions harmful to soils and crops. Negative agricultural impacts include reduced trafficability, physical damage to crops under hypoxic or anoxic conditions, increased salinity or sodicity, reduced nutrient availability and uptake, and increased incidence of weeds and pests. There are two main objectives in managing landscapes prone to excess water, both of which must consider soil and landform characteristics. The first is to maximize infiltration and conductivity through tillage and residue management. The second is to remove water from the soil profile as quickly as possible through drainage or the adoption of high water use plants such as alfalfa (Medicago sativa), which increase water losses through transpiration. Changes to the overall cropping system can also be made, including selecting crops and forages that have shown reduced sensitivity to excess water, applying seed treatments that encourage the development of water-tolerant traits, timing fertilizer application to correspond with maximum plant uptake, and incorporating high water use crops into the rotation. Recent work from Australia suggests that management of excess water requires a multi-disciplinary approach; however, little research has been done on this problem in the semiarid to sub-humid Canadian Prairies. Key words: Waterlogging, beneficial management practices, infiltration, drainage
APA, Harvard, Vancouver, ISO, and other styles
37

Hulugalle, N. R., and F. Scott. "A review of the changes in soil quality and profitability accomplished by sowing rotation crops after cotton in Australian Vertosols from 1970 to 2006." Soil Research 46, no. 2 (2008): 173. http://dx.doi.org/10.1071/sr07077.

Full text
Abstract:
In agricultural systems, soil quality is thought of in terms of productive land that can maintain or increase farm profitability, as well as conserving soil resources so that future farming generations can make a living. Management practices which can modify soil quality include tillage systems and crop rotations. A major proportion of Australian cotton (Gossypium hirsutum L.) is grown on Vertosols (~75%), of which almost 80% is irrigated. These soils have high clay contents (40–80 g/100 g) and strong shrink–swell capacities, but are frequently sodic at depth and prone to deterioration in soil physical quality if incorrectly managed. Due to extensive yield losses caused by widespread deterioration of soil structure and declining fertility associated with tillage, trafficking, and picking under wet conditions during the middle and late 1970s, a major research program was initiated with the objective of developing soil management systems which could improve cotton yields while concurrently ameliorating and maintaining soil structure and fertility. An outcome of this research was the identification of cotton–winter crop sequences sown in a 1 : 1 rotation as being able to sustain lint yields while at the same time maintaining soil physical quality and minimising fertility decline. Consequently, today, a large proportion (~75%) of Australian cotton is grown in rotation with winter cereals such as wheat (Triticum aestivum L.), or legumes such as faba bean (Vicia faba L.). A second phase of research on cotton rotations in Vertosols was initiated during the early 1990s with the main objective of identifying sustainable cotton–rotation crop sequences; viz. crop sequences which maintained and improved soil quality, minimised disease incidence, facilitated soil organic carbon sequestration, and maximised economic returns and cotton water use efficiency in the major commercial cotton-growing regions of Australia. 
The objective of this review was to summarise the key findings of both these phases of Australian research with respect to soil quality and profitability, and identify future areas for research. Wheat rotation crops, in a range of climates where cotton is grown, can improve soil quality indicators such as subsoil structure, salinity, and sodicity under both irrigated and dryland conditions, while leguminous crops can increase available nitrogen by fixing atmospheric nitrogen, and by reducing N volatilisation and leaching losses. Soil organic carbon in most locations has decreased with time, although the rate of decrease may be reduced by sowing crop sequences that return about 2 kg/m2.crop cycle of residues to the soil, minimising tillage and optimising N inputs. Although the beneficial effects of soil biodiversity on soil quality are claimed to be many, except for a few studies on soil macrofauna such as ants, conclusive field-based evidence to demonstrate this has not been forthcoming with respect to cotton rotations. In general, lowest average lint yields per hectare were with cotton monoculture. The cotton–wheat systems generally returned higher average gross margins/ML irrigation water than cotton monoculture and other rotation crops. This indicates that where irrigation water, rather than land, is the limiting resource, cotton–wheat systems would be more profitable. Recently, the addition of vetch (Vicia villosa Roth.) to the cotton–wheat system has further improved average cotton yields and profitability. Profitability of cotton–wheat sequences varies with the relative price of cotton to wheat. In comparison with cotton monoculture, cotton–rotation crop sequences may be more resilient to price increases in fuel and fertiliser due to lower overall input costs.
The profitability of cotton–rotation crop sequences such as cotton–wheat, where cotton is not sown in the same field every year, is more resilient to fluctuations in the price of cotton lint, fuel and nitrogen fertiliser. This review identified several issues with respect to cotton–rotation crop sequences where knowledge is lacking or very limited. These are: research into ‘new’ crop rotations; comparative soil quality effects of managing rotation crop stubble; machinery attachments for managing rotation crop stubble in situ in permanent bed systems; the minimum amount of crop stubble which needs to be returned per cropping cycle to increase SOC levels from present values; the relative efficacy of C3 and C4 rotation crops in relation to carbon sequestration; the interactions between soil biodiversity and soil physical and chemical quality indicators, and cotton yields; and the effects of sowing rotation crops after cotton on farm and cotton industry economic indicators such as the economic incentives for adopting new cotton rotations, farm level impacts of research and extension investments, and industry- and community/catchment-wide economic modelling of the impact of cotton research and extension activities.
APA, Harvard, Vancouver, ISO, and other styles
38

Gupta, V., MM Roper, JA Kirkegaard, and JF Angus. "Changes in microbial biomass and organic matter levels during the first year of modified tillage and stubble management practices on a red earth." Soil Research 32, no. 6 (1994): 1339. http://dx.doi.org/10.1071/sr9941339.

Full text
Abstract:
Farming practices involving stubble burning and excessive tillage in Australia have led to losses of organic matter from the soil. Crop residue retention and reduced tillage practices can reverse these trends, but changes in organic matter levels are evident only after a long term. Microbial biomass (MB), the living portion of soil organic matter, responds rapidly to changes in soil and crop management practices. We evaluated changes in microbial biomass and microbial activity in the first year following the modification of stubble management and tillage practices on a red earth near Harden, New South Wales. Following an oat crop harvested late in 1989, seven treatments involving stubble and tillage management were applied in February 1990. Wheat was planted in May 1990. Measurements of total organic carbon (C) and total nitrogen (N) in the top 15 cm of soil indicated no significant changes after 1 year, although there was a significant effect on the distribution of C and N. However, significant changes in MB were observed in the first year. Microbial biomass C in the top 5 cm of the soil ranged from 25 to 52 g C m-2 and these levels dropped by 50% or more with each 5 cm depth. Implementation of treatments altered MB, particularly in the top 5 cm, where MB-C and MB-N were significantly greater in the stubble-retained than in the stubble-burnt treatments, and in the direct drill treatment than in the stubble-incorporated treatment. Microbial biomass in soil increased during the growth of wheat in all treatments, but this was slower in the standing stubble-direct drill treatment, probably due to the delay in the decomposition of stubble. Microbial respiration, which was concentrated in the surface 5 cm of soil in all treatments, was greatest in the direct drill treatments. Microbial activity below 5 cm was higher with stubble incorporation than with direct drill.
Specific microbial activity (microbial respiration per unit MB) had the greatest response to tillage at 10-15 cm depth. Microbial quotients (MB as a percentage of C or N) responded to changes in tillage but not significantly to stubble retention. Our studies, during the first year following the modification of stubble management and tillage practices, suggested that changes in MB and microbial activity may be sensitive and reliable indicators of long-term changes in organic matter in soils.
APA, Harvard, Vancouver, ISO, and other styles
39

Braunack, M. V. "Cotton farming systems in Australia: factors contributing to changed yield and fibre quality." Crop and Pasture Science 64, no. 8 (2013): 834. http://dx.doi.org/10.1071/cp13172.

Full text
Abstract:
This study was undertaken to identify factors in Australian cotton farming systems that influence yield and fibre quality of cotton and how these have changed with time after the wide adoption of Bollgard II® cultivars (containing the proteins Cry1Ac and Cry2Ab, providing easier control of Helicoverpa spp.) in the 2003–04 season. Data from Australian commercial cotton variety trials conducted from 2004 to 2011 were used to link management inputs, yield, and fibre quality. Restricted (residual) maximum likelihood (REML) and regression analyses were used to determine which factors had a significant effect on yield and fibre quality. Results showed that lint yield was significantly influenced by cultivar and growing region, and the interaction between region and the amount of applied nitrogen and phosphorus (kg ha–1), plant stand (plants ha–1), in-crop rainfall (mm) and the number of irrigations, season length (days), and days to defoliation. Generally, the same factors also influenced fibre quality. Regression analysis captured 41, 71, 50, 30, and 36% of the variability in lint yield, fibre length, micronaire, fibre strength, and trash, respectively, for irrigated systems. For dryland systems the variability captured was 97, 87, 77, 80, and 78%, respectively. Changes in cotton farming systems from 2004 to 2011 have occurred with applied nitrogen fertiliser increasing under irrigation and decreasing under dryland systems. However, phosphorus fertiliser use has remained steady under irrigated and decreased under dryland systems, and the number of insect sprayings has decreased under both systems. Under irrigated systems, lint yield, fibre length, and trash levels increased while micronaire and fibre strength decreased. Under dryland systems, lint yield decreased while micronaire, fibre length, strength, and trash levels increased. All fibre quality parameters satisfied criteria that would not incur a penalty.
The results considering which factors are the most important and which are of lesser importance provide some insight to changes in management in both irrigated and dryland systems and the effect on lint yield and fibre quality and provide some basis for future investment in research and development and extension to the Australian cotton industry.
APA, Harvard, Vancouver, ISO, and other styles
40

Sinclair, K., and P. J. Beale. "Critical factors influencing no-till establishment of short-term ryegrass (Lolium multiflorum) into a kikuyu (Pennisetum clandestinum) pasture." Crop and Pasture Science 61, no. 2 (2010): 192. http://dx.doi.org/10.1071/cp09071.

Full text
Abstract:
In the subtropical dairy region of Australia, poor establishment of short-term ryegrass (Lolium multiflorum) oversown into tropical grass pastures is a common occurrence requiring re-sowing. A survey of subtropical dairy farmers was undertaken to relate management practices used in oversowing ryegrass to sward establishment and subsequent growth. Two glasshouse studies were also conducted to examine (1) the effect of temperature, ploidy, seeding depth, and mulch cover on ryegrass emergence and (2) the effect of temperature and ploidy on growth and development of ryegrass seedlings. Subtropical dairy farmers only used grazing management to control the growth and residue levels of the tropical grass pasture before oversowing. The average residue was 1200 kg DM/ha to a 5 cm height, and where the residue amount and height were higher, the ryegrass failed to establish. Tetraploid cultivars were preferred in early sowings and diploid cultivars were favoured in later sowings. When direct-drilled, either seed type was sown to a depth of 1–3 cm. A 20–30 kg/ha sowing rate was common for diploids and was at least 2× that for tetraploids. A seedling count <600 plants/m2 resulted in 1317 tillers/m2 in spring compared with 1886 tillers/m2 for a count >600 plants/m2. The ryegrass seedling emergence study was conducted at 25/15°C (day 0600–1800 hours)/(night 1800–0600 hours) for 14 days after sowing and then repeated at 20/10°C. The treatment combinations were 2 seed types (tetraploid or diploid) × 4 sowing depths (0, 1, 3, or 6 cm) × 3 mulch heights (1, 5, or 10 cm above surface). The main effects, seed type, sowing depth, and mulch height had significant (P < 0.05) effects on seedling emergence, irrespective of temperature, and all interactions were significant (P < 0.05) with the exception of the seed type × mulch height interaction. 
At the higher temperature the proportion of emerged seedlings declined from 0.52 to 0.16 with increasing mulch cover, from 0.43 to 0.29 with increasing sowing depth, and was higher for tetraploid than for diploid cultivars (0.44 v. 0.26, respectively). At the lower temperature the proportion of emerged seedlings declined from 0.85 to 0.20 with increasing mulch cover, from 0.62 to 0.39 with increasing sowing depth, and was higher for tetraploid than for diploid cultivars (0.63 v. 0.52, respectively). The ryegrass seedling study used treatment combinations of 3 temperature regimes (25/15°C, 20/10°C, or 15/5°C) × 2 seed types (tetraploid or diploid) × 5 harvest times (3, 4, 5, 6, and 8 weeks after sowing). At 8 weeks after sowing tetraploid top DM was significantly (P < 0.05) higher than diploid top DM at low (4100 v. 3040 mg/plant) and medium (5370 v. 2600 mg/plant), but not high (2460 v. 2780 mg/plant) temperatures. Tetraploid tiller and leaf numbers were substantially reduced by high temperature but not for diploid cultivars at 8 weeks. Tetraploid root DM at 8 weeks was highest (2360 mg/plant) and lowest (1200 mg/plant) at medium and low temperatures, respectively, while diploid root DM (mean = 1440 mg/plant) was not affected by temperature. Top growth was most rapid at 6–8 weeks (700–3392 mg/plant) and even more so for root growth (260–1617 mg/plant). These results indicate that when oversowing, ryegrass establishment will be most successful if the ryegrass seed is not sown below 3 cm but, more importantly, if the tropical grass residue is restricted to a 5 cm height. Further, sowing a tetraploid cultivar may be preferable to a diploid cultivar, with its superior emergence and seedling growth over a range of temperatures and sowing conditions.
APA, Harvard, Vancouver, ISO, and other styles
41

Hulugalle, N. R., T. B. Weaver, L. A. Finlay, and V. Heimoana. "Soil organic carbon concentrations and storage in irrigated cotton cropping systems sown on permanent beds in a Vertosol with restricted subsoil drainage." Crop and Pasture Science 64, no. 8 (2013): 799. http://dx.doi.org/10.1071/cp12374.

Full text
Abstract:
Long-term studies of soil organic carbon dynamics in two- and three-crop rotations in irrigated cotton (Gossypium hirsutum L.) based cropping systems under varying stubble management practices in Australian Vertosols are relatively few. Our objective was to quantify soil organic carbon dynamics during a 9-year period in four irrigated, cotton-based cropping systems sown on permanent beds in a Vertosol with restricted subsoil drainage near Narrabri in north-western New South Wales, Australia. The experimental treatments were: cotton–cotton (CC); cotton–vetch (Vicia villosa Roth. in 2002–06, Vicia benghalensis L. in 2007–11) (CV); cotton–wheat (Triticum aestivum L.), where wheat stubble was incorporated (CW); and cotton–wheat–vetch, where wheat stubble was retained as in-situ mulch (CWV). Vetch was terminated during or just before flowering by a combination of mowing and contact herbicides, and the residues were retained as in situ mulch. Estimates of carbon sequestered by above- and below-ground biomass inputs were in the order CWV >> CW = CV > CC. Carbon concentrations in the 0–1.2 m depth and carbon storage in the 0–0.3 and 0–1.2 m depths were similar among all cropping systems. Net carbon sequestration rates did not differ among cropping systems and did not change significantly with time in the 0–0.3 m depth, but net losses occurred in the 0–1.2 m depth. The discrepancy between measured and estimated values of sequestered carbon suggests that either the value of 5% used to estimate carbon sequestration from biomass inputs was an overestimate for this site, or post-sequestration losses may have been high. The latter has not been investigated in Australian Vertosols. Future research efforts should identify the cause and quantify the magnitude of these losses of organic carbon from soil.
APA, Harvard, Vancouver, ISO, and other styles
42

Rogers, M. E., A. R. Lawson, and K. B. Kelly. "Summer production and survival of perennial ryegrass (Lolium perenne) and tall fescue (Festuca arundinacea) genotypes in northern Victoria under differing irrigation management." Crop and Pasture Science 70, no. 12 (2019): 1163. http://dx.doi.org/10.1071/cp18542.

Full text
Abstract:
Perennial ryegrass (Lolium perenne L.) is the predominant perennial forage species used in temperate irrigated dairy-production systems in Australia. However, when temperatures are high, even with optimal irrigation strategies and nutrient inputs, dry matter (DM) production can be compromised. This research investigated the effects of perennial ryegrass and tall fescue genotypes and summer irrigation on DM production and survival. Ten perennial ryegrass cultivars, three hybrid ryegrasses and two cultivars of tall fescue (Festuca arundinacea (Schreb) Darbysh.) were sown in northern Victoria, Australia, in May 2014, and were managed under full irrigation or restricted irrigation (no irrigation between late December and mid-March) over a 3-year period. Measurements included net pasture accumulation (DM production), sward density (plant frequency) and water-soluble carbohydrate concentration. Apart from the expected differences in DM yield over the summer period between full irrigation and restricted irrigation, there were few differences in DM production among perennial ryegrass or tall fescue cultivars. Plant frequency declined significantly under restricted irrigation in Years 2 and 3 compared with full irrigation but there were no differences among perennial ryegrass cultivars. In Year 2, plant frequency was higher in the tall fescue cultivars than the ryegrass cultivars. The recovery pattern in DM production following recommencement of irrigation in mid-March (autumn) varied across years. In Year 1, plants recovered rapidly once irrigation recommenced in autumn. However, in Years 2 and 3, autumn and winter pasture accumulation under restricted irrigation was 30–35% less than under full irrigation. These differences were possibly related to decreases in plant frequency, as well as to differences in the amounts of residual pasture mass (or carbohydrate reserves) present when growth ceased.
Analyses of the water-soluble carbohydrate concentrations in the pseudostem during summer and autumn in Year 3 showed differences in total water-soluble carbohydrate and in fructan and sucrose concentrations between irrigation treatments but no consistent differences among genotypes.
APA, Harvard, Vancouver, ISO, and other styles
43

Mason, MG. "Effect of management of previous cereal stubble on nitrogen fertiliser requirement of wheat." Australian Journal of Experimental Agriculture 32, no. 3 (1992): 355. http://dx.doi.org/10.1071/ea9920355.

Full text
Abstract:
The effect of either burning stubble, or incorporating it in the soil, on the nitrogen (N) fertiliser requirement of the following wheat crop was examined over 10 years (1978-87) in a continuous wheat system at 2 sites (Wongan Hills and Nabawa), and in both continuous wheat and wheat-fallow systems at one site (Merredin). There were significant grain yield increases in response to N fertiliser in all years at Nabawa. At Wongan Hills there was no response in 1978 and 1985, a yield reduction in 1979, and a yield increase in all other years. At Merredin, there was no response in 1980, a yield decrease in 1984 and 1985, and an increase in all other years. In some years grain yield responses were small at Wongan Hills and Merredin. The only significant overall effects of stubble treatment were at Nabawa in 1978 (P<0.01) and 1985 (P<0.05). The interaction between stubble treatment and N rate was significant at Wongan Hills in 1980 and 1981 (P<0.05), and at Merredin in 1981 (P<0.001), 1983, and 1985 (both P<0.05). Response to N fertiliser was higher where the stubble was incorporated than where it was burnt. There was also a tendency for higher optimum economic rates of N fertiliser with stubble incorporated rather than burnt, but differences were not large. At Merredin, the overall yield increase with fallow was significant (P<0.001) in 1979 and 1983. The fallow x N fertiliser rate interaction was significant in all comparison years except 1987. Responses to N were greater in the non-fallow treatments. Soil organic carbon (C) levels were higher with stubble incorporation than where the stubble was burnt, and fallowing resulted in lower organic C. There was a downward trend with time, especially when fallowing was carried out. Effects on total N levels in the soil were similar to those for organic C but were less marked. 
The study indicates that at a level of stubble residues of 1-3 t/ha with continuous wheat in this winter rainfall environment in Western Australia, stubble treatment is unlikely to be a major factor in determining the rate of N fertiliser required for a wheat crop.
APA, Harvard, Vancouver, ISO, and other styles
44

Rab, M. A., P. D. Fisher, R. D. Armstrong, M. Abuzar, N. J. Robinson, and S. Chandra. "Advances in precision agriculture in south-eastern Australia. IV. Spatial variability in plant-available water capacity of soil and its relationship with yield in site-specific management zones." Crop and Pasture Science 60, no. 9 (2009): 885. http://dx.doi.org/10.1071/cp08350.

Full text
Abstract:
Spatial variability in grain yield can arise from variation in many different soil and terrain properties. Identification of important sources of variation that bear a significant relationship with grain yield can help achieve more effective site-specific management. This study had three aims: (i) a geostatistical description/modelling of the paddock-level spatial structure in variability of plant-available water capacity (PAWC) and related soil properties, (ii) to determine the optimal number of management zones in the paddock, and (iii) to assess if the variability in PAWC and related soil properties is significantly associated with the variability in grain yield across the management zones. Particle size distribution, bulk density (BD), field capacity (FC), permanent wilting point (PWP), and soil water content (SWC) at sowing were measured at 4 soil depths (to 0.60 m) at 50 representative spatial sampling locations across a paddock near Birchip (Victoria). PAWC and plant-available water at sowing (PAWs) were derived from these data. Moderate to strong spatial dependence across the paddock was observed. The magnitude of the structural variation and of range varied widely across different soil properties and depths. The south-east edge and the central areas of the paddock had higher clay content, FC, PWP, PAWC, and lower PAWs. The paddock was divided into 6 potential management zones using combined header yield and normalised difference vegetation index (NDVI). The adequacy of zoning was evaluated using relative variability (RV) of header yield and soil properties. The mean RV for 3 zones differed little from that of 6 management zones for header yield and for each measured soil property, indicating division of the paddock into 3 zones to be adequate. The results from residual maximum likelihood (ReML) analysis showed that low yield zones had significantly higher clay content, FC, PWP, SWC, and PAWC and significantly lower PAWs than both medium and high yield zones.
The mean FC, PWP, and PAWC in the low yield zones were, respectively, 25%, 26%, and 28% higher, and PAWs 36% lower than their corresponding values in the high yield zones. Linear regression analysis indicated that 59–96% of the observed variation in grain yield across management zones could be explained by variation in PWP. The practical implications of these results are discussed.
APA, Harvard, Vancouver, ISO, and other styles
45

McCormick, Jeffrey I., Jim M. Virgona, and John A. Kirkegaard. "Growth, recovery, and yield of dual-purpose canola (Brassica napus) in the medium-rainfall zone of south-eastern Australia." Crop and Pasture Science 63, no. 7 (2012): 635. http://dx.doi.org/10.1071/cp12078.

Full text
Abstract:
The effect of grazing of vegetative canola (Brassica napus) with sheep on crop growth and yield was investigated in two field experiments (Expts 1 and 2) in 2008 at Wagga Wagga, New South Wales, Australia. The experiments included a range of cultivars, sowing rates, and grazing periods to investigate the influence of these factors on grazing biomass, crop recovery, and grain yield. Three spring canola cultivars (representing triazine-tolerant, conventional, and hybrid types) were used in both experiments and were sown at three sowing rates and grazed by sheep for 7 days in midwinter in Expt 1, while two different grazing periods were compared in Expt 2. Supplementary irrigation was applied to Expt 1 to approximate average growing season conditions, while Expt 2 received no irrigation. Increased sowing rate produced greater early shoot biomass for grazing, but the triazine-tolerant cultivar produced less biomass than the conventional or hybrid cultivars in both experiments. Grazing reduced dry matter and leaf area by >50%, delayed flowering by 4 days on average, and reduced biomass at flowering by 22–52%. However, there was no impact of cultivar or sowing rate on the recovery of biomass and leaf area after grazing. Grazing had no effect on final grain yield under supplementary irrigation in Expt 1, but reduced grain yield under the drier regrowth conditions in Expt 2. The results demonstrate that grazing canola is feasible under average seasonal conditions in a medium-rainfall environment (400–600 mm) without yield penalty, provided the timing and intensity of grazing are matched to available biomass and anticipated seasonal water supply to support grain production. More broadly, we suggest that grain yield reductions from grazing could be avoided if suitable conditions for regrowth (residual dry matter, length of regrowth period, and adequate moisture) can generate biomass levels in excess of a target value of ~5000 kg ha–1 at flowering.
This target value represents a biomass level where >90% of photosynthetically active radiation was intercepted in our study, and in other studies represents a biomass level above which there is little further increase in potential yield. Such a target provides a basis for more objective grazing management but awaits further confirmation with experimentation and modelling.
APA, Harvard, Vancouver, ISO, and other styles
46

Sá Antunes, Tathiana F., Marlonni Maurastoni, L. Johana Madroñero, Gabriela Fuentes, Jorge M. Santamaría, José Aires Ventura, Emanuel F. Abreu, A. Alberto R. Fernandes, and Patricia M. B. Fernandes. "Battle of Three: The Curious Case of Papaya Sticky Disease." Plant Disease 104, no. 11 (November 2020): 2754–63. http://dx.doi.org/10.1094/pdis-12-19-2622-fe.

Full text
Abstract:
Among the most serious problems in papaya production are the viruses associated with papaya ringspot and papaya sticky disease (PSD). PSD concerns producers worldwide because its symptoms are extremely aggressive and appear only after flowering. As no resistant cultivar is available, several disease management strategies have been used in affected countries, such as the use of healthy seeds, exclusion of the pathogen, and roguing. In the 1990s, a dsRNA virus, papaya meleira virus (PMeV), was identified in Brazil as the causal agent of PSD. However, in 2016 a second virus, papaya meleira virus 2 (PMeV2), with an ssRNA genome, was also identified in PSD plants. Only PMeV is detected in asymptomatic plants, whereas all symptomatic plants contain both viral RNAs separately packaged in particles formed by the PMeV capsid protein. PSD also affects papaya plants in Mexico, Ecuador, and Australia. PMeV2-like viruses have been identified in the affected plants, but the partner virus(es) in these countries are still unknown. In Brazil, PMeV and PMeV2 reside in laticifers that promote spontaneous latex exudation, resulting in the affected papaya fruit’s sticky appearance. Genes modulated in plants affected by PSD include those involved in reactive oxygen species and salicylic acid signaling, proteasomal degradation, and photosynthesis, which are key plant defenses against PMeV complex infection. However, the complete activation of the defense response is impaired by the expression of negative effectors modulated by the virus. This review presents a summary of the current knowledge of the Carica papaya-PMeV complex interaction and management strategies.
APA, Harvard, Vancouver, ISO, and other styles
47

Chadha, Aakansha, Singarayer Florentine, Bhagirath S. Chauhan, Benjamin Long, Mithila Jayasundera, Muhammad M. Javaid, and Christopher Turville. "Environmental factors affecting the germination and seedling emergence of two populations of an emerging agricultural weed: wild lettuce (Lactuca serriola)." Crop and Pasture Science 70, no. 8 (2019): 709. http://dx.doi.org/10.1071/cp18594.

Full text
Abstract:
Wild lettuce (Lactuca serriola L.) is a significant emerging agricultural and environmental weed in many countries. This invasive species is now naturalised in Australia and is claimed to cause significant losses within the agricultural industry. Sustainable management of wild lettuce has been hampered by a lack of detailed knowledge of its seed ecology. Laboratory-based studies were performed to examine the potential influence of environmental factors including temperature and light conditions, salinity, pH, moisture availability and burial depth on the germination and emergence of two spatially distant populations of wild lettuce. Results suggested that the germination of wild lettuce seeds occurred across a broad range of temperature conditions (12-h cycle: 30°C/20°C, 25°C/15°C and 17°C/7°C) for both populations. We also found that these seeds are non-photoblastic; germination was not affected by darkness, with >80% germination in darkness for both populations at all tested temperature ranges. Germination significantly declined as salinity and osmotic stress increased for both populations, with seeds from the Tempy population more affected by NaCl >100 mM than seeds from Werribee, but in neither population was there any observed effect of pH on germination (>80% germination in both populations at all tested pH ranges). For both populations, germination significantly decreased as burial depth increased; however, the two populations differed in their response to the burial depth treatment, whereby seeds from the Tempy population had higher emergence than those from Werribee at 0.5 cm burial depth. These results suggest that light-reducing management techniques such as mulching or use of crop residues will be unsuccessful for preventing germination of wild lettuce. By contrast, burial of seeds at a depth of at least 4 cm will significantly reduce their emergence.
APA, Harvard, Vancouver, ISO, and other styles
48

Khangura, R. K., and M. J. Barbetti. "Prevalence of blackleg (Leptosphaeria maculans) on canola (Brassica napus) in Western Australia." Australian Journal of Experimental Agriculture 41, no. 1 (2001): 71. http://dx.doi.org/10.1071/ea00068.

Full text
Abstract:
Canola crops were monitored throughout the Western Australian wheatbelt during 1996–99 to determine the incidence and severity of crown cankers caused by the blackleg fungus (Leptosphaeria maculans). All crops surveyed had blackleg. The incidence of crown canker was 48–100%, 15–100%, 9–94% and 48–100% during 1996, 1997, 1998 and 1999, respectively. The mean incidence of crown cankers statewide was 85, 63, 55 and 85% in 1996, 1997, 1998 and 1999, respectively. The severity of crown canker (expressed as percentage disease index) ranged between 30 and 96%, 3 and 94%, 5 and 78% and 21 and 96% during 1996, 1997, 1998 and 1999, respectively. These high levels of blackleg can possibly be attributed to the accumulation of large amounts of infested canola residues. In 1999, there were effects of variety, application of the fungicide Impact, distance to last year’s canola residues and rainfall on the incidence and severity of blackleg. However, there were no effects of sowing date or region on the disease incidence or severity once the other factor effects listed above had been considered. In 1995, an additional survey of 19 sites in the central wheatbelt of Western Australia assessed the survival of the blackleg fungus on residues from crops grown in 1992–94. The residues at all sites carried blackleg. However, the extent of infection at any particular site varied from 12 to 100% of stems with the percentage of stems carrying pseudothecia containing ascospores varying between 7 and 96%. The high levels of blackleg disease found in commercial crops are indicative of significant losses in seed yields, making it imperative that management of blackleg be improved if canola is to remain a viable long-term cropping option in Western Australia.
APA, Harvard, Vancouver, ISO, and other styles
49

Jiang, Y. M., Y. Wang, L. Song, H. Liu, A. Lichter, O. Kerdchoechuen, D. C. Joyce, and J. Shi. "Postharvest characteristics and handling of litchi fruit — an overview." Australian Journal of Experimental Agriculture 46, no. 12 (2006): 1541. http://dx.doi.org/10.1071/ea05108.

Full text
Abstract:
Litchi (Litchi chinensis Sonn.) is a tropical to subtropical crop that originated in South-East Asia. Litchi fruit are prized on the world market for their flavour, semi-translucent white aril and attractive red skin. Litchi is now grown commercially in many countries and production in Australia, China, Israel, South Africa and Thailand has expanded markedly in recent years. Increased production has made significant contributions to economic development in these countries, especially those in South-East Asia. Non-climacteric litchi fruit are harvested at their visual and organoleptic optimum. They are highly perishable and, consequently, have a short life that limits marketability and potential expansion of demand. Pericarp browning and pathological decay are common and important defects of harvested litchi fruit. Postharvest technologies have been developed to reduce these defects. These technologies involve cooling and heating the fruit, use of various packages and packaging materials and the application of fungicides and other chemicals. Through the use of fungicides and refrigeration, litchi fruit have a storage life of about 30 days. However, when they are removed from storage, their shelf life at ambient temperature is very short due to pericarp browning and fruit rotting. Low temperature acclimation or use of chitosan as a coating can extend the shelf life. Sulfur dioxide fumigation effectively reduces pericarp browning, but approval from Europe, Australia and Japan for this chemical is likely to be withdrawn due to concerns over sulfur residues in fumigated fruit. Thus, sulfur-free postharvest treatments that maintain fruit skin colour are increasingly important. Alternatives to SO2 fumigation for control of pericarp browning and fruit rotting are pre-storage pathogen management, anoxia treatment, and dipping in 2% hydrogen chloride solution for 6−8 min following storage at 0°C.
Insect disinfestation has become increasingly important for the expansion of export markets because of quarantine issues associated with some fruit fly species. Thus, effective disinfestation protocols need to be developed. Heat treatment has shown promise as a quarantine technology, but it injures pericarp tissue and results in skin browning. However, heat treatment can be combined with an acid dip treatment that inhibits browning. Therefore, the primary aim of postharvest litchi research remains the achievement of highly coloured fruit which is free of pests and disease. Future research should focus on disease control before harvest, combined acid and heat treatments after harvest and careful temperature management during storage and transport.
APA, Harvard, Vancouver, ISO, and other styles
50

Saral, Rohit, and Sarvjeet Kukreja. "Crop residue its impact and management." International Journal of Chemical Studies 8, no. 6 (November 1, 2020): 1550–54. http://dx.doi.org/10.22271/chemi.2020.v8.i6v.10985.

Full text
APA, Harvard, Vancouver, ISO, and other styles