Academic literature on the topic 'Accurate distribution data'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Accurate distribution data.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Journal articles on the topic "Accurate distribution data"

1

Wang, Zhao Hong. "Normal Distribution Data Generating Method Based on Cloud Model." Advanced Materials Research 171-172 (December 2010): 385–88. http://dx.doi.org/10.4028/www.scientific.net/amr.171-172.385.

Abstract:
The near-normal distribution is widely used in the natural and social sciences, and the need to establish an exact fuzzy membership function seriously reduces forecast accuracy for such data. The cloud model combines randomness and fuzziness organically and reveals the relevance between them through its numerical characteristics: expectation, entropy, and hyper-entropy. A forecast algorithm based on the normal cloud model relaxes the prerequisite of a normal distribution and replaces the exact membership function with the expectation function of the membership degree distribution, which is simpler than working with the joint distribution. Comparative experiments showed that the method is more general and can forecast the data accurately and directly.
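The abstract does not reproduce the generator itself; a common textbook form of the forward normal cloud generator, parameterized by expectation (Ex), entropy (En) and hyper-entropy (He), can be sketched as below. The function name and parameter values are illustrative, not taken from the paper:

```python
import random

def cloud_drops(ex, en, he, n, seed=0):
    """Forward normal cloud generator (textbook form): for each drop,
    draw a perturbed entropy En' ~ N(En, He), then the drop itself
    x ~ N(Ex, |En'|).  He > 0 adds the second-order randomness that
    distinguishes a cloud from a plain normal distribution."""
    rng = random.Random(seed)
    drops = []
    for _ in range(n):
        en_p = rng.gauss(en, he)          # hyper-entropy perturbation
        drops.append(rng.gauss(ex, abs(en_p)))
    return drops

drops = cloud_drops(ex=0.0, en=1.0, he=0.1, n=20000)
mean = sum(drops) / len(drops)            # should sit near Ex
```

With He = 0 this degenerates to ordinary normal sampling, which is exactly the "relaxed prerequisite" the abstract refers to.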
2

Koh, Sung-Shik, H. Hama, and T. T. Zin. "Accurate Estimation of Missing Data under Noise Distribution." IEEE Transactions on Consumer Electronics 52, no. 2 (May 2006): 528–35. http://dx.doi.org/10.1109/tce.2006.1649675.

3

Trombik, J., and T. Hlásny. "Free European data on forest distribution: overview and evaluation." Journal of Forest Science 59, no. 11 (November 29, 2013): 447–57. http://dx.doi.org/10.17221/58/2013-jfs.

Abstract:
A growing need for the evaluation of prospects and sustainability of forest resources calls for the availability of harmonized data on forest distribution. We described and evaluated nine datasets providing such information: Corine LandCover, four European forest maps and four tree species distribution maps. Apart from providing a condensed overview of these datasets, we focused on the match between selected forest maps and forest management plans (FMPs) of Slovakia, which can be thought of as highly accurate information on forest distribution. The degree of match between forest and species area, within 306 forest administrative districts of Slovakia, was used as an indicator of accuracy. In addition, the match between the total forest and species area in Slovakia, given by FMPs and by evaluated datasets, was addressed. We found a high degree of match for the datasets on forest distribution (R-square 0.77–0.93, depending on the dataset), as well as strong agreement in total forest area (± 5%). Both indicators are worse in the case of forest type evaluation (coniferous and broadleaved). Poor results were obtained for tree species maps, which under- or overestimated species areas by tens of per cent, although differences were highly variable among species. The obtained results are valid mainly for temperate forests.
4

Trumbo, D. R., A. A. Burgett, and J. H. Knouft. "Testing climate-based species distribution models with recent field surveys of pond-breeding amphibians in eastern Missouri." Canadian Journal of Zoology 89, no. 11 (November 2011): 1074–83. http://dx.doi.org/10.1139/z11-083.

Abstract:
Species distribution models (SDMs) have become an important tool for ecologists by providing the ability to predict the distributions of organisms based on species niche parameters and available habitat across broad geographic areas. However, investigation of the appropriate extent of environmental data needed to make accurate predictions has received limited attention. We investigate whether SDMs developed with regional climate and species locality data (i.e., within Missouri, USA) produce more accurate predictions of species occurrences than models developed with data from across an entire species range. To test the accuracy of the model predictions, field surveys were performed in 2007 and 2008 at 103 study ponds for eight amphibian study species. Models developed using data from across the entire species range did not accurately predict the occurrences of any study species. However, models developed using data only from Missouri produced accurate predictions for four study species, all of which are near the edge of their geographic ranges within the study area. These results suggest that species distribution modeling with regionally focused data may be preferable for local ecological and conservation purposes, and that climate factors may be more important for determining species distributions at the edge of their geographic ranges.
5

Liu, Yu, Xiaoping Wang, and Jiaxin Qian. "Crop distribution extraction based on Sentinel data." E3S Web of Conferences 252 (2021): 02081. http://dx.doi.org/10.1051/e3sconf/202125202081.

Abstract:
Remote sensing identification and classification of crops uses remote sensing to estimate crop planting area, which is key to the timely, accurate monitoring of crop growth and of plant diseases and insect pests, and a premise for estimating product output. Using Sentinel-1 and Sentinel-2 imagery and a random forest algorithm, the study combined traditional optical bands and vegetation indices with red-edge information and radar backscatter in feature selection and classification. The classes included winter wheat, summer corn, orchard, woodland, town, water and bare land. Three control groups were set up: the first contained radar time-series features, the second contained red-edge band features, and the third contained traditional vegetation-index time-series features, and the classification accuracy of each was analysed. The confusion-matrix results show that adding the red-edge bands, red-edge indices and radar scattering information effectively improves crop classification accuracy. Sentinel optical and radar satellites, with a time resolution of 5–6 days, have great potential for crop monitoring research.
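As a concrete illustration of the optical features involved, the standard NDVI and a red-edge variant (the extra information Sentinel-2's red-edge bands contribute) are simple per-pixel band ratios; the reflectance values below are illustrative, not from the study:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

def ndre(nir, red_edge):
    """Red-edge variant (NDRE): the same ratio, but with a Sentinel-2
    red-edge band in place of the red band."""
    return (nir - red_edge) / (nir + red_edge)

# illustrative reflectances for a healthy winter-wheat pixel
v = ndvi(nir=0.45, red=0.05)       # high NDVI: dense green vegetation
r = ndre(nir=0.45, red_edge=0.25)
```

Time series of such indices, stacked with the radar backscatter, form the feature vectors a classifier such as a random forest is trained on.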
6

Xia, Xintao, Zhen Chang, Lijun Zhang, and Xiaowei Yang. "Estimation on Reliability Models of Bearing Failure Data." Mathematical Problems in Engineering 2018 (2018): 1–21. http://dx.doi.org/10.1155/2018/6189527.

Abstract:
The failure data of bearing products is random and discrete and shows evident uncertainty. Is it accurate and reliable to use the Weibull distribution to represent the failure model of a product? The Weibull distribution, the log-normal distribution, and an improved maximum entropy probability distribution were compared and analyzed to find an optimal and precise reliability analysis model. By utilizing computer simulation technology and K–S hypothesis testing, the feasibility of the three models was verified, and the reliability of the different models obtained from practical bearing failure data was compared and analyzed. The research indicates that the two-parameter Weibull distribution reliability model does not apply to all situations; sometimes the two-parameter log-normal distribution model is more precise and feasible. Compared to the three-parameter log-normal distribution model, the three-parameter Weibull distribution shows better accuracy but still does not apply to all cases, while the proposed improved maximum entropy probability distribution fits not only all kinds of known distributions but also poor-information problems with unknown probability distribution, prior information, or trends, so it is currently an ideal reliability analysis model with least error.
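The K–S check the abstract relies on is easy to sketch without any statistics library for the two-parameter log-normal case: fit by log-moments, then take the maximum gap between the empirical and fitted CDFs. All names and the synthetic failure times are illustrative:

```python
import math
import random
import statistics

def lognormal_ks(data):
    """Fit a two-parameter log-normal by log-moments and return the
    Kolmogorov-Smirnov statistic D = max |F_emp - F_fit|; a small D
    means the fitted model tracks the empirical CDF closely."""
    logs = [math.log(x) for x in data]
    mu, sigma = statistics.fmean(logs), statistics.pstdev(logs)
    xs = sorted(data)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        # log-normal CDF via the normal CDF of log(x)
        f = 0.5 * (1.0 + math.erf((math.log(x) - mu) / (sigma * math.sqrt(2))))
        d = max(d, abs((i + 1) / n - f), abs(i / n - f))
    return d

rng = random.Random(42)
lifetimes = [math.exp(rng.gauss(5.0, 0.4)) for _ in range(400)]  # synthetic failure times
d = lognormal_ks(lifetimes)   # small for data that really is log-normal
```

Comparing D across candidate models (Weibull, log-normal, maximum entropy) is exactly the kind of goodness-of-fit ranking the paper performs; note that when parameters are estimated from the same data, the plain K–S critical values are optimistic.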
7

Lin, Zhidi, Dongliang Duan, Qi Yang, Xuemin Hong, Xiang Cheng, Liuqing Yang, and Shuguang Cui. "Data-Driven Fault Localization in Distribution Systems with Distributed Energy Resources." Energies 13, no. 1 (January 6, 2020): 275. http://dx.doi.org/10.3390/en13010275.

Abstract:
The integration of Distributed Energy Resources (DERs) introduces a non-conventional two-way power flow which cannot be captured well by traditional model-based techniques. This brings an unprecedented challenge in terms of the accurate localization of faults and proper actions of the protection system. In this paper, we propose a data-driven fault localization strategy based on multi-level system regionalization and the quantification of fault detection results in all subsystems/subregions. This strategy relies on the tree segmentation criterion to divide the entire system under study into several subregions, and then combines Support Vector Data Description (SVDD) and Kernel Density Estimation (KDE) to find the confidence level of fault detection in each subregion in terms of their corresponding p-values. By comparing the p-values, one can accurately localize the faults. Experiments demonstrate that the proposed data-driven fault localization can greatly improve the accuracy of fault localization for distribution systems with high DER penetration.
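The p-value comparison at the heart of the strategy can be illustrated with a simple empirical stand-in. The paper itself scores subregions with SVDD and smooths the score distribution with KDE; this sketch skips the KDE and uses the raw empirical distribution of scores from normal operation, and all names and numbers are illustrative:

```python
def empirical_p_value(normal_scores, s):
    """One-sided empirical p-value of a new anomaly score s against
    scores observed under normal operation: the (add-one smoothed)
    fraction of reference scores at least as extreme as s."""
    n = len(normal_scores)
    return (1 + sum(t >= s for t in normal_scores)) / (n + 1)

# illustrative per-subregion detector scores under normal operation
reference = [0.1, 0.2, 0.15, 0.3, 0.25, 0.12, 0.18, 0.22, 0.28, 0.11]

p_quiet = empirical_p_value(reference, 0.2)   # unremarkable score
p_fault = empirical_p_value(reference, 0.9)   # far outside normal range
```

The subregion whose current score yields the smallest p-value is flagged as the faulted one, which is the comparison the paper's multi-level regionalization performs.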
8

Gao, Nannan, Fen Li, Hui Zeng, Daniël van Bilsen, and Martin De Jong. "Can More Accurate Night-Time Remote Sensing Data Simulate a More Detailed Population Distribution?" Sustainability 11, no. 16 (August 19, 2019): 4488. http://dx.doi.org/10.3390/su11164488.

Abstract:
Aging, shrinking cities, urban agglomerations and other new key terms continue to emerge when describing the large-scale population changes in various cities in mainland China. It is important to simulate the distribution of residential populations at a coarse scale to manage cities as a whole, and at a fine scale for policy making in infrastructure development. This paper analyzes the relationship between the DN (Digital number, value assigned to a pixel in a digital image) value of NPP-VIIRS (the Suomi National Polar-orbiting Partnership satellite’s Visible Infrared Imaging Radiometer Suite) and LuoJia1-01 and the residential populations of urban areas at a district, sub-district, community and court level, to compare the influence of resolution of remote sensing data by taking urban land use to map out auxiliary data in which first-class (R1), second-class (R2) and third-class residential areas (R3) are distinguished by house price. The results show that LuoJia1-01 more accurately analyzes population distributions at a court level for second- and third-class residential areas, which account for over 85% of the total population. The accuracy of the LuoJia1-01 simulation data is higher than that of Landscan and GHS (European Commission Global Human Settlement) population. This can be used as an important tool for refining the simulation of residential population distributions. In the future, higher-resolution night-time light data could be used for research on accurate simulation analysis that scales down large-scale populations.
9

Yoon, Seong-Sim, Anh Tran Phuong, and Deg-Hyo Bae. "Quantitative Comparison of the Spatial Distribution of Radar and Gauge Rainfall Data." Journal of Hydrometeorology 13, no. 6 (December 1, 2012): 1939–53. http://dx.doi.org/10.1175/jhm-d-11-066.1.

Abstract:
The common statement that a rain gauge network usually provides better observation at specific points while weather radar provides more accurate observation of the spatial distribution of rain field over a large area has never been subjected to quantitative evaluation. The aim of this paper is to evaluate the statement by using some statistical criteria. The Monte Carlo simulation experiment, inverse distance weighting (IDW) interpolation method, and cross-validation technique are used to investigate the relation between the accuracy of the interpolated rainfall and the rain gauge density. The radar reflectivity–rainfall intensity (Z–R) relationship is constructed by the least squares fitting method from observation data of radar and rain gauges. The variation in this relationship and the accuracy of the radar rainfall with rain gauge density are evaluated by using the Monte Carlo simulation experiment. Three storm events are selected as the case studies. The obtained results show that the accuracy of interpolated and radar rainfall increases nonlinearly with increasing gauge density. The higher correlation coefficient (γ) value of radar-rainfall estimation, compared to gauge interpolation, especially in the convective storm, proves that radar observation provides a more accurate spatial structure of the rain field than gauge observation does.
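The least-squares Z–R fit mentioned here is a power law Z = aR^b, which becomes an ordinary linear fit in log space. A minimal sketch, using the classic Marshall–Palmer values a = 200, b = 1.6 purely as synthetic test data:

```python
import math

def fit_zr(rain_rates, reflectivities):
    """Least-squares fit of Z = a * R**b carried out in log space,
    where log Z = log a + b * log R is a straight line."""
    lx = [math.log(r) for r in rain_rates]
    ly = [math.log(z) for z in reflectivities]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    b = sum((x - mx) * (y - my) for x, y in zip(lx, ly)) / \
        sum((x - mx) ** 2 for x in lx)
    a = math.exp(my - b * mx)
    return a, b

# noise-free synthetic data generated from the Marshall-Palmer relation
R = [0.5, 1.0, 2.0, 5.0, 10.0, 20.0, 50.0]   # rain rate, mm/h
Z = [200.0 * r ** 1.6 for r in R]            # reflectivity, mm^6/m^3
a, b = fit_zr(R, Z)                          # recovers a = 200, b = 1.6
```

On real gauge–radar pairs the recovered (a, b) varies with gauge density and storm type, which is the variation the paper quantifies with its Monte Carlo experiment.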
10

Piltz, Ross O. "Accurate data processing for neutron Laue diffractometers." Journal of Applied Crystallography 51, no. 3 (May 25, 2018): 635–45. http://dx.doi.org/10.1107/s1600576718005058.

Abstract:
The factors affecting the accuracy of structural refinements from image-plate neutron Laue diffractometers are analysed. From this analysis, an improved data-processing method is developed which optimizes the intensity corrections for exposure scaling, wavelength distribution, absorption and extinction corrections, and the wavelength/spatial/time dependence of the image-plate detector efficiencies. Of equal importance is an analysis of the sources of uncertainty in the final corrected intensities, without which bias of the merged intensities occurs, due to the dominance of measurements with small statistical errors though potentially large systematic errors. A new aspect of the impact of detector crosstalk on the counting statistics of area detectors is reported and shown to be significant for the case of neutron Laue diffraction. These methods have been implemented in software which processes data from the KOALA instrument at ANSTO and the now decommissioned VIVALDI instrument at ILL (Grenoble, France). A comparison with earlier data-analysis methods shows a significant improvement in accuracy of the refined structures.

Dissertations / Theses on the topic "Accurate distribution data"

1

Mugodo, James. "Plant species rarity and data restriction influence the prediction success of species distribution models." University of Canberra. Resource, Environmental & Heritage Sciences, 2002. http://erl.canberra.edu.au./public/adt-AUC20050530.112801.

Abstract:
There is a growing need for accurate distribution data for both common and rare plant species for conservation planning and ecological research purposes. A database of more than 500 observations for nine tree species with different ecological and geographical distributions and a range of frequencies of occurrence in south-eastern New South Wales (Australia) was used to compare the predictive performance of logistic regression models, generalised additive models (GAMs) and classification tree models (CTMs) using different data restriction regimes and several model-building strategies. Environmental variables (mean annual rainfall, mean summer rainfall, mean winter rainfall, mean annual temperature, mean maximum summer temperature, mean minimum winter temperature, mean daily radiation, mean daily summer radiation, mean daily June radiation, lithology and topography) were used to model the distribution of each of the plant species in the study area. Model predictive performance was measured as the area under the curve of a receiver operating characteristic (ROC) plot. The initial predictive performance of logistic regression models and generalised additive models (GAMs) using unrestricted, temperature restricted, major gradient restricted and climatic domain restricted data gave results that were contrary to current practice in species distribution modelling. Although climatic domain restriction has been used in other studies, it was found to produce models that had the lowest predictive performance. The performance of domain restricted models was significantly (p = 0.007) inferior to the performance of major gradient restricted models when the predictions of the models were confined to the climatic domain of the species. Furthermore, the effect of data restriction on model predictive performance was found to depend on the species as shown by a significant interaction between species and data restriction treatment (p = 0.013). 
As found in other studies, however, the predictive performance of GAM was significantly (p = 0.003) better than that of logistic regression. The superiority of GAM over logistic regression was unaffected by different data restriction regimes and was not significantly different within species. The logistic regression models used in the initial performance comparisons were based on models developed using the forward selection procedure in a rigorous-fitting model-building framework that was designed to produce parsimonious models. The rigorous-fitting model-building framework involved testing for the significant reduction in model deviance (p = 0.05) and significance of the parameter estimates (p = 0.05). The size of the parameter estimates and their standard errors were inspected because large estimates and/or standard errors are an indication of model degradation from overfitting or effects such as multicollinearity. For additional variables to be included in a model, they had to contribute significantly (p = 0.025) to the model predictive performance. In an attempt to improve the performance of species distribution models using logistic regression models in a rigorous-fitting model-building framework, the backward elimination procedure was employed for model selection, but it yielded models with reduced performance. A liberal-fitting model-building framework that used significant model deviance reduction at the p = 0.05 (low significance models) and p = 0.00001 (high significance models) levels as the major criterion for variable selection was employed for the development of logistic regression models using the forward selection and backward elimination procedures. Liberal fitting yielded models that had a significantly greater predictive performance than the rigorous-fitting logistic regression models (p = 0.0006). The predictive performance of the former models was comparable to that of GAM and classification tree models (CTMs).
The low significance liberal-fitting models had a much larger number of variables than the high significance liberal-fitting models, but with no significant increase in predictive performance. To develop liberal-fitting CTMs, the tree shrinking program in S-PLUS was used to produce a number of trees of different sizes (subtrees) by optimally reducing the size of a full CTM for a given species. The 10-fold cross-validated model deviance for the subtrees was plotted against the size of the subtree as a means of selecting an appropriate tree size. In contrast to liberal-fitting logistic regression, liberal-fitting CTMs had poor predictive performance. Species geographical range and species prevalence within the study area were used to categorise the tree species into different distributional forms. These were then used to compare the effect of plant species rarity on the predictive performance of logistic regression models, GAMs and CTMs. The distributional forms included restricted and rare (RR) species (Eucalyptus paliformis and Eucalyptus kybeanensis), restricted and common (RC) species (Eucalyptus delegatensis, Eucryphia moorei and Eucalyptus fraxinoides), widespread and rare (WR) species (Eucalyptus data) and widespread and common (WC) species (Eucalyptus sieberi, Eucalyptus pauciflora and Eucalyptus fastigata). There were significant differences (p = 0.076) in predictive performance among the distributional forms for the logistic regression and GAM. The predictive performance for the WR distributional form was significantly lower than the performance for the other plant species distributional forms. The predictive performance for the RC and RR distributional forms was significantly greater than the performance for the WC distributional form. The trend in model predictive performance among plant species distributional forms was similar for CTMs, except that the CTMs had poor predictive performance for the RR distributional form.
This study shows the importance of data restriction to model predictive performance with major gradient data restriction being recommended for consistently high performance. Given the appropriate model selection strategy, logistic regression, GAM and CTM have similar predictive performance. Logistic regression requires a high significance liberal-fitting strategy to both maximise its predictive performance and to select a relatively small model that could be useful for framing future ecological hypotheses about the distribution of individual plant species. The results for the modelling of plant species for conservation purposes were encouraging since logistic regression and GAM performed well for the restricted and rare species, which are usually of greater conservation concern.
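Predictive performance throughout the thesis is the area under the ROC curve. A minimal, library-free way to compute it is the rank-sum (Mann–Whitney) identity; the scores and labels below are illustrative:

```python
def roc_auc(scores, labels):
    """AUC via the Mann-Whitney identity: the probability that a
    randomly chosen presence (label 1) outscores a randomly chosen
    absence (label 0), counting ties as one half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

perfect = roc_auc([0.9, 0.8, 0.4, 0.3], [1, 1, 0, 0])   # perfect ranking
chance  = roc_auc([0.9, 0.8, 0.4, 0.3], [0, 1, 1, 0])   # no better than chance
```

An AUC of 1.0 means the model ranks every occupied site above every unoccupied one; 0.5 is chance, which is the scale on which the logistic regression, GAM and CTM comparisons above are made.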
2

Grim, Evan T. "ACHIEVING HIGH-ACCURACY TIME DISTRIBUTION IN NETWORK-CENTRIC DATA ACQUISITION AND TELEMETRY SYSTEMS WITH IEEE 1588." International Foundation for Telemetering, 2006. http://hdl.handle.net/10150/604418.

Abstract:
ITC/USA 2006 Conference Proceedings / The Forty-Second Annual International Telemetering Conference and Technical Exhibition / October 23-26, 2006 / Town and Country Resort & Convention Center, San Diego, California
Network-centric data acquisition and telemetry systems continue to gain momentum and adoption. However, inherent non-deterministic network delays hinder these systems’ suitability for use where high-accuracy timing information is required. The emerging IEEE 1588 standard for time distribution offers the potential for real-time data acquisition system development using cost-effective, standards-based network technologies such as Ethernet and IP multicast. This paper discusses the challenges, realities, lessons, and triumphs experienced using IEEE 1588 in the development and implementation of such a large-scale network-centric data acquisition and telemetry system. IEEE 1588 clears a major hurdle in moving the network-centric buzz from theory to realization.
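IEEE 1588 reaches high accuracy with a four-timestamp Sync/Delay_Req exchange; under the protocol's symmetric-path assumption, the offset and one-way delay fall out of simple arithmetic. The timestamp values below are illustrative:

```python
def ptp_offset_delay(t1, t2, t3, t4):
    """IEEE 1588 delay request-response arithmetic (symmetric path assumed):
    t1 master sends Sync, t2 slave receives it,
    t3 slave sends Delay_Req, t4 master receives it.
    offset = ((t2 - t1) - (t4 - t3)) / 2   slave clock minus master clock
    delay  = ((t2 - t1) + (t4 - t3)) / 2   one-way path delay
    """
    offset = ((t2 - t1) - (t4 - t3)) / 2.0
    delay = ((t2 - t1) + (t4 - t3)) / 2.0
    return offset, delay

# illustrative case: slave clock 500 us ahead, 100 us one-way delay
offset, delay = ptp_offset_delay(t1=0.0, t2=0.000600, t3=0.001000, t4=0.000600)
```

Asymmetric network paths violate the core assumption and turn directly into offset error, which is why switch and network design matter so much in large 1588 deployments.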
3

Persson, Erold. "Multicast Time Distribution." Thesis, Linköping University, Department of Electrical Engineering, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-2274.

Abstract:

The Swedish National Testing and Research Institute is maintaining the Swedish realization of the world time scale UTC, called UTC(SP). One area of research and development for The Swedish National Laboratory of Time and Frequency is time synchronization and how UTC(SP) can be distributed in Sweden. Dissemination of time information by SP is in Sweden mainly performed via Internet using the Network Time Protocol (NTP) as well as via a modem dial up service and a speaking clock (Fröken Ur). In addition to these services, time information from the Global Positioning System (GPS) and from the long-wave transmitter DCF77 in Germany, is also available in Sweden.

This master’s thesis considers how different available commercial communication systems could be used for multicast time distribution. DECT, Bluetooth, Mobile Telecommunication and Radio Broadcasting are different techniques that are investigated. One application of Radio Broadcasting, DARC, was found to be interesting for a more detailed study. A theoretical description of how DARC could be used for national time distribution is accomplished and a practical implementation of a test system is developed to evaluate the possibilities to use DARC for multicast time distribution.

The tests of DARC and the radio broadcast system showed that these could be interesting techniques to distribute time with an accuracy of a couple of milliseconds. This quality level is not obtained today but would be possible with some alterations of the system.

4

Briones, Maria. "Validating the Accuracy of Neatwork, a Rural Gravity Fed Water Distribution System Design Program, Using Field Data in the Comarca Ngöbe-Bugle, Panama." Scholar Commons, 2018. https://scholarcommons.usf.edu/etd/7268.

Abstract:
Despite the sustainable development goals to increase access to improved water there are still 884 million people in the world without access to an improved water source (WHO, 2017). One method to improve access to water in rural, mountainous areas, is through construction of gravity fed water distribution systems. These systems should be designed based upon fundamental principles of hydraulics. One method of doing so in a time efficient manner with minimal engineering knowledge is to utilize a downloadable computer program such as Neatwork, which aids in design of rural, gravity fed water distribution systems and has been used by volunteers in Peace Corps Panama for years. It was the goal of this research to validate the results of the Neatwork program by comparing the flow results produced in the simulation program with flow results measured at tap stands of a rural gravity fed water distribution system in the community of Alto Nube, Comarca Ngöbe Bugle, Panama. The author measured flow under default Neatwork conditions of 40% faucets open in the system (in the field an equivalent of 8 taps) to have an initial basis as to whether the Neatwork program and field conditions yielded corresponding flows. The second objective would be to vary the number of taps open if the default condition did not produce comparable results between the field and the simulation, to pinpoint if under a certain condition of open faucets in the system the two methods would agree. The author did this by measuring flow at varying combinations from 10-100% of the open taps in the system (2-20 taps). Lastly the author observed the flow differences in the Neatwork program against the field flows, when the elevation of water in the water reservoir is set to the Neatwork default, where elevation of water is the tank outlet (at the bottom of the tank) versus when the elevation is established at the overflow at the tank (at the top of the tank) for the case of two taps open. 
The author used paired t-tests to test for statistical difference between Neatwork and field produced flows. She found that for the default condition of 40% taps open and all other combinations executed between 30-80% taps open, the field and Neatwork flows did not produce statistically similar results and, in fact, had the tendency to overestimate flows. The author also found that the change in water elevation in the storage tank from outlet to overflow increased the flow at the two taps measured by 0.140 l/s and 0.145 l/s and in this case, did not change whether the flows at these taps were within desired range (0.1 -0.3 l/s). Changing the elevation of the water level in the tank in the Neatwork program to correspond to a “full” tank condition is not recommended, as assuming an empty tank will account for seasonal changes or other imperfections in topographical surveying that could reduce available head at each tap. The author also found that the orifice coefficients, θ, of 0.62 and 0.68, did not demonstrate more or less accurate results that coincided with field measurements, but rather showed the tendency of particular faucets to prefer one coefficient over the other, regardless of combination of other taps open in the system. This study demonstrates a consistent overestimation in flow using the computer program Neatwork. Further analysis on comparisons made show that between field and flow results across each individual faucet, variations between Neatwork and the field were a result of variables dependent upon the tap, such as flow reducers or errors in surveying. Flow reducers are installed before taps to distribute flow equally amongst homes over varying distances and elevations and are fabricated using different diameter orifices depending on the location of the tap. 
While Neatwork allows the user to simulate the effect of these flow reducers on tap flow, it may not account for the imperfect orifices made by the simple methods used in the field to make such flow reducers. The author recommends further investigation to be done on the results of field flow versus Neatwork simulated flow using other methods of flow reducer fabrication which produce varying degrees of accuracy in orifice sizing. The author also recommends executing these field measurements over a greater sample size of faucets and more randomized combination of open/closed taps to verify the results of this research. More work should be done to come up with a practical solution for poor and rural communities to fabricate and/or obtain more precisely sized flow reducers. A full sensitivity analysis of the input variables into the Neatwork program should be performed to understand the sensitivity of varying each input.
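The flow reducers discussed above are small orifices, so a plain-orifice model gives a feel for the numbers. The 4 mm diameter and 30 m head below are illustrative, while 0.62 and 0.68 are the two discharge coefficients (θ) the thesis compares:

```python
import math

def orifice_flow(theta, diameter_m, head_m, g=9.81):
    """Plain-orifice model: Q = theta * A * sqrt(2 g h), where theta is
    the discharge coefficient and A the orifice cross-section."""
    area = math.pi * (diameter_m / 2.0) ** 2
    return theta * area * math.sqrt(2.0 * g * head_m)

# illustrative 4 mm reducer under 30 m of static head
q_062 = orifice_flow(0.62, 0.004, 30.0) * 1000.0   # l/s
q_068 = orifice_flow(0.68, 0.004, 30.0) * 1000.0   # l/s
```

Both coefficients put this illustrative tap inside the 0.1–0.3 l/s design range cited above, and the spread between them shows why imperfect, hand-drilled orifices can push individual taps out of agreement with the simulation.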
5

Vestin, Albin, and Gustav Strandberg. "Evaluation of Target Tracking Using Multiple Sensors and Non-Causal Algorithms." Thesis, Linköpings universitet, Reglerteknik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-160020.

Abstract:
Today, the main research field for the automotive industry is to find solutions for active safety. In order to perceive the surrounding environment, tracking nearby traffic objects plays an important role. Validation of the tracking performance is often done in staged traffic scenarios, where additional sensors, mounted on the vehicles, are used to obtain their true positions and velocities. The difficulty of evaluating the tracking performance complicates its development. An alternative approach studied in this thesis, is to record sequences and use non-causal algorithms, such as smoothing, instead of filtering to estimate the true target states. With this method, validation data for online, causal, target tracking algorithms can be obtained for all traffic scenarios without the need of extra sensors. We investigate how non-causal algorithms affects the target tracking performance using multiple sensors and dynamic models of different complexity. This is done to evaluate real-time methods against estimates obtained from non-causal filtering. Two different measurement units, a monocular camera and a LIDAR sensor, and two dynamic models are evaluated and compared using both causal and non-causal methods. The system is tested in two single object scenarios where ground truth is available and in three multi object scenarios without ground truth. Results from the two single object scenarios shows that tracking using only a monocular camera performs poorly since it is unable to measure the distance to objects. Here, a complementary LIDAR sensor improves the tracking performance significantly. The dynamic models are shown to have a small impact on the tracking performance, while the non-causal application gives a distinct improvement when tracking objects at large distances. Since the sequence can be reversed, the non-causal estimates are propagated from more certain states when the target is closer to the ego vehicle. 
For multiple object tracking, we find that correct associations between measurements and tracks are crucial for improving the tracking performance with non-causal algorithms.
APA, Harvard, Vancouver, ISO, and other styles
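The thesis above contrasts causal filtering (each estimate uses only past and current samples) with non-causal smoothing (each estimate may also use future samples from a recorded sequence). As a minimal illustrative sketch of that distinction — not the thesis's actual tracking algorithms — compare a causal moving average with a centered, non-causal one:

```python
def causal_filter(xs, window=3):
    """Causal moving average: each estimate uses only past and current samples."""
    out = []
    for i in range(len(xs)):
        lo = max(0, i - window + 1)
        out.append(sum(xs[lo:i + 1]) / (i + 1 - lo))
    return out

def noncausal_smoother(xs, half=1):
    """Non-causal (centered) moving average: each estimate also sees future samples."""
    out = []
    for i in range(len(xs)):
        lo, hi = max(0, i - half), min(len(xs), i + half + 1)
        out.append(sum(xs[lo:hi]) / (hi - lo))
    return out

# A noisy ramp: the smoother, seeing both sides of each sample,
# tracks the underlying trend with less lag than the causal filter.
signal = [0.0, 1.2, 1.9, 3.1, 4.0, 5.2, 5.9, 7.1]
print(causal_filter(signal))
print(noncausal_smoother(signal))
```

This is only possible offline, on recorded data — which is exactly why the thesis proposes smoothed estimates as validation references for online, causal trackers.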

Books on the topic "Accurate distribution data"

1

Wahyunto. Peatland distribution in Sumatra and Kalimantan: Explanation of its data sets including source of information, accuracy, data constraints, and gaps. Bogor: Wetlands International, Indonesia Programme, 2008.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Valelly, Richard, Suzanne Mettler, and Robert Lieberman, eds. The Oxford Handbook of American Political Development. Oxford University Press, 2014. http://dx.doi.org/10.1093/oxfordhb/9780199697915.001.0001.

Full text
Abstract:
Scholars working in or sympathetic to American political development (APD) share a commitment to accurately understanding the history of American politics – and thus they question stylized facts about America’s political evolution. Like other approaches to American politics, APD prizes analytical rigor, data collection, the development and testing of theory, and the generation of provocative hypotheses. Much APD scholarship indeed overlaps with the American politics subfield and its many well-developed literatures on specific institutions or processes (for example, Congress, judicial politics, or party competition), specific policy domains (welfare policy, immigration), the foundations of (in)equality in American politics (the distribution of wealth and income, race, ethnicity, gender, class, and sexual and gender orientation), public law, and governance and representation. What distinguishes APD is careful, systematic thought about the ways that political processes, civic ideals, the political construction of social divisions, patterns of identity formation, the making and implementation of public policies, contestation over (and via) the Constitution, and other formal and informal institutions and processes evolve over time – and whether (and how) they alter, compromise, or sustain the American liberal democratic regime. APD scholars identify, in short, the histories that constitute American politics. They ask: what familiar or unfamiliar elements of the American past illuminate the present? Are contemporary phenomena that appear new or surprising prefigured in ways that an APD approach can bring to the fore? If a contemporary phenomenon is unprecedented, then how might an accurate understanding of the evolution of American politics unlock its significance?
APA, Harvard, Vancouver, ISO, and other styles
3

Michael, Damian, and David Lindenmayer. Reptiles of the NSW Murray Catchment. CSIRO Publishing, 2010. http://dx.doi.org/10.1071/9780643098213.

Full text
Abstract:
This is an easy-to-use field guide for identifying the 80 reptile species currently known to occur in the Murray catchment area of New South Wales. Illustrated with high quality colour photographs, the book describes the key distinguishing features of each reptile and includes details on habitats and conservation status. Uniquely, it has a detailed chapter on how to conserve reptiles and manage key habitats, providing landholders and natural resource agencies with the knowledge to help conserve reptiles in agricultural farming landscapes. The up-to-date distribution maps are based on 10 years of extensive surveys and research on reptiles in the Murray catchment. The final chapter includes a section on similar looking species to further enable readers to accurately and quickly identify difficult species. Reptiles of the NSW Murray Catchment promotes a broad appreciation of reptiles in the region, and is a must-have for natural history enthusiasts.
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Accurate distribution data"

1

Perakis, Nikolaos, and Oskar J. Haidn. "Experimental and Numerical Investigation of CH₄/O₂ Rocket Combustors." In Notes on Numerical Fluid Mechanics and Multidisciplinary Design, 359–79. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-53847-7_23.

Full text
Abstract:
Abstract The experimental investigation of sub-scale rocket engines gives significant information about the combustion dynamics and wall heat transfer phenomena occurring in full-scale hardware. At the same time, the performed experiments serve as validation test cases for numerical CFD models, and for that reason it is vital to obtain accurate experimental data. In the present work, an inverse method is developed that accurately predicts the axial and circumferential heat flux distribution in CH₄/O₂ rocket combustors. The obtained profiles are used to deduce information about the injector-injector and injector-flame interactions. Using a 3D CFD simulation of the combustion and heat transfer within a multi-element thrust chamber, the physical phenomena behind the measured heat flux profiles can be inferred. A very good qualitative and quantitative agreement between the experimental measurements and the numerical simulations is achieved.
APA, Harvard, Vancouver, ISO, and other styles
2

Basellini, Ugofilippo, and Carlo Giovanni Camarda. "A Three-Component Approach to Model and Forecast Age-at-Death Distributions." In Developments in Demographic Forecasting, 105–29. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-42472-5_6.

Full text
Abstract:
Abstract Mortality forecasting has recently received growing interest, as accurate projections of future lifespans are needed to ensure the solvency of insurance and pension providers. Several innovative stochastic methodologies have been proposed in most recent decades, the majority of them being based on age-specific mortality rates or on summary measures of the life table. The age-at-death distribution is an informative life-table function that provides readily available information on the mortality pattern of a population, yet it has been mostly overlooked for mortality projections. In this chapter, we propose to analyse and forecast mortality developments over age and time by introducing a novel methodology based on age-at-death distributions. Our approach starts from a nonparametric decomposition of the mortality pattern into three independent components corresponding to Childhood, Early-Adulthood and Senescence, respectively. We then model the evolution of each component-specific death density with a relational model that associates a time-invariant standard to a series of observed distributions by means of a transformation of the age axis. Our approach allows us to capture mortality developments over age and time, and forecasts can be derived from parameters’ extrapolation using standard time series models. We illustrate our methods by estimating and forecasting the mortality pattern of females and males in two high-longevity countries using data of the Human Mortality Database. We compare the forecast accuracy of our model and its projections until 2050 with three other forecasting methodologies.
APA, Harvard, Vancouver, ISO, and other styles
3

Grist, James T., Esben Søvsø Hansen, Frank G. Zöllner, and Christoffer Laustsen. "Sodium (23Na) MRI of the Kidney: Experimental Protocol." In Methods in Molecular Biology, 473–80. New York, NY: Springer US, 2021. http://dx.doi.org/10.1007/978-1-0716-0978-1_28.

Full text
Abstract:
Abstract Sodium handling is a key physiological hallmark of renal function. Alterations are generally considered a pathophysiologic event associated with kidney injury, with disturbances in the corticomedullary sodium gradient being indicative of a number of conditions. This experimental protocol review describes the individual steps needed to perform 23Na MRI, allowing accurate monitoring of the renal sodium distribution in a step-by-step experimental protocol for rodents. This chapter is based upon work from the PARENCHIMA COST Action, a community-driven network funded by the European Cooperation in Science and Technology (COST) program of the European Union, which aims to improve the reproducibility and standardization of renal MRI biomarkers. This experimental protocol chapter is complemented by two separate chapters describing the basic concept and data analysis.
APA, Harvard, Vancouver, ISO, and other styles
4

Rocker, Björn, Mariana Kolberg, and Vincent Heuveline. "The Impact of Data Distribution in Accuracy and Performance of Parallel Linear Algebra Subroutines." In Lecture Notes in Computer Science, 394–407. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-19328-6_36.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Schaeffner, Maximilian, Christopher M. Gehb, Robert Feldmann, and Tobias Melz. "Forward vs. Bayesian Inference Parameter Calibration: Two Approaches for Non-deterministic Parameter Calibration of a Beam-Column Model." In Lecture Notes in Mechanical Engineering, 173–90. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-77256-7_15.

Full text
Abstract:
Abstract Mathematical models are commonly used to predict the dynamic behavior of mechanical structures or to synthesize controllers for active systems. Calibrating the model parameters to experimental data is crucial to achieve reliable and adequate model predictions. However, the experimental dynamic behavior is uncertain due to variations in component properties, assembly and mounting. Therefore, uncertainty in the model parameters can be considered in a non-deterministic calibration. In this paper, we compare two approaches for a non-deterministic parameter calibration, which both consider uncertainty in the parameters of a beam-column model. The goal is to improve the model prediction of the axial load-dependent lateral dynamic behavior. The investigation is based on a beam-column system subjected to compressive axial loads used for active buckling control. A representative sample of 30 nominally identical beam-column systems characterizes the variations in the experimental lateral axial load-dependent dynamic behavior. First, in a forward parameter calibration approach, the parameters of the beam-column model are calibrated separately for all 30 investigated beam-column systems using a least squares optimization. The uncertainty in the parameters is obtained by assuming normal distributions of the separately calibrated parameters. Second, in a Bayesian inference parameter calibration approach, the parameters are calibrated using the complete sample of experimental data. Posterior distributions of the parameters characterize the uncertain dynamic behavior of the beam-column model. For both non-deterministic parameter calibration approaches, the predicted uncertainty ranges of the axial load-dependent lateral dynamic behavior are compared to the uncertain experimental behavior and the most accurate results are identified.
APA, Harvard, Vancouver, ISO, and other styles
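The forward calibration approach described above — calibrate each of the 30 systems separately by least squares, then fit a normal distribution to the calibrated values — can be sketched generically. Everything below (the toy linear load-response model, the grid-search optimizer, the synthetic "measurements") is invented for illustration and is not the chapter's actual beam-column model:

```python
import math
import random

def calibrate_one(measured, model):
    """Least-squares calibration of one scalar parameter by grid search
    (a stand-in for the per-system optimization in the chapter)."""
    best_p, best_err = None, float("inf")
    for p in (i / 100.0 for i in range(0, 501)):  # candidate parameter values in [0, 5]
        err = sum((model(p, x) - y) ** 2 for x, y in measured)
        if err < best_err:
            best_p, best_err = p, err
    return best_p

def model(p, x):
    return p * x  # toy linear load-response model

random.seed(0)
true_params = [random.gauss(2.0, 0.1) for _ in range(30)]  # 30 nominally identical systems

# Step 1: calibrate each system separately against its own noisy measurements.
calibrated = []
for p_true in true_params:
    data = [(x, p_true * x + random.gauss(0.0, 0.05)) for x in (1.0, 2.0, 3.0)]
    calibrated.append(calibrate_one(data, model))

# Step 2: assume a normal distribution over the separately calibrated values.
mean = sum(calibrated) / len(calibrated)
std = math.sqrt(sum((p - mean) ** 2 for p in calibrated) / (len(calibrated) - 1))
print(f"fitted normal: mean={mean:.3f}, std={std:.3f}")
```

The Bayesian alternative the chapter compares against would instead pool all measurements and compute a posterior over the parameter, rather than fitting a distribution to point estimates.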
6

Mukhopadhyay, Nitis. "Exploring Fixed-Accuracy Estimation for Population Gini Inequality Index Under Big Data: A Passage to Practical Distribution-Free Strategies." In Gini Inequality Index, 217–41. First edition. | Boca Raton: CRC Press, 2021.: Chapman and Hall/CRC, 2021. http://dx.doi.org/10.1201/9781003143642-11.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Friedrich, Bretislav, and Horst Schmidt-Böcking. "Otto Stern’s Molecular Beam Method and Its Impact on Quantum Physics." In Molecular Beams in Physics and Chemistry, 37–88. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-63963-1_5.

Full text
Abstract:
Abstract Motivated by his interest in thermodynamics and the emerging quantum mechanics, Otto Stern (1888–1969) launched in 1919 his molecular beam method to examine the fundamental assumptions of theory that transpire in atomic, molecular, optical, and nuclear physics. Stern’s experimental endeavors at Frankfurt (1919–1922), Hamburg (1923–1933), and Pittsburgh (1933–1945) provided insights into the quantum world that were independent of spectroscopy and that concerned well-defined isolated systems, hitherto accessible only to Gedanken experiments. In this chapter we look at how Stern’s molecular beam research came about and review six of his seminal experiments along with their context and reception by the physics community: the Stern-Gerlach experiment; the three-stage Stern-Gerlach experiment; experimental evidence for de Broglie’s matter waves; measurements of the magnetic dipole moment of the proton and the deuteron; experimental demonstration of momentum transfer upon absorption or emission of a photon; the experimental verification of the Maxwell-Boltzmann velocity distribution via deflection of a molecular beam by gravity. Regarded as paragons of thoroughness and ingenuity, these experiments entail accurate transversal momentum measurements with resolution better than 0.1 atomic units. Some of these experiments would be taken up by others where Stern left off only decades later (matter-wave scattering or photon momentum transfer). We conclude by highlighting aspects of Stern’s legacy as reflected by the honors that have been bestowed upon him to date.
APA, Harvard, Vancouver, ISO, and other styles
8

Supple, DMD, Robert C. "Digital Occlusal Force Distribution Patterns (DOFDPs)." In Oral Healthcare and Technologies, 1–74. IGI Global, 2017. http://dx.doi.org/10.4018/978-1-5225-1903-4.ch001.

Full text
Abstract:
This chapter describes the many clinical applications of Digital Occlusal Force Distribution Patterns (DOFDPs) recorded with the T-Scan Computerized Occlusal Analysis system. Movements made by the Center of Force trajectory, as force travels around the dental arches during occlusion and disocclusion, create these patterns. The repetitive occlusal contact data points locate the force distribution received when teeth occlude against each other. These force distribution patterns correlate to intraoral compromised dental anatomy found in radiographs, photographs, and during the clinical examination of teeth and their supporting tissues. Moreover, they directly influence the envelope of motion, the envelope of function, and head and neck posture. This chapter illustrates with clinical examples the correlation between Stomatognathic System structural damage and repeating patterns of abnormal occlusal force distribution. The T-Scan technology isolates these damaging regions of excess microtraumatic occlusal force, absent of clinician subjectivity, thereby helping clinicians make an accurate, organized, and documented occlusal diagnosis.
APA, Harvard, Vancouver, ISO, and other styles
9

Supple, DMD, Robert C. "Digital Occlusal Force Distribution Patterns (DOFDPs)." In Handbook of Research on Computerized Occlusal Analysis Technology Applications in Dental Medicine, 830–904. IGI Global, 2015. http://dx.doi.org/10.4018/978-1-4666-6587-3.ch019.

Full text
Abstract:
This chapter describes the many clinical applications of Digital Occlusal Force Distribution Patterns (DOFDPs) recorded with the T-Scan Computerized Occlusal Analysis system. Movements made by the Center of Force trajectory, as force travels around the dental arches during occlusion and disocclusion, create these patterns. The repetitive occlusal contact data points locate the force distribution received when teeth occlude against each other. These force distribution patterns correlate to intraoral compromised dental anatomy found in radiographs, photographs, and during the clinical examination of teeth and their supporting tissues. Moreover, they directly influence the envelope of motion, the envelope of function, and head and neck posture. This chapter illustrates with clinical examples the correlation between Stomatognathic System structural damage and repeating patterns of abnormal occlusal force distribution. The T-Scan technology isolates these damaging regions of excess microtraumatic occlusal force, absent of clinician subjectivity, thereby helping clinicians make an accurate, organized, and documented occlusal diagnosis.
APA, Harvard, Vancouver, ISO, and other styles
10

"Fish Habitat: Essential Fish Habitat and Rehabilitation." In Fish Habitat: Essential Fish Habitat and Rehabilitation, edited by Philip Roni, Laurie A. Weitkamp, and Joe Scordino. American Fisheries Society, 1999. http://dx.doi.org/10.47886/9781888569124.ch9.

Full text
Abstract:
Abstract.—Freshwater and marine essential fish habitat (EFH) for chinook Oncorhynchus tshawytscha, coho O. kisutch, pink O. gorbuscha, and sockeye O. nerka salmon within Washington, Oregon, California, and Idaho was described and identified using the available literature and databases on salmon distribution and life history. The diversity of freshwater habitats utilized by individual species of salmon coupled with the limitations of existing distribution maps precluded identification of specific stream reaches, wetlands, and other water bodies as EFH for Pacific salmon. A more holistic watershed approach consistent with the ecosystem method recommended by the revised Magnuson-Stevens Fishery Conservation and Management Act was necessary. Therefore, Pacific salmon freshwater EFH was delineated and described as all existing water bodies currently and historically utilized by Pacific salmon within selected watersheds defined by U.S. Geological Survey hydrologic units. Areas above some long-standing artificial barriers to juvenile and adult salmon migration were excluded from designation as Pacific salmon EFH. Delineation of marine EFH was also problematic because of the paucity of scientific studies on offshore Pacific salmon habitat use and distribution. However, available scientific data augmented by information from commercial fisheries indicate that juvenile salmon are found in high concentrations in the nearshore areas of the continental shelf off the Washington, Oregon, and California coasts from late spring through fall. Therefore, Pacific salmon marine EFH was identified as all waters within 60 km of the Washington, Oregon, and California coasts north of Point Conception, California.
This initial effort to identify Pacific salmon EFH emphasized the need for accurate, fine-scale geographic information systems data on freshwater and marine salmon distribution and habitat quality and the need for compilation of uniform data sets. Future efforts should focus on developing accurate seasonal salmon distribution data at a 1:24,000 scale to aid in more precise and accurate delineation of Pacific salmon EFH. Furthermore, detailed information on winter distribution of Pacific salmon would be useful in delineating marine EFH.
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Accurate distribution data"

1

Vojtech, Josef, Vladimir Smotlacha, and Jan Radil. "Distribution of accurate time over fiber data network." In 2015 Science and Information Conference (SAI). IEEE, 2015. http://dx.doi.org/10.1109/sai.2015.7237328.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Hay, Michael, Chao Li, Gerome Miklau, and David Jensen. "Accurate Estimation of the Degree Distribution of Private Networks." In 2009 Ninth IEEE International Conference on Data Mining (ICDM). IEEE, 2009. http://dx.doi.org/10.1109/icdm.2009.11.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Bletterie, B., S. Kadam, M. Stifter, A. Abart, D. Burnier de Castro, and H. Brunner. "Characterising LV networks on the basis of smart meter data and accurate network models." In CIRED 2012 Workshop: Integration of Renewables into the Distribution Grid. IET, 2012. http://dx.doi.org/10.1049/cp.2012.0883.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Kumar, Abhishek, Minho Sung, Jun (Jim) Xu, and Jia Wang. "Data streaming algorithms for efficient and accurate estimation of flow size distribution." In the joint international conference. New York, New York, USA: ACM Press, 2004. http://dx.doi.org/10.1145/1005686.1005709.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Naidu, OD, Neethu George, and Ashok S. "An accurate fault location method for radial distribution system using one terminal data." In 2015 International Conference on Technological Advancements in Power and Energy (TAP Energy). IEEE, 2015. http://dx.doi.org/10.1109/tapenergy.2015.7229615.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

di Lembo, G., P. Petroni, and C. Noce. "Reduction of power losses and CO2 emissions: accurate network data to obtain good performances of DMS systems." In 20th International Conference and Exhibition on Electricity Distribution (CIRED 2009). IET, 2009. http://dx.doi.org/10.1049/cp.2009.0705.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Bansal, Ankit, Michael F. Modest, and Deborah Levin. "Narrow-Band k-Distribution Database for Atomic Radiation in Hypersonic Nonequilibrium Flows." In ASME 2009 Heat Transfer Summer Conference collocated with the InterPACK09 and 3rd Energy Sustainability Conferences. ASMEDC, 2009. http://dx.doi.org/10.1115/ht2009-88120.

Full text
Abstract:
Full-spectrum k-distribution (FSK) and multi-group FSK approaches make it possible to evaluate radiative fluxes at a fraction of the cost needed for line-by-line calculations. However, the required k-distributions need to be assembled from accurate absorption coefficient data for each flow condition, which is computationally expensive. An accurate and compact narrow-band k-distribution database has been developed for the most important species encountered in hypersonic nonequilibrium flow. The database allows users to calculate desired full-spectrum k-distributions through look-up and interpolation. Strategies for k-distribution data generation are outlined. The accuracy of the database is tested by comparing narrow-band mean absorption coefficients and narrow-band emissivities with those obtained from line-by-line calculations. Application of the database to construct full-spectrum k-distributions accurately and efficiently is discussed, and results from a number of heat transfer calculations and cpu-time studies are presented.
APA, Harvard, Vancouver, ISO, and other styles
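The core idea behind a k-distribution, as the abstract above uses it, is to reorder a jagged absorption-coefficient spectrum into a smooth, monotonic function k(g), where g is the cumulative fraction of the spectral interval whose absorption lies below k. A minimal sketch of that reordering (the database entry itself would store such distributions at narrow-band resolution, which this toy example does not attempt):

```python
def k_distribution(kappa):
    """Reorder absorption coefficients kappa (one value per equal-width
    spectral sub-interval) into a monotonic k(g) distribution.
    Returns (g, k) pairs with g at the midpoint of each sub-interval."""
    ks = sorted(kappa)
    n = len(ks)
    return [((i + 0.5) / n, k) for i, k in enumerate(ks)]

# Toy spectrum: strongly non-monotonic in wavenumber, monotonic in g.
spectrum = [0.3, 5.0, 0.1, 2.2, 0.7, 9.0]
for g, k in k_distribution(spectrum):
    print(f"g={g:.3f}  k={k}")
```

Because k(g) is smooth and monotonic, quadrature over g needs only a handful of points — which is why, as the abstract notes, look-up and interpolation in a k-distribution database is far cheaper than line-by-line integration over the raw spectrum.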
8

Xiangning Lin, Jinwen Sun, Zhongkang Wei, Zhiqian Bo, Zhengtian Li, Rongjin Zheng, Jianfei Xu, and I. Kursan. "Accurate Single-ended Data based Fault Location Method for Multi-Source and Multi-branch Distribution Network." In 12th IET International Conference on Developments in Power System Protection (DPSP 2014). Institution of Engineering and Technology, 2014. http://dx.doi.org/10.1049/cp.2014.0097.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Lin, Jiayuan, Wei Chen, Xianggen Yin, Rui Chen, Yu Bai, Junjie Shi, Li Yu, and Hao Bai. "An Accurate Fault Location Method for Distribution Lines Based on Coupling Parameter Identification Using Two Terminals Data." In 2019 IEEE 8th International Conference on Advanced Power System Automation and Protection (APAP). IEEE, 2019. http://dx.doi.org/10.1109/apap47170.2019.9224912.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Lee, Jonathan. "Automatic Fault Location on Distribution Networks Using Synchronized Voltage Phasor Measurement Units." In ASME 2014 Power Conference. American Society of Mechanical Engineers, 2014. http://dx.doi.org/10.1115/power2014-32231.

Full text
Abstract:
Automatic fault location on the distribution system is a necessity for a resilient grid with fast service restoration after an outage. Motivated by the development of low cost synchronized voltage phasor measurement units (PMUs) for the distribution system, this paper describes how PMU data during a fault event can be used to accurately locate faults on the primary distribution system. Rather than requiring many specialized line sensors to enable fault location, the proposed approach leverages a PMU data stream that can be used for a variety of applications, making it easier to justify the investment in fault location. The accuracy of existing automatic fault location techniques is dependent either on dense deployments of line sensors or unrealistically accurate models of system loads. This paper demonstrates how synchronized voltage measurements enable sufficiently accurate fault location with relatively few instrumentation devices and relatively low fidelity system models. The IEEE 123 bus distribution feeder is examined as a test case, and the proposed algorithm is demonstrated to be robust to variations in total load and uncertainty in the response of loads to voltage sags during a sample set of varied fault conditions.
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Accurate distribution data"

1

Puttanapong, Nattapong, Arturo M. Martinez Jr, Mildred Addawe, Joseph Bulan, Ron Lester Durante, and Marymell Martillan. Predicting Poverty Using Geospatial Data in Thailand. Asian Development Bank, December 2020. http://dx.doi.org/10.22617/wps200434-2.

Full text
Abstract:
This study examines an alternative approach to estimating poverty by investigating whether readily available geospatial data can accurately predict the spatial distribution of poverty in Thailand. It also compares the predictive performance of various econometric and machine learning methods such as generalized least squares, neural network, random forest, and support vector regression. Results suggest that intensity of night lights and other variables that approximate population density are highly associated with the proportion of population living in poverty. The random forest technique yielded the highest level of prediction accuracy among the methods considered, perhaps due to its capability to fit complex association structures even with small and medium-sized datasets.
APA, Harvard, Vancouver, ISO, and other styles
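The study above compares several predictors built on geospatial covariates. As a minimal, hedged illustration of the underlying idea only — night-light intensity as a poverty proxy, with district data and coefficients entirely invented, and ordinary least squares standing in for the paper's richer model suite — a one-variable fit in pure Python:

```python
def ols_fit(xs, ys):
    """One-variable ordinary least squares: poverty_rate ≈ a + b * night_lights."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Hypothetical district data: brighter areas tend to be less poor.
night_lights = [5.0, 12.0, 20.0, 35.0, 50.0]
poverty_rate = [0.42, 0.33, 0.25, 0.15, 0.08]

a, b = ols_fit(night_lights, poverty_rate)
print(f"poverty ≈ {a:.3f} + {b:.4f} * lights")
print("predicted poverty at lights=25:", round(a + b * 25, 3))
```

A random forest, which the study found most accurate, replaces this single global line with an ensemble of piecewise fits, letting it capture the nonlinear association structures the abstract mentions.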
2

Terzic, Vesna, and William Pasco. Novel Method for Probabilistic Evaluation of the Post-Earthquake Functionality of a Bridge. Mineta Transportation Institute, April 2021. http://dx.doi.org/10.31979/mti.2021.1916.

Full text
Abstract:
While modern overpass bridges are safe against collapse, their functionality will likely be compromised in the case of a design-level or beyond-design-level earthquake, which may generate excessive residual displacements of the bridge deck. Presently, there is no validated, quantitative approach for estimating the operational level of the bridge after an earthquake due to the difficulty of accurately simulating residual displacements. This research develops a novel method for probabilistic evaluation of the post-earthquake functionality state of the bridge; the approach is founded on an explicit evaluation of bridge residual displacements and associated traffic capacity by considering realistic traffic load scenarios. This research proposes a high-fidelity finite-element model for bridge columns, developed and calibrated using existing experimental data from the shake table tests of a full-scale bridge column. This finite-element model of the bridge column is further expanded to enable evaluation of the axial load-carrying capacity of damaged columns, which is critical for an accurate evaluation of the traffic capacity of the bridge. Existing experimental data from the crushing tests on the columns with earthquake-induced damage support this phase of the finite-element model development. To properly evaluate the bridge's post-earthquake functionality state, realistic traffic loadings representative of different bridge conditions (e.g., immediate access, emergency traffic only, closed) are applied in the proposed model following an earthquake simulation. The traffic loadings in the finite-element model consider the distribution of the vehicles on the bridge causing the largest forces in the bridge columns.
APA, Harvard, Vancouver, ISO, and other styles
3

Galvin, Jeff, and Sarah Studd. Vegetation inventory, mapping, and characterization report, Saguaro National Park: Volume III, type descriptions. Edited by Alice Wondrak Biel. National Park Service, March 2021. http://dx.doi.org/10.36967/nrr-2284802.

Full text
Abstract:
The Sonoran Desert Network (SODN) conducted a vegetation mapping and characterization effort at the two districts of Saguaro National Park from 2010 to 2018. This project was completed under the National Park Service (NPS) Vegetation Mapping Inventory, which aims to complete baseline mapping and classification inventories at more than 270 NPS units. The vegetation map data were collected to provide park managers with a digital map product that meets national standards of spatial and thematic accuracy, while also placing the vegetation into a regional and national context. A total of 97 distinct vegetation communities were described: 83 exclusively at the Rincon Mountain District, 9 exclusively at the Tucson Mountain District, and 5 occurring in both districts. These communities ranged from low-elevation creosote (Larrea tridentata) shrublands spanning broad alluvial fans to mountaintop Douglas fir (Pseudotsuga menziesii) forests on the slopes of Rincon Peak. All 97 communities were described at the association level, each with detailed narratives including lists of species found in each association, their abundance, landscape features, and overall community structural characteristics. Only 15 of the 97 vegetation types were existing “accepted” types within the NVC. The others are newly described and specific to Saguaro National Park (and will be proposed for formal status within the NVC). This document is Volume III of three volumes comprising the Saguaro National Park Vegetation Mapping Inventory. This volume provides full type descriptions of the 97 associations identified and mapped during the project, and detailed in Volume I. Volume II provides abridged versions of these full descriptions, briefly describing the floristic and structural characteristics of the vegetation and showing representative photos of associations, their distribution, and an example of the satellite imagery for one polygon.
APA, Harvard, Vancouver, ISO, and other styles