Dissertations / Theses on the topic 'Traffic safety Data processing'


Consult the top 50 dissertations / theses for your research on the topic 'Traffic safety Data processing.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses from a wide variety of disciplines and organise your bibliography correctly.

1

Chisalita, Ioan. "Communication and Networking Techniques for Traffic Safety Systems." Doctoral thesis, Linköping : Department of Computer and Information Science, Linköpings universitet, 2006. http://www.bibl.liu.se/liupubl/disp/disp2006/tek1018s.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Dickinson, Keith William. "Traffic data capture and analysis using video image processing." Thesis, University of Sheffield, 1986. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.306374.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

賴翰笙 and Hon-seng Lai. "An effective methodology for visual traffic surveillance." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2000. http://hub.hku.hk/bib/B30456708.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

So, Wai-ki, and 蘇慧琪. "Shadow identification in traffic video sequences." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2005. http://hub.hku.hk/bib/B32045967.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Boonsiripant, Saroch. "Speed profile variation as a surrogate measure of road safety based on GPS-equipped vehicle data." Diss., Atlanta, Ga. : Georgia Institute of Technology, 2009. http://hdl.handle.net/1853/28275.

Full text
Abstract:
Thesis (M. S.)--Civil and Environmental Engineering, Georgia Institute of Technology, 2009.
Committee Chair: Hunter, Michael; Committee Member: Dixon, Karen; Committee Member: Guensler, Randall; Committee Member: Rodgers, Michael; Committee Member: Tsui, Kwok-Leung.
APA, Harvard, Vancouver, ISO, and other styles
6

Goulet, Dennis A., Joseph McMorrow, G. Edward Roberts, and Robert Lynch. "VESSEL TRAFFIC MANAGEMENT SYSTEM A Test Technology Development and Demonstration Project." International Foundation for Telemetering, 1997. http://hdl.handle.net/10150/607390.

Full text
Abstract:
International Telemetering Conference Proceedings / October 27-30, 1997 / Riviera Hotel and Convention Center, Las Vegas, Nevada
The Vessel Traffic Management System is a cooperative effort of the Naval Undersea Warfare Center and the Naval Air Warfare Center Aircraft Division, funded by the OSD's Test Technology Development and Demonstration Program. The project is establishing the capability to acquire ship tracking information from numerous sources (GPS and radar target extractors), and combine them into a comprehensive, integrated view of the range safety target area. The consolidated tracking information will be transmitted to range safety vessel personnel and presented on portable display systems to aid in clearing the surveillance area of unauthorized vessels. The communications module is media independent in that positional and image data can be routed via RF modem, cellular phone, Intranet or Internet, singly or in any combination. The software systems for data acquisition, display and control are also platform independent, with the system under development operating under WindowsNT and Windows95. Additionally, the use of Java and VRML tools permits a user to display data (including three dimensional presentations of the data) without requiring the applications software. This system has numerous applications including range safety, commercial vessel traffic management, port authority and services monitoring, and oceanographic data gathering.
APA, Harvard, Vancouver, ISO, and other styles
7

Craig, David W. (David William). "Light traffic loss of random hard real-time tasks in a network." Dissertation, Electrical Engineering, Carleton University, Ottawa, 1988.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Mollet, C. J. "The analysis of road traffic accident data in the implementation of road safety remedial programmes." Thesis, Stellenbosch : Stellenbosch University, 2001. http://hdl.handle.net/10019.1/52483.

Full text
Abstract:
Thesis (M.Ing.)--Stellenbosch University, 2001.
ENGLISH ABSTRACT: A road safety remedial programme has as its objective the improvement of road transportation safety by applying road safety engineering remedial measures to hazardous road network elements in a manner that is economically efficient. Since accident data are the primary manifestation of poor safety levels, they must be analysed in a manner that supports the overall objective of economic efficiency. Three steps in the process of implementing a road safety remedial programme that rely on the systematic analysis of accident data are the identification of hazardous locations, the ranking of hazardous locations, and the evaluation of remedial measure effectiveness. The efficiency of a road safety remedial programme can be enhanced by using appropriate methodologies to measure safety, identify and rank hazardous locations, and determine the effectiveness of road safety remedial measures. There are a number of methodologies available to perform these tasks, although some perform much better than others. Methodologies based on the Empirical Bayesian approach generally provide better results than conventional methods. Bayesian methodologies are not often used in South Africa; using them would require additional training of students and engineering professionals as well as more research by tertiary and other research institutions. The efficiency of a road safety remedial programme can be compromised by the use of poor-quality accident data. In South Africa the quality of accident data is generally poor, and more attention should be given to the proper management and control of accident data. This thesis reports on, investigates, and evaluates Bayesian and conventional accident data analysis methodologies.
AFRIKAANSE OPSOMMING: The aim of a road safety remedial programme is to improve the safety of hazardous road network elements in the most cost-effective manner through the application of engineering measures. Since road safety is directly related to traffic accidents, the cost-effective implementation of a road safety remedial programme requires the purposeful and correct analysis of accident data. Implementing a road safety remedial programme requires the analysis of accident data to identify and prioritise hazardous locations, as well as to determine the effectiveness of remedial measures. The cost-effectiveness of a road safety remedial programme can be improved by choosing the right methods to measure road safety, to identify and prioritise hazardous locations, and to determine the effectiveness of remedial measures. There are various methods for performing these analyses, although some are better than others. Bayesian methods generally deliver better results than the usual conventional methods. Bayesian methods are not applied in South Africa; doing so would require additional training of students and engineers, as well as additional research by universities and other research institutions. The use of poor-quality accident data can undermine the integrity of a road safety remedial programme. The quality of accident data in South Africa is generally poor, and more attention should be given to the management and control of accident data. The purpose of this thesis is to report on, investigate, and evaluate Bayesian and conventional methods that can be used to analyse accident data.
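The Empirical Bayesian approach that this abstract contrasts with conventional methods can be illustrated with a short sketch. Assuming a negative binomial accident model (equivalently, a gamma prior with shape `phi` over site means), the EB estimate blends a site's observed crash count with the model prediction for similar sites; the function name and all numbers below are hypothetical, not from the thesis.

```python
def empirical_bayes_estimate(observed, predicted, phi):
    """Empirical Bayes safety estimate: blend the site's observed crash count
    with the model prediction for similar sites. With a gamma prior of mean
    `predicted` and shape `phi` (as implied by a negative binomial model),
    the weight on the prediction is phi / (phi + predicted)."""
    w = phi / (phi + predicted)
    return w * predicted + (1 - w) * observed

# Hypothetical site: 9 observed crashes where similar sites average 4 per year.
est = empirical_bayes_estimate(observed=9, predicted=4.0, phi=2.0)
assert 4.0 < est < 9.0   # shrinks the high observed count toward the model
```

The shrinkage is what makes EB methods robust to the regression-to-the-mean effect that plagues naive hazardous-location ranking.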
APA, Harvard, Vancouver, ISO, and other styles
9

Popescu, Vlad M. "Airspace analysis and design by data aggregation and lean model synthesis." Diss., Georgia Institute of Technology, 2013. http://hdl.handle.net/1853/49126.

Full text
Abstract:
Air traffic demand is growing. New methods of airspace design are required that can enable new designs, do not depend on current operations, and can also support quantifiable performance goals. The main goal of this thesis is to develop methods to model inherent safety and control cost so that these can be included as principal objectives of airspace design, in support of prior work which examines capacity. The first contribution of the thesis is to demonstrate two applications of airspace analysis and design: assessing the inherent safety and control cost of the airspace. Two results are shown, a model which estimates control cost depending on autonomy allocation and traffic volume, and the characterization of inherent safety conditions which prevent unsafe trajectories. The effects of autonomy ratio and traffic volume on control cost emerge from a Monte Carlo simulation of air traffic in an airspace sector. A maximum likelihood estimation identifies the Poisson process to be the best stochastic model for control cost. Recommendations are made to support control-cost-centered airspace design. A novel method to reliably generate collision avoidance advisories, in piloted simulations, by the widely-used Traffic Alert and Collision Avoidance System (TCAS) is used to construct unsafe trajectory clusters. Results show that the inherent safety of routes can be characterized, determined, and predicted by relatively simple convex polyhedra (albeit multi-dimensional and involving spatial and kinematic information). Results also provide direct trade-off relations between spatial and kinematic constraints on route geometries that preserve safety. Accounting for these clusters thus supports safety-centered airspace design. The second contribution of the thesis is a general methodology that generalizes unifying principles from these two demonstrations. The proposed methodology has three steps: aggregate data, synthesize lean model, and guide design. 
The use of lean models is a result of a natural flowdown from the airspace view to the requirements. The scope of the lean model is situated at a level of granularity that identifies the macroscopic effects of operational changes on the strategic level. The lean model technique maps low-level changes to high-level properties and provides predictive results. The use of lean models allows the mapping of design variables (route geometry, autonomy allocation) to design evaluation metrics (inherent safety, control cost).
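The abstract reports that maximum likelihood estimation identifies a Poisson process as the best stochastic model for control cost. A minimal illustration of that step, with invented interval counts: for i.i.d. Poisson observations the MLE of the rate is simply the sample mean, which maximizes the log-likelihood.

```python
import math
import random

def poisson_log_likelihood(counts, lam):
    """Log-likelihood of i.i.d. Poisson(lam) for observed event counts."""
    return sum(k * math.log(lam) - lam - math.lgamma(k + 1) for k in counts)

def fit_poisson_mle(counts):
    """For counts observed over equal intervals, the MLE of the Poisson
    rate is the sample mean."""
    return sum(counts) / len(counts)

random.seed(42)
# Hypothetical per-interval counts of controller interventions in a sector.
counts = [random.randint(0, 6) for _ in range(200)]
lam_hat = fit_poisson_mle(counts)
# The MLE maximizes the log-likelihood: nearby rates score no better.
assert poisson_log_likelihood(counts, lam_hat) >= poisson_log_likelihood(counts, lam_hat + 0.1)
assert poisson_log_likelihood(counts, lam_hat) >= poisson_log_likelihood(counts, max(lam_hat - 0.1, 1e-9))
```

In practice one would compare the Poisson fit against alternative count models (e.g. negative binomial) by likelihood, which is presumably what the thesis's model selection does.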
APA, Harvard, Vancouver, ISO, and other styles
10

Agafonov, Evgeny. "Fuzzy and multi-resolution data processing for advanced traffic and travel information." Thesis, Nottingham Trent University, 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.271790.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Diosdado-De-La-Pena, Maria-Paulina. "Safety externalities of SUVs on passenger cars: an analysis of the Peltzman Effect using FARS data." Morgantown, W. Va. : [West Virginia University Libraries], 2008. https://eidr.wvu.edu/etd/documentdata.eTD?documentid=5812.

Full text
Abstract:
Thesis (M.S.)--West Virginia University, 2008.
Title from document title page. Document formatted into pages; contains vi, 75 p. : ill. (some col.). Includes abstract. Includes bibliographical references (p. 47-50).
APA, Harvard, Vancouver, ISO, and other styles
12

Ismail, Karim Aldin. "Application of computer vision techniques for automated road safety analysis and traffic data collection." Thesis, University of British Columbia, 2010. http://hdl.handle.net/2429/29546.

Full text
Abstract:
Safety and sustainability are the two main themes of this thesis. They are also the two main pillars of a functional transportation system. Recent studies showed that the cost of road collisions in Canada exceeds the cost of traffic congestion by almost tenfold. The reliance on collision statistics alone to enhance road safety is challenged by qualitative and quantitative limitations of collision data. Traffic conflict techniques have been advocated as a proactive and supplementary approach to collision-based road safety analysis. However, the cost of field observation of traffic conflicts coupled with observer subjectivity have inhibited the widespread acceptance of these techniques. This thesis advocates the use of computer vision for conducting automated, resource-efficient, and objective traffic conflict analysis. Video data in this thesis was collected at several national and international locations. Real-world coordinates of road users' positions were extracted by tracking moving features visible on road users from a calibrated camera. Subsequently, road users were classified into pedestrians and non-pedestrians, not differentiating between other road users' classes. Classification was based on automatically-learned and manually-annotated motion patterns. Subsequent to road user tracking, various spatiotemporal proximity measures were implemented to measure the severity of traffic events. 
The following contributions were achieved in this thesis: i) co-development of a methodology for tracking and classifying road users, ii) development of a methodology for measuring real-world coordinates of road users' positions which appear in video sequences, iii) automated measurement of pedestrian walking speed, iv) investigation of the effect of different factors on pedestrian walking speed, v) development and validation of a methodology for automated detection of pedestrian-vehicle conflicts, vi) investigation of the application of the developed methodology in a before-and-after evaluation of a pedestrian scramble treatment, vii) development of a methodology for aggregating event-level severity measurements into a safety index, viii) development and validation of two methodologies for automated detection of spatial traffic violations. Another contribution of this thesis was the creation of a video library collected from several locations around the world which can significantly aid in future developments in this field.
APA, Harvard, Vancouver, ISO, and other styles
13

Van, Espen Adinda. "Evaluating traffic safety performance of countries using data envelopment analysis and accident prediction models." Thesis, University of British Columbia, 2013. http://hdl.handle.net/2429/44552.

Full text
Abstract:
Road safety is an issue of global importance, receiving both national and international attention. According to the World Health Organization, road traffic injuries are projected to become the fifth leading cause of death in the world by 2030. Studies conducted to gain better insight into how countries can improve their road safety performance levels often use one single variable – the number of fatalities per million inhabitants – and focus predominantly on European countries. This thesis develops and analyzes models incorporating a wider range of countries as well as a wider range of road safety performance indicators, using data envelopment analysis and accident prediction models. The first method initially calculates efficiency scores using three input variables (percentage of seatbelt use in front seats, road density, and total health expenditure as a percentage of GDP) and two output variables (fatalities per million inhabitants and fatalities per million passenger cars). It was found that the addition of the percentage of seatbelt use in rear seats (fourth input variable) and the percentage of roads paved (fifth input variable) improved the efficiency scores and rankings. Overall, the percentage of seatbelt use in front seats and the total health expenditure variables had the greatest importance. The second method developed three accident prediction models using the generalized linear modeling approach with a negative binomial error structure. The elasticity analysis revealed that, for Model 1 and Model 2, the health expenditure variable had the greatest impact on the number of fatalities. For Model 3, the seatbelt wearing rates in front and rear seats had the greatest effect on the number of fatalities.
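The elasticity analysis mentioned above can be sketched numerically. Negative binomial accident prediction models use a log link, so the mean takes a multiplicative form; for a log-transformed covariate the elasticity then equals its coefficient, which is why variables can be ranked by fitted coefficients. All coefficients and values below are invented for illustration, not from the thesis.

```python
import math

# Hypothetical fitted model with a log link:
# E[fatalities] = exp(B0 + B1*ln(health_exp_pct) + B2*ln(seatbelt_front_pct)).
B0, B1, B2 = 6.5, -0.8, -0.4

def predicted_fatalities(health_exp_pct, seatbelt_front_pct):
    return math.exp(B0 + B1 * math.log(health_exp_pct)
                    + B2 * math.log(seatbelt_front_pct))

def elasticity(f, x, dx=1e-6):
    """Numerical elasticity of f at x: (df/dx) * (x / f(x)),
    i.e. percent change in prediction per percent change in x."""
    return (f(x + dx) - f(x - dx)) / (2 * dx) * (x / f(x))

# For a log-log specification the elasticity equals the coefficient itself.
e_health = elasticity(lambda h: predicted_fatalities(h, 95.0), 8.0)
assert abs(e_health - B1) < 1e-4
```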
APA, Harvard, Vancouver, ISO, and other styles
14

Wolf, Jean Louise. "Using GPS data loggers to replace travel diaries in the collection of travel data." Diss., Georgia Institute of Technology, 2000. http://hdl.handle.net/1853/20203.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

St-Aubin, Paul. "Traffic safety analysis for urban highway ramps and lane-change bans using accident data and video-based surrogate safety measures." Thesis, McGill University, 2012. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=107783.

Full text
Abstract:
The purpose of this study was to evaluate the traffic safety of urban highway segments near exit and entrance ramps in the Montréal metropolitan area. The city's tight urban environment has resulted in the construction of sub-standard highway ramp merging sections (e.g. short merging lengths, inadequate visibility, influence zone overlap, etc.). In order to mitigate safety problems associated with these inadequately designed features, a special lane-change ban treatment (technical designation LCGV1) was implemented several years ago at various ramps. This study used accident data and video-based surrogate safety measures to evaluate the safety effectiveness of the treatment.The cross-sectional accident analysis controlled for factors such as lane configuration, merge length, traffic flow and speed, area of influence overlap (inter-ramp distance), lane and shoulder widths, horizontal and vertical curves, and covered the presence of the treatment across 10 years of accident data at multiple sites along Montréal's busiest highways. The time-to-collision conflict measure obtained from automated video-based vehicle trajectory extrapolation was analyzed and used to identify microscopic behaviour patterns and conflicting interactions.The study generally concludes that, across all sites, the presence of the treatment has led to no appreciable change in accident rate and that other contributing factors have played a greater role in observed accident rate, time-to-collision distribution, and lane changes. However, the study also indicates that there was significant variation between contributing factors across all analysis sites, leading to the conclusion that adopting a general policy of treating an entire urban region is a futile exercise. In addition, it was observed that the treatment has had a slight accident migration effect. 
These conclusions lead to the recommendation that the treatment should be applied on a case-by-case basis only, and otherwise that the default case (no treatment) should remain in effect so as not to hinder the normal navigation and operation of highway drivers.
The purpose of this study was to evaluate road safety at the entrances and exits of urban highways in the Montréal metropolitan area. Some sites in this region have road safety design problems (notably merging lengths that are too short, insufficient approach visibility, overlapping influence zones, etc.). To mitigate the safety problems associated with these characteristics, a special treatment prohibiting lane changes (technical designation LCGV1) was implemented at these ramps. This study evaluates the effectiveness of the treatment using accident data and an analysis of user behaviour based on vehicle trajectories extracted from video data. The control factors of the cross-sectional analysis include lane configuration, merging length, traffic flow and speed, overlap of influence zones (interchange spacing), lane and shoulder widths, horizontal and vertical curvature, and the presence of the treatment. The analysis uses ten years of accident data at several sites along Montréal's busiest highways. The behavioural analysis studies the time-to-collision (TTC) between user trajectories extracted from video data and identifies microscopic conflict interactions. The study generally concludes that, across all sites, the presence of the treatment led to no appreciable change in accident rates and that other factors have a greater effect on the observed accident rate, the time-to-collision distribution, and lane changes. However, substantial variation in factors was found from site to site, leading to the conclusion that a general treatment policy is not justified. 
It was also noted that the treatment had a slight accident migration effect. These conclusions led to the recommendation that the treatment be applied only on a case-by-case basis.
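The time-to-collision measure used in this conflict analysis can be sketched under a constant-velocity assumption: TTC is the earliest future instant at which two road users, continuing on their current velocities, would come within a collision distance of each other. The geometry and radius below are illustrative, not the thesis's exact implementation.

```python
import math

def time_to_collision(p1, v1, p2, v2, collision_radius=2.0):
    """Constant-velocity TTC: earliest t >= 0 at which the two road users'
    centres come within collision_radius of each other; inf if they never do.
    Positions in metres, velocities in m/s (2-D tuples)."""
    px, py = p2[0] - p1[0], p2[1] - p1[1]
    vx, vy = v2[0] - v1[0], v2[1] - v1[1]
    # Solve |p_rel + v_rel * t| = collision_radius, a quadratic in t.
    a = vx * vx + vy * vy
    b = 2 * (px * vx + py * vy)
    c = px * px + py * py - collision_radius ** 2
    if a == 0:                        # no relative motion
        return 0.0 if c <= 0 else math.inf
    disc = b * b - 4 * a * c
    if disc < 0:                      # paths never come close enough
        return math.inf
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t >= 0 else (0.0 if c <= 0 else math.inf)

# Two vehicles 50 m apart closing at 10 m/s: TTC = (50 - 2) / 10 = 4.8 s.
ttc = time_to_collision((0.0, 0.0), (20.0, 0.0), (50.0, 0.0), (10.0, 0.0))
assert abs(ttc - 4.8) < 1e-9
```

Low TTC values flag severe conflicts; aggregating their distribution over a site is what allows surrogate comparisons without waiting for crashes.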
APA, Harvard, Vancouver, ISO, and other styles
16

Mawji, Afzal. "Achieving Scalable, Exhaustive Network Data Processing by Exploiting Parallelism." Thesis, University of Waterloo, 2004. http://hdl.handle.net/10012/779.

Full text
Abstract:
Telecommunications companies (telcos) and Internet Service Providers (ISPs) monitor the traffic passing through their networks for the purposes of network evaluation and planning for future growth. Most monitoring techniques currently use a form of packet sampling. However, exhaustive monitoring is a preferable solution because it ensures accurate traffic characterization and also allows encoding operations, such as compression and encryption, to be performed. To overcome the very high computational cost of exhaustive monitoring and encoding of data, this thesis suggests exploiting parallelism. By utilizing a parallel cluster in conjunction with load balancing techniques, a simulation is created to distribute the load across the parallel processors. It is shown that a very scalable system, capable of supporting a fairly high data rate, can potentially be designed and implemented. A complete system is then implemented in the form of a transparent Ethernet bridge, ensuring that the system can be deployed into a network without any change to the network. The system focuses its encoding efforts on obtaining the maximum compression rate and, to that end, utilizes the concept of streams, which attempts to separate data packets into individual flows that are correlated and whose redundancy can be removed through compression. Experiments show that compression rates are favourable and confirm good throughput rates and high scalability.
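The stream concept described above can be sketched in a few lines: group packets by their flow 5-tuple, then compress each reassembled stream as a whole so that redundancy shared across a flow's packets is removed. The packets and payloads below are invented for illustration.

```python
import zlib
from collections import defaultdict

# Hypothetical packets: (src, dst, src_port, dst_port, proto, payload).
packets = [
    ("10.0.0.1", "10.0.0.2", 1234, 80, "tcp", b"GET /index.html HTTP/1.1\r\n" * 4),
    ("10.0.0.3", "10.0.0.2", 5678, 80, "tcp", b"GET /index.html HTTP/1.1\r\n" * 4),
    ("10.0.0.1", "10.0.0.2", 1234, 80, "tcp", b"GET /style.css HTTP/1.1\r\n" * 4),
]

# Separate packets into streams keyed by the flow 5-tuple, so that the
# correlated payloads of one flow are compressed together.
streams = defaultdict(bytearray)
for src, dst, sport, dport, proto, payload in packets:
    streams[(src, dst, sport, dport, proto)].extend(payload)

per_stream = sum(len(zlib.compress(bytes(s))) for s in streams.values())
per_packet = sum(len(zlib.compress(p[5])) for p in packets)
# Compressing whole streams removes cross-packet redundancy that
# packet-at-a-time compression cannot see.
assert per_stream < per_packet
```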
APA, Harvard, Vancouver, ISO, and other styles
17

Marupudi, Surendra Brahma. "Framework for Semantic Integration and Scalable Processing of City Traffic Events." Wright State University / OhioLINK, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=wright1472505847.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Dai, Chengyu. "Exploration of Weather Impacts on Freeway Traffic Operations and Safety Using High-Resolution Weather Data." PDXScholar, 2011. https://pdxscholar.library.pdx.edu/open_access_etds/255.

Full text
Abstract:
Adverse weather is considered one of the important factors contributing to injuries and severe crashes. Rainy conditions can reduce visibility, increase stopping distance, and create the opportunity for hydroplaning. This study quantified the relative crash risk on the southbound direction of Oregon 217 under rainy conditions using a matched-pair approach applied to one year of traffic data, crash data, and NEXRAD Level II radar weather data. Twenty-six crashes occurred under matched-pair weather conditions on Oregon 217 in 2007. The results of this study indicate a higher overall crash risk and a higher property-damage-only crash risk during rainy days. The crash risk level varies by location along the highway; the station at SW Allen Blvd (milepost 2.55) has the highest driving risk under rainy conditions.
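The matched-pair relative risk at the heart of such studies reduces to a ratio of crash rates under rainy versus comparable dry conditions. A minimal sketch, with invented exposure and crash counts (not the thesis's figures):

```python
def relative_crash_risk(crashes_rain, hours_rain, crashes_dry, hours_dry):
    """Matched-pair relative risk: the crash rate during rainy periods
    divided by the rate during comparable dry periods (same site, same
    time-of-day and day-of-week, per the matching design)."""
    rate_rain = crashes_rain / hours_rain
    rate_dry = crashes_dry / hours_dry
    return rate_rain / rate_dry

# Hypothetical: 18 crashes over 400 rainy hours vs 8 over 400 matched dry hours.
rr = relative_crash_risk(18, 400, 8, 400)
assert abs(rr - 2.25) < 1e-12   # rain more than doubles the crash rate here
```

A ratio above 1 indicates elevated risk in rain; computing it per station is what lets the study localize the highest-risk milepost.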
APA, Harvard, Vancouver, ISO, and other styles
19

Avadhani, Umesh D. "Data processing in a small transit company using an automatic passenger counter." Thesis, Virginia Tech, 1986. http://hdl.handle.net/10919/45669.

Full text
Abstract:

This thesis describes the work done in the second stage of the implementation of the Automatic Passenger Counter (APC) system at the Roanoke Valley - Metro Transit Company. This second stage deals with the preparation of a few reports and plots that would help the transit managers in efficiently managing the transit system. The reports and plots give an evaluation of the system and service operations by which the decision makers can support their decisions.

For efficient management of the transit system, data on ridership activity, running times, schedule information, and fare revenue are required. From these data it is possible to produce management information reports and summary statistics.

The present data collection program at Roanoke Valley - Metro is carried out by checkers and supervisors who collect ridership and schedule adherence information using manual methods. The information needed for efficient management of transit operations is both difficult and expensive to obtain. The new APC system offers management a new and powerful tool that will enhance its capability to make better decisions when allocating service. The data from the APC are essential for the transit property's ongoing planning and scheduling activities. Management can easily quantify the service demands on a route or for the whole system as desired by the user.


Master of Science
APA, Harvard, Vancouver, ISO, and other styles
20

Hwang, Kuo-Ping. "Applying heuristic traffic assignment in natural disaster evacuation: a decision support system." Diss., Virginia Polytechnic Institute and State University, 1986. http://hdl.handle.net/10919/54455.

Full text
Abstract:
The goal of this research is to develop a heuristic traffic assignment method to simulate the traffic flow of a transportation network at a real-time speed. The existing assignment methods are reviewed and a heuristic path-recording assignment method is proposed. Using the new heuristic assignment method, trips are loaded onto the network in a probabilistic approach for the first iteration; paths are recorded, and path impedance is computed as the basis for further assignment iteration. The real-time traffic assignment model developed with the new assignment method is called HEUPRAE. The difference in link traffic between this new assignment and Dial's multipath assignment ranges from 10 to 25 percent. Saving in computer time is about 55 percent. The proposed heuristic path-recording assignment is believed to be an efficient and reliable method. Successful development of this heuristic assignment method helps solve those transportation problems which need assignment results at a real-time speed, and for which the assignment process lasts a couple of hours. Evacuation planning and operation are well suited to the application of this real-time heuristic assignment method. Evacuation planning and operations are major activities in emergency management. Evacuation planning instructs people where to go, which route to take, and the time needed to accomplish an evacuation. Evacuation operations help the execution of an evacuation plan in response to the changing nature of a disaster. The Integrated Evacuation Decision Support System (IEDSS) is a computer system which employs the evacuation planning model, MASSVAC2, and the evacuation operation model, HEUPRAE, to deal with evacuations. The IEDSS uses computer graphics to prepare input and interpret output. It helps a decision maker analyze the evacuation system, review evacuation plans, and issue an evacuation order at a proper time. 
Users of the IEDSS can work on evacuation problems in a friendly interactive visual environment. The application of the IEDSS to the hurricane and flood problems for the city of Virginia Beach shows how IEDSS is practically implemented. It proves the usefulness of the IEDSS in coping with disasters.
Ph. D.
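The probabilistic first-iteration loading described in this abstract can be sketched as a logit-style path choice: each trip picks among candidate paths with probability decreasing in path impedance, and the chosen paths are recorded for later iterations. The choice function, parameter `theta`, and all numbers are illustrative assumptions, not HEUPRAE's actual formulation.

```python
import math
import random

def probabilistic_load(paths, trips, theta=0.5, seed=1):
    """Load `trips` onto candidate paths probabilistically. `paths` is a
    list of (name, impedance) pairs; lower impedance means a higher choice
    probability (logit-style weights exp(-theta * impedance))."""
    rng = random.Random(seed)
    weights = [math.exp(-theta * impedance) for _, impedance in paths]
    total = sum(weights)
    probs = [w / total for w in weights]
    counts = {name: 0 for name, _ in paths}
    for _ in range(trips):
        r, acc = rng.random(), 0.0
        chosen = paths[-1][0]          # fallback guards float round-off
        for (name, _), p in zip(paths, probs):
            acc += p
            if r < acc:
                chosen = name
                break
        counts[chosen] += 1
    return counts

# Two alternative paths with impedances of 10 and 12 minutes, 1000 trips.
loads = probabilistic_load([("A", 10.0), ("B", 12.0)], trips=1000)
assert loads["A"] + loads["B"] == 1000
assert loads["A"] > loads["B"]   # the lower-impedance path attracts more trips
```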
APA, Harvard, Vancouver, ISO, and other styles
21

Collin, Sofie. "Synthetic Data for Training and Evaluation of Critical Traffic Scenarios." Thesis, Linköpings universitet, Medie- och Informationsteknik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-177779.

Full text
Abstract:
Modern camera-based vehicle safety systems heavily rely on machine learning and consequently require large amounts of training data to perform reliably. However, collecting and annotating the needed data is an extremely expensive and time-consuming process. In addition, it is exceptionally difficult to collect data that covers critical scenarios. This thesis investigates to what extent synthetic data can replace real-world data for these scenarios. Since only a limited amount of data consisting of such real-world scenarios is available, this thesis instead makes use of proxy scenarios, e.g. situations when pedestrians are located close in front of the vehicle (for example at a crosswalk). The presented approach involves training a detector on real-world data where all samples of these proxy scenarios have been removed and comparing it to other detectors trained on data where the removed samples have been replaced with various degrees of synthetic data. A method for generating and automatically and accurately annotating synthetic data, using features in the CARLA simulator, is presented. Also, the domain gap between the synthetic and real-world data is analyzed, and methods in domain adaptation and data augmentation are reviewed. The presented experiments show that aligning statistical properties between the synthetic and real-world datasets distinctly mitigates the domain gap. There are also clear indications that synthetic data can help detect pedestrians in critical traffic situations.

The thesis work was carried out at the Department of Science and Technology (ITN), Faculty of Science and Engineering, Linköping University.
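The statistical alignment the abstract credits with mitigating the domain gap can be sketched as a simple moment-matching step: shift and scale synthetic feature values so their mean and standard deviation match statistics measured on real data. The data and target statistics below are hypothetical; the thesis's actual alignment may operate on other properties.

```python
import random

def align_mean_std(synthetic, real_mean, real_std):
    """Standardize a list of synthetic feature values, then rescale so its
    mean and standard deviation match those measured on real-world data."""
    n = len(synthetic)
    mean = sum(synthetic) / n
    var = sum((x - mean) ** 2 for x in synthetic) / n
    std = var ** 0.5
    return [(x - mean) / std * real_std + real_mean for x in synthetic]

random.seed(0)
# Hypothetical per-pixel intensities: synthetic renders skew bright and flat.
synthetic = [random.gauss(200.0, 10.0) for _ in range(1000)]
aligned = align_mean_std(synthetic, real_mean=120.0, real_std=35.0)
m = sum(aligned) / len(aligned)
assert abs(m - 120.0) < 1e-6   # mean now matches the real-world statistic
```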

APA, Harvard, Vancouver, ISO, and other styles
22

Pumfrey, David John. "The principled design of computer system safety analyses." Thesis, University of York, 1999. http://etheses.whiterose.ac.uk/9797/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Cross, Ginger Wigington. "The impact of an auditory task on visual processing: implications for cellular phone usage while driving." Diss., Mississippi State : Mississippi State University, 2008. http://library.msstate.edu/etd/show.asp?etd=etd-04012008-083032.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Lam, Fung, and 林峰. "Internet inter-domain traffic engineering and optimization." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2001. http://hub.hku.hk/bib/B31224581.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Cacciola, Stephen J. "Fusion of Laser Range-Finding and Computer Vision Data for Traffic Detection by Autonomous Vehicles." Thesis, Virginia Tech, 2007. http://hdl.handle.net/10919/36126.

Full text
Abstract:
The DARPA Challenges were created in response to a Congressional and Department of Defense (DoD) mandate that one-third of US operational ground combat vehicles be unmanned by the year 2015. The Urban Challenge is the latest competition that tasks industry, academia, and inventors with designing an autonomous vehicle that can safely operate in an urban environment. A basic and important capability needed in a successful competition vehicle is the ability to detect and classify objects. The most important objects to classify are other vehicles on the road. Navigating traffic, which includes other autonomous vehicles, is critical in the obstacle avoidance and decision making processes. This thesis provides an overview of the algorithms and software designed to detect and locate these vehicles. By combining the individual strengths of laser range-finding and vision processing, the two sensors are able to more accurately detect and locate vehicles than either sensor acting alone. The range-finding module uses the built-in object detection capabilities of IBEO Alasca laser rangefinders to detect the location, size, and velocity of nearby objects. The Alasca units are designed for automotive use, and so they alone are able to identify nearby obstacles as vehicles with a high level of certainty. After some basic filtering, an object detected by the Alasca scanner is given an initial classification based on its location, size, and velocity. The vision module uses the location of these objects as determined by the range finder to extract regions of interest from large images through perspective transformation. These regions of the image are then examined for distinct characteristics common to all vehicles, such as tail lights and tires. Checking multiple characteristics helps reduce the number of false-negative detections. Since the entire image is never processed, the image size and resolution can be maximized to ensure the characteristics are as clear as possible. 
The existence of these characteristics is then used to modify the certainty level from the IBEO and determine if a given object is a vehicle.
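The fusion step described in this abstract, adjusting the laser scanner's classification certainty using vision cues, can be sketched as follows. This is an illustrative sketch only; the cue names and weights are assumptions, not values from the thesis.

```python
# Hypothetical certainty-fusion sketch: start from the laser scanner's
# initial classification certainty and nudge it up or down for each
# vision cue checked in the region of interest. Weights are illustrative.
def fuse_certainty(lidar_certainty, cues):
    """cues: dict of vision checks, e.g. {'tail_lights': True, 'tires': False}."""
    weights = {"tail_lights": 0.15, "tires": 0.15, "license_plate": 0.10}
    certainty = lidar_certainty
    for cue, present in cues.items():
        w = weights.get(cue, 0.0)
        certainty += w if present else -w
    # Clamp to a valid probability-like range.
    return min(1.0, max(0.0, certainty))
```

Checking several independent cues this way means a single failed check (e.g. occluded tail lights) lowers, but does not by itself overturn, a confident range-finder classification.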
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
26

Moskaluk, John. "Arterial priority option for the TRANSYT-7F traffic-signal-timing program." Diss., Georgia Institute of Technology, 1987. http://hdl.handle.net/1853/19428.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Loy, James Michael. "RELATING NATURALISTIC GLOBAL POSITIONING SYSTEM (GPS) DRIVING DATA WITH LONG-TERM SAFETY PERFORMANCE OF ROADWAYS." DigitalCommons@CalPoly, 2013. https://digitalcommons.calpoly.edu/theses/1078.

Full text
Abstract:
This thesis describes a research study relating naturalistic Global Positioning System (GPS) driving data with long-term traffic safety performance for two classes of roadways: multilane arterial streets and limited access highways. GPS driving data used for this study was collected from 33 volunteer drivers from July 2012 to March 2013. The GPS devices used were custom GPS data loggers capable of recording speed, position, and other attributes at an average rate of 2.5 hertz. Linear referencing in ESRI ArcMAP was performed to assign spatial and other roadway attributes to each GPS data point collected. GPS data was filtered to exclude data with high horizontal dilution of precision (HDOP), incorrect heading attributes, or other GPS communication errors. For analysis of arterial roadways, the Two-Fluid model parameters were chosen as the measure for long-term traffic safety analysis. The Two-Fluid model was selected based on previous research which showed correlation between the Two-Fluid model parameters n and Tm and total crash rate along arterial roadways. Linearly referenced GPS data was utilized to obtain the total travel time and stop time for several half-mile long trips along two arterial roadways, Grand Avenue and California Boulevard, in San Luis Obispo. Regression between log-transformed values of these variables (total travel time and stop time) was used to derive the parameters n and Tm. To estimate stop time for each trip, a vehicle "stop" was defined as any period when the device was traveling at less than 2 miles per hour. Results showed that Grand Avenue had a higher value for n and a lower value for Tm, which suggests that Grand Avenue may have worse long-term safety performance as characterized by long-term crash rates. However, this was not verified with crash data due to incomplete crash data in the TIMS database.
Analysis of arterial roadways concluded by verifying GPS data collected in the California Boulevard study against sample data collected using a traditional "car chase" methodology, which showed no significant difference between the two data sources when trips included noticeable stop times. For analysis of highways, the derived measurement of vehicle jerk, or rate of change of acceleration, was calculated to explore its relationship with the long-term traffic safety performance of highway segments. The decision to use jerk comes from previous research which utilized high-magnitude jerk events as crash surrogates, or near-crash events. Instead of using jerk for near-crash analysis, the measurement of jerk was utilized to determine the percentage of GPS data observed below a certain negative jerk threshold for several highway segments. These segments were ¼-mile and ½-mile long. The preliminary exploration was conducted with 39 ¼-mile long segments of US Highway 101 within the city limits of San Luis Obispo. First, Pearson's correlation coefficients were estimated for rates of 'high' jerk occurrences on these highway segments (with definitions of 'high' depending on varying jerk thresholds) and an estimate of crash rates based on long-term historical crash data. The trends in the correlation coefficients as the thresholds were varied led to conducting further analysis based on a jerk threshold of -2 ft/sec³ for the ¼-mile segment analysis and -1 ft/sec³ for the ½-mile segment analysis. Through a negative binomial regression model, it was shown that the derived jerk-percentage measure had a significant correlation with the total number of historical crashes observed along US Highway 101. Analysis also showed that other characteristics of the roadway, including presence of a curve, presence of weaving (indicated by the presence of auxiliary lanes), and average daily traffic (ADT), did not have a significant correlation with observed crashes.
Similar analysis was repeated for 19 ½-mile long segments in the same study area, and it was found that the percentage of high negative jerk was again significantly correlated with historical crashes. In the ½-mile negative binomial regression, the presence of a curve was also a significant variable; however, the standard error for this estimate was very high due to the low number of analysis segments that did not contain curves. Results of this research show the potential benefit that naturalistic GPS driving data can provide for long-term traffic safety analysis, even when the data is unaccompanied by any additional data (such as live video feed) collected with expensive vehicle instrumentation. The methodologies of this study are repeatable with many GPS devices found in consumer electronics, including many newer smartphones.
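The jerk-percentage measure used in this thesis can be sketched as follows. The ~2.5 Hz sampling and the -2 ft/sec³ threshold follow the abstract; the finite-difference scheme and the example trace are assumed simplifications.

```python
# Sketch of the jerk-percentage measure: fraction of samples whose jerk
# (rate of change of acceleration, ft/sec^3) falls below a negative
# threshold, computed from a speed trace by finite differences.
def jerk_percentage(speeds_fps, timestamps, threshold=-2.0):
    """speeds_fps in ft/sec, timestamps in seconds, same length."""
    # First difference: acceleration between consecutive samples.
    accels = []
    for i in range(1, len(speeds_fps)):
        dt = timestamps[i] - timestamps[i - 1]
        accels.append((speeds_fps[i] - speeds_fps[i - 1]) / dt)
    # Second difference: jerk between consecutive acceleration estimates.
    jerks = []
    for i in range(1, len(accels)):
        dt = timestamps[i + 1] - timestamps[i]
        jerks.append((accels[i] - accels[i - 1]) / dt)
    if not jerks:
        return 0.0
    return sum(1 for j in jerks if j < threshold) / len(jerks)
```

Aggregating this fraction per ¼-mile or ½-mile segment gives the per-segment covariate that the negative binomial regression relates to historical crash counts.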
APA, Harvard, Vancouver, ISO, and other styles
28

Byrne, Patricia Hiromi. "Development of an advisory system for indoor radon mitigation." PDXScholar, 1991. https://pdxscholar.library.pdx.edu/open_access_etds/4263.

Full text
Abstract:
A prototype hybrid knowledge-based advisory system for indoor radon mitigation has been developed to assist Pacific Northwest mitigators in the selection and design of mitigation systems for existing homes. The advisory system employs a heuristic inferencing strategy to determine which mitigation techniques are applicable, and applies procedural methods to perform the fan selection and cost estimation for particular techniques. The rule base has been developed employing knowledge in existing publications on radon mitigation. Additional knowledge has been provided by field experts. The benefits of such an advisory system include uniform record-keeping and consistent computations for the user, and verification of approved radon mitigation methods.
APA, Harvard, Vancouver, ISO, and other styles
29

Kumar, Saurabh. "Real-Time Road Traffic Events Detection and Geo-Parsing." Thesis, Purdue University, 2018. http://pqdtopen.proquest.com/#viewpdf?dispub=10842958.

Full text
Abstract:

In the 21st century, there is an increasing number of vehicles on the road as well as a limited road infrastructure. These aspects culminate in daily challenges for the average commuter due to congestion and slow-moving traffic. In the United States alone, this costs the average US driver $1200 every year in the form of fuel and time. Some positive steps, including (a) the introduction of push notification systems and (b) the deployment of more law enforcement troops, have been taken for better traffic management. However, these methods have limitations and require extensive planning. Another method to deal with traffic problems is to track congested areas in a city using social media. Next, law enforcement resources can be re-routed to these areas on a real-time basis.

Given the ever-increasing number of smartphone devices, social media can be used as a source of information to track the traffic-related incidents.

Social media sites allow users to share their opinions and information. Platforms like Twitter, Facebook, and Instagram are very popular among users. These platforms enable users to share whatever they want in the form of text and images. Facebook users generate millions of posts in a minute. On these platforms, abundant data, including news, trends, events, opinions, product reviews, etc. are generated on a daily basis.

Worldwide, organizations are using social media for marketing purposes. This data can also be used to analyze traffic-related events like congestion, construction work, slow-moving traffic, etc. Thus the motivation behind this research is to use social media posts to extract information relevant to traffic, with effective and proactive traffic administration as the primary focus. I propose an intuitive two-step process for utilizing Twitter users' posts to retrieve traffic-related information on a real-time basis. It uses a text classifier to retain only the posts that contain traffic information. This is followed by a Part-Of-Speech (POS) tagger to find the geolocation information. A prototype of the proposed system is implemented using a distributed microservices architecture.
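The two-step pipeline described in this abstract can be illustrated with a toy sketch. The thesis uses a trained text classifier and a POS tagger; here a keyword filter and a simple preposition rule stand in for them, and the keyword list and rules are assumptions for illustration only.

```python
# Toy stand-in for step 1 (classification): keyword filter instead of a
# trained text classifier.
TRAFFIC_TERMS = {"traffic", "congestion", "accident", "crash", "roadwork"}

def is_traffic_related(tweet):
    return any(term in tweet.lower() for term in TRAFFIC_TERMS)

# Toy stand-in for step 2 (geo-parsing): a POS tagger would look for proper
# nouns; here we grab capitalized tokens that follow location prepositions.
def extract_locations(tweet):
    tokens = tweet.replace(",", "").split()
    locations = []
    for i, tok in enumerate(tokens[:-1]):
        nxt = tokens[i + 1]
        if tok.lower() in {"at", "on", "near"} and nxt[:1].isupper():
            locations.append(nxt)
    return locations
```

In the real system, only tweets passing step 1 reach step 2, which keeps the expensive geo-parsing off the vast majority of non-traffic posts.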

APA, Harvard, Vancouver, ISO, and other styles
30

Irwin, Barry Vivian William. "Bandwidth management and monitoring for IP network traffic : an investigation." Thesis, Rhodes University, 2001. http://hdl.handle.net/10962/d1006492.

Full text
Abstract:
Bandwidth management is a topic which is often discussed, but on which relatively little work has been done with regard to compiling a comprehensive set of techniques and methods for managing traffic on a network. What work has been done has concentrated on higher end networks, rather than the low bandwidth links which are commonly available in South Africa and other areas outside the United States. With more organisations increasingly making use of the Internet on a daily basis, the demand for bandwidth is outstripping the ability of providers to upgrade their infrastructure. This resource is therefore in need of management. In addition, for Internet access to become economically viable for widespread use by schools, NGOs and other academic institutions, the associated costs need to be controlled. Bandwidth management not only impacts on direct cost control, but encompasses the process of engineering a network and network resources in order to ensure the provision of as optimal a service as possible. Included in this is the provision of user education. Software has been developed for the implementation of traffic quotas, dynamic firewalling and visualisation. The research investigates various methods for monitoring and management of IP traffic with particular applicability to low bandwidth links. Several forms of visualisation for the analysis of historical and near-realtime traffic data are also discussed, including the use of three-dimensional landscapes. A number of bandwidth management practices are proposed, and the advantages of their combination, and complementary use are highlighted. By implementing these suggested policies, a holistic approach can be taken to the issue of bandwidth management on Internet links.
APA, Harvard, Vancouver, ISO, and other styles
31

Krishnaswamy, Vijay. "Heuristic network generator : an expert systems approach for selection of alternative routes during incident conditions /." Thesis, This resource online, 1994. http://scholar.lib.vt.edu/theses/available/etd-05022009-040559/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Thorri, Sigurdsson Thorsteinn. "Road traffic congestion detection and tracking with Spark Streaming analytics." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-254874.

Full text
Abstract:
Road traffic congestion causes several problems. For instance, slow moving traffic in congested regions poses a safety hazard to vehicles approaching the congested region, and increased commuting times lead to higher transportation costs and increased pollution. The work carried out in this thesis aims to detect and track road traffic congestion in real time. Real-time road congestion detection is important to allow for mechanisms to e.g. improve traffic safety by sending advanced warnings to drivers approaching a congested region and to mitigate congestion by controlling adaptive speed limits. In addition, the tracking of the evolution of congestion in time and space can be a valuable input to the development of the road network. Traffic sensors in Stockholm's road network are represented as a directed weighted graph and the congestion detection problem is formulated as a streaming graph processing problem. The connected components algorithm and existing graph processing algorithms originally used for community detection in social network graphs are adapted for the task of road congestion detection. The results indicate that a congestion detection method based on the streaming connected components algorithm and the incremental Dengraph community detection algorithm can detect congestion with accuracy at best up to 94% for connected components and up to 88% for Dengraph. A method based on hierarchical clustering is able to detect congestion while missing details such as shockwaves, and the Louvain modularity algorithm for community detection fails to detect congested regions in the traffic sensor graph. Finally, the performance of the implemented streaming algorithms is evaluated with respect to the real-time requirements of the system, their throughput and memory footprint.
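The core idea above, grouping adjacent congested sensors into congested regions via connected components, can be illustrated with a minimal batch union-find sketch. This is not the thesis's streaming Spark implementation; the graph representation and congestion labels are assumed for illustration.

```python
# Minimal connected-components sketch: given edges between adjacent road
# sensors and the set of sensors currently flagged as congested, return
# the connected congested regions using union-find with path compression.
def congested_regions(edges, congested):
    """edges: iterable of (u, v) sensor pairs; congested: set of sensor ids."""
    parent = {s: s for s in congested}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    # Union only edges whose both endpoints are congested.
    for u, v in edges:
        if u in congested and v in congested:
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv

    regions = {}
    for s in congested:
        regions.setdefault(find(s), set()).add(s)
    return list(regions.values())
```

A streaming version would maintain these components incrementally as sensor readings cross the congestion threshold, rather than recomputing from scratch each batch.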
APA, Harvard, Vancouver, ISO, and other styles
33

Trinh, Viet. "Using voicexml to provide real-time traffic information." Honors in the Major Thesis, University of Central Florida, 2002. http://digital.library.ucf.edu/cdm/ref/collection/ETH/id/307.

Full text
Abstract:
This item is only available in print in the UCF Libraries. If this is your Honors Thesis, you can help us make it available online for use by researchers around the world by following the instructions on the distribution consent form at http://library.ucf.edu/Systems/DigitalInitiatives/DigitalCollections/InternetDistributionConsentAgreementForm.pdf You may also contact the project coordinator, Kerri Bottorff, at kerri.bottorff@ucf.edu for more information.
Bachelors
Engineering
Computer Engineering
APA, Harvard, Vancouver, ISO, and other styles
34

Smith, Katie S. "A profile of HOV lane vehicle characteristics on I-85 prior to HOV-to-HOT conversion." Thesis, Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/42923.

Full text
Abstract:
The conversion of high-occupancy vehicle (HOV) lanes to high-occupancy toll (HOT) lanes is currently being implemented in metro Atlanta on a demonstration basis and is under consideration for more widespread adoption throughout the metro region. Further conversion of HOV lanes to HOT lanes is a major policy decision that depends on knowledge of the likely impacts, including the equity of the new HOT lane. Rather than estimating these impacts using modeling or surveys, this study collects revealed preference data in the form of observed vehicle license plate data and vehicle occupancy data from users of the HOV corridor. Building on a methodology created in Spring 2011, researchers developed a new methodology for matching license plate data to vehicle occupancy data that required extensive post-processing of the data. The new methodology also presented an opportunity to take an in-depth look at errors in both occupancy and license plate data (in terms of data collection efforts, processing, and the vehicle registration database). Characteristics of individual vehicles were determined from vehicle registration records associated with the license plate data collected during AM and PM peak periods immediately prior to the HOV lanes' conversion to HOT lanes. More than 70,000 individual vehicle license plates were collected for analysis, and over 3,500 records were matched to occupancy values. Analysis of these data has shown that government and commercial vehicles were more prevalent in the HOV lane, while hybrid and alternative fuel vehicles were much less common in either lane than expected. Vehicle occupancy data from the first four quarters of data collection were used to create the distribution of occupancy on the HOV and general purpose lanes, and then the matched occupancy and license plate data were examined.
A sensitivity analysis of the occupancy data established that the current use of uncertain occupancy values is acceptable and that bus and vanpool occupancy should be considered when determining the average occupancy of all vehicles on the HOV lane. Using a bootstrap analysis, vehicle values were compared with vehicle occupancy values, and no correlation was found between vehicle value and vehicle occupancy. A conclusions section suggests possible impacts of the findings on policy decisions as Georgia considers expanding the HOT network. Further research using these data, and additional data that will be collected after the HOT lane opens, will include emissions modeling and a study of changes in vehicle characteristics associated with the HOT lane conversion.
APA, Harvard, Vancouver, ISO, and other styles
35

Nkhumeleni, Thizwilondi Moses. "Correlation and comparative analysis of traffic across five network telescopes." Thesis, Rhodes University, 2014. http://hdl.handle.net/10962/d1011668.

Full text
Abstract:
Monitoring unused IP address space by using network telescopes provides a favourable environment for researchers to study and detect malware, worms, denial of service and scanning activities. Research in the field of network telescopes has progressed over the past decade, resulting in the development of an increased number of overlapping datasets. Rhodes University's network of telescope sensors has continued to grow, with additional network telescopes being brought online. At the time of writing, Rhodes University has a distributed network of five relatively small /24 network telescopes. With five network telescope sensors, this research focuses on comparative and correlation analysis of traffic activity across the network of telescope sensors. To aid summarisation and visualisation techniques, time series representing time-based traffic activity are constructed. By employing an iterative experimental process on the captured traffic, two natural categories of the five network telescopes are presented. Using the cross- and auto-correlation methods of time series analysis, moderate correlation of traffic activity was achieved between telescope sensors in each category. Weak to moderate correlation was calculated when comparing category A and category B network telescopes' datasets. Results were significantly improved by studying TCP traffic separately. Moderate to strong correlation coefficients in each category were calculated when using TCP traffic only. UDP traffic analysis showed weaker correlation between sensors; however, the uniformity of ICMP traffic showed correlation of traffic activity across all sensors. The results confirmed the visual observation of traffic relativity in telescope sensors within the same category and quantitatively analysed the correlation of network telescopes' traffic activity.
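The cross-correlation comparison described above can be sketched as follows, using hypothetical hourly packet-count series in place of the telescope datasets.

```python
# Pearson correlation of two equal-length traffic-count time series.
def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Cross-correlation: correlate y shifted by each lag against x. A peak at a
# nonzero lag suggests one sensor sees the same activity earlier or later.
def cross_correlation(x, y, max_lag):
    return {
        lag: pearson(x[: len(x) - lag], y[lag:])
        if lag >= 0
        else pearson(x[-lag:], y[: len(y) + lag])
        for lag in range(-max_lag, max_lag + 1)
    }
```

Running this per protocol (TCP, UDP, ICMP sub-series) rather than on the aggregate counts mirrors the abstract's finding that protocol-level analysis sharpens the correlations.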
APA, Harvard, Vancouver, ISO, and other styles
36

Tacic, Ivan. "Efficient Synchronized Data Distribution Management in Distributed Simulations." Diss., Georgia Institute of Technology, 2005. http://hdl.handle.net/1853/6822.

Full text
Abstract:
Data distribution management (DDM) is a mechanism to interconnect data producers and data consumers in a distributed application. Data producers provide useful data to consumers in the form of messages. For each message produced, DDM determines the set of data consumers interested in receiving the message and delivers it to those consumers. We are particularly interested in DDM techniques for parallel and distributed discrete event simulations. Thus far, researchers have treated synchronization of events (i.e. time management) and DDM independently of each other. This research focuses on how to realize time-managed DDM mechanisms. The main reason for time-managed DDM is to ensure that changes in the routing of messages from producers to consumers occur in a correct sequence. Time-managed DDM also avoids non-determinism in the federation execution, which may result in non-repeatable executions. An optimistic approach to time-managed DDM is proposed where one allows DDM events to be processed out of time stamp order, but a detection and recovery procedure is used to recover from such errors. These mechanisms are tailored to the semantics of the DDM operations to ensure an efficient realization. A correctness proof is presented to verify that the algorithm correctly synchronizes DDM events. We have developed a fully distributed implementation of the algorithm within the framework of the Georgia Tech Federated Simulation Development Kit (FDK) software. A performance evaluation of the synchronized DDM mechanism has been completed in a loosely coupled distributed system consisting of a network of workstations connected over a local area network (LAN). We compare time-managed versus unsynchronized DDM for two applications that exercise different mobility patterns: one based on a military simulation and a second utilizing a synthetic workload. The experiments and analysis illustrate that synchronized DDM performance depends on several factors: the simulation model (e.g. lookahead), the application's mobility patterns, and the network hardware (e.g. size of network buffers). Under certain mobility patterns, time-managed DDM is as efficient as unsynchronized DDM. There are also mobility patterns where time-managed DDM overheads become significant, and we show how they can be reduced.
APA, Harvard, Vancouver, ISO, and other styles
37

Barraclough, Peter J. "Common method variance and other sources of bias in road traffic research." Thesis, Queensland University of Technology, 2017. https://eprints.qut.edu.au/104818/1/Peter_Barraclough_Thesis.pdf.

Full text
Abstract:
A series of studies examined the extent to which method bias, primarily Common Method Variance, potentially affects road safety studies. A meta-analysis, examining self-reported and archival records of traffic offences and crashes, found differences in the size of the effects produced by the two data types. The research also found some evidence to suggest that effect sizes are inflated when dichotomous scales are used in preference to Likert scales, and also when both the predictor and the predicted variables are gathered and analysed in the same manner from the same source.
APA, Harvard, Vancouver, ISO, and other styles
38

Füssl, Elisabeth, and Juliane Haupt. "Understanding cyclist identity and related interaction strategies. A novel approach to traffic research." Elsevier, 2016. https://publish.fid-move.qucosa.de/id/qucosa%3A72334.

Full text
Abstract:
It is an established fact that interaction of road users is crucial for road safety. However, what governs people's behaviour in interaction with others, and what these interactions mean, is not well documented. The present study introduces a novel approach for traffic safety research and puts the cyclist identity at the centre of attention, in order to answer the question of how the heterogeneity of cyclists, in terms of applied interaction strategies and opinions towards infrastructure and traffic safety, can be explained. For this purpose, a qualitative study following the Grounded Theory methodology has been carried out. Fifteen in-depth interviews with cyclists in Vienna were analysed in order to obtain data about these questions. As a result, we present a model sketch of constructing a cyclist identity, which serves as a framework linking different power relations in traffic, the switching perspectives of being a cyclist/car user, and the changing conditions of cycling traffic policy through interaction strategies of self-portrayal, power demonstration and coping with fear. Finally, we argue that applying the often overlooked concept of 'identity' can bring new concepts into the debate on traffic safety for cyclists and support efficient traffic policy making.
APA, Harvard, Vancouver, ISO, and other styles
39

Glick, Travis Bradley. "Utilizing High-Resolution Archived Transit Data to Study Before-and-After Travel-Speed and Travel-Time Conditions." PDXScholar, 2017. https://pdxscholar.library.pdx.edu/open_access_etds/4065.

Full text
Abstract:
Travel times, operating speeds, and service reliability influence costs and service attractiveness. This paper outlines an approach to quantify how these metrics change after a modification of roadway design or transit routes using archived transit data. The Tri-County Metropolitan Transportation District of Oregon (TriMet), Portland's public transportation provider, archives automatic vehicle location (AVL) data for all buses as part of their bus dispatch system (BDS). This research combines three types of AVL data (stop event, stop disturbance, and high-resolution) to create a detailed account of transit behavior; this probe data gives insights into the behavior of transit as well as general traffic. The methodology also includes an updated approach to confidence interval estimation that more accurately represents the range of speed and travel-time percentile estimates. This methodology is applied to three test cases using a month of AVL data collected before and after the implementation of each roadway change. The results of the test cases highlight the broad applicability of this approach to before-and-after studies.
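The pairing of percentile estimates with confidence intervals mentioned above might be sketched as follows. The bootstrap resampling scheme and all parameters here are illustrative assumptions, not TriMet's or the thesis's actual method.

```python
import random

# Simple percentile estimate by nearest-rank interpolation index.
def percentile(data, p):
    s = sorted(data)
    k = max(0, min(len(s) - 1, int(round(p / 100 * (len(s) - 1)))))
    return s[k]

# Bootstrap confidence interval for a percentile: resample the travel-time
# (or speed) observations with replacement, recompute the percentile each
# time, and take the alpha/2 and 1-alpha/2 quantiles of those estimates.
def bootstrap_ci(data, p, n_boot=1000, alpha=0.05, seed=42):
    rng = random.Random(seed)  # fixed seed for reproducibility
    stats = sorted(
        percentile([rng.choice(data) for _ in data], p) for _ in range(n_boot)
    )
    lo = stats[int(alpha / 2 * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

Reporting the interval alongside the point estimate makes clear how much of an observed before-and-after change in, say, the median travel time could be sampling noise.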
APA, Harvard, Vancouver, ISO, and other styles
40

Wang, Hong Feng. "IGP traffic engineering : a comparison of computational optimization algorithms." Thesis, Stellenbosch : Stellenbosch University, 2008. http://hdl.handle.net/10019.1/20877.

Full text
Abstract:
Thesis (MSc)--Stellenbosch University, 2008.
ENGLISH ABSTRACT: Traffic Engineering (TE) is intended to be used in next generation IP networks to optimize the usage of network resources by effecting QoS agreements between the traffic offered to the network and the available network resources. TE is currently performed by the IP community using three methods: (1) IGP TE using connectionless routing optimization, (2) MPLS TE using connection-oriented routing optimization, and (3) hybrid TE combining IGP TE with MPLS TE. MPLS has won the battle of the core of the Internet and is making its way into metro, access and even some private networks. However, emerging provider practices are revealing the relevance of using IGP TE in hybrid TE models where IGP TE is combined with MPLS TE to optimize IP routing. This is done by either optimizing IGP routing while setting a small number of MPLS tunnels in the network, or optimizing the management of MPLS tunnels to allow growth for the IGP traffic, or optimizing both IGP and MPLS routing in a hybrid IGP+MPLS setting. The focus of this thesis is on IGP TE using heuristic algorithms borrowed from the computational intelligence research field. We present four classes of algorithms for Maximum Link Utilization (MLU) minimization: Genetic Algorithm (GA), Gene Expression Programming (GEP), Ant Colony Optimization (ACO), and Simulated Annealing (SA). We use these algorithms to compute a set of optimal link weights to achieve IGP TE in different settings, where a set of test networks representing Europe, USA, Africa and China are used. Using NS simulation, we compare the performance of these algorithms on the test networks with various traffic profiles.
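The objective the four heuristics above share, minimizing Maximum Link Utilization for a candidate link-weight setting, can be evaluated with a shortest-path routing sketch. The topology, capacities, and demands below are hypothetical; a real evaluation would also handle equal-cost multipath splitting, which this sketch omits.

```python
import heapq

# Dijkstra shortest path under a candidate set of IGP link weights.
def shortest_path(graph, weights, src, dst):
    """graph: {node: [neighbor, ...]}, weights: {(u, v): weight}."""
    dist, prev = {src: 0}, {}
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v in graph.get(u, []):
            nd = d + weights[(u, v)]
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return list(reversed(path))

# Route every demand on its shortest path and report the MLU, the quantity
# the GA/GEP/ACO/SA heuristics minimize over candidate weight settings.
def max_link_utilization(graph, weights, capacities, demands):
    load = {e: 0.0 for e in weights}
    for (s, t), volume in demands.items():
        p = shortest_path(graph, weights, s, t)
        for u, v in zip(p, p[1:]):
            load[(u, v)] += volume
    return max(load[e] / capacities[e] for e in weights)
```

Each heuristic then searches the space of weight vectors, calling an evaluation like this as its fitness function.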
APA, Harvard, Vancouver, ISO, and other styles
41

Khatri, Chandra P. "Real-time road traffic information detection through social media." Thesis, Georgia Institute of Technology, 2015. http://hdl.handle.net/1853/53889.

Full text
Abstract:
In the current study, a mechanism is proposed to extract traffic-related information, such as congestion and incidents, from textual data on the internet. The current data source is Twitter; however, the same mechanism can be extended to any kind of text available on the internet. As the data under consideration are extremely large, automated models are developed to stream, download, and mine the data in real time. Furthermore, whenever a tweet carries traffic-related information, the models should be able to infer and extract it. To pursue this task, Artificial Intelligence, Machine Learning, and Natural Language Processing techniques are used. These models are designed so that they can detect traffic congestion and traffic incidents from the Twitter stream at any location. Currently, the data are collected only for the United States. The data were collected for 85 days (50 complete and 35 partial), randomly sampled over a span of five months (September 2014 to February 2015); a total of 120,000 geo-tagged traffic-related tweets were extracted, while six million geo-tagged non-traffic-related tweets were retrieved. The classification models for detecting traffic congestion and incidents are trained on this dataset. Furthermore, these data are also used for various kinds of spatial and temporal analysis. A mechanism is proposed to calculate the level of traffic congestion, safety, and traffic perception for cities in the U.S. Traffic congestion and safety rankings for the various urban areas are obtained and then statistically validated against existing, widely adopted rankings. Traffic perception depicts the attitude and perception of people towards traffic. It is also observed that traffic-related data, when visualized spatially and temporally, exhibit the same patterns as the actual traffic flows for various urban areas.
When visualized at the city level, it is clearly visible that the flow of tweets resembles the flow of vehicles and that the traffic-related tweets are representative of traffic within the cities. With all the findings of the current study, it is shown that a significant amount of traffic-related information can be extracted from Twitter and other sources on the internet. Furthermore, Twitter and these data sources are freely available and are not bound by spatial and temporal limitations. That is, wherever there is a user, there is a potential source of data.
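The tweet classification step described in the abstract can be illustrated with a minimal multinomial Naive Bayes text classifier in pure Python. This is an illustrative sketch, not the models used in the thesis; the class name, the tiny training corpus, and the add-one smoothing choice are all assumptions made for the example.

```python
import math
from collections import Counter

def tokenize(text):
    """Lowercase a tweet and split it into word tokens, stripping punctuation."""
    return [w.strip(".,!?#@") for w in text.lower().split() if w.strip(".,!?#@")]

class NaiveBayesTweetClassifier:
    """Multinomial Naive Bayes with add-one (Laplace) smoothing."""

    def fit(self, tweets, labels):
        self.labels = set(labels)
        self.label_counts = Counter(labels)
        self.word_counts = {l: Counter() for l in self.labels}
        for tweet, label in zip(tweets, labels):
            self.word_counts[label].update(tokenize(tweet))
        self.vocab = {w for c in self.word_counts.values() for w in c}
        return self

    def predict(self, tweet):
        n_docs = sum(self.label_counts.values())
        v = len(self.vocab)
        scores = {}
        for l in self.labels:
            total = sum(self.word_counts[l].values())
            # log prior plus smoothed log likelihood of each token
            score = math.log(self.label_counts[l] / n_docs)
            for w in tokenize(tweet):
                score += math.log((self.word_counts[l][w] + 1) / (total + v))
            scores[l] = score
        return max(scores, key=scores.get)

clf = NaiveBayesTweetClassifier().fit(
    ["heavy traffic jam on the interstate", "accident blocking two lanes",
     "great coffee this morning", "watching the game tonight"],
    ["traffic", "traffic", "other", "other"])
```

A classifier along these lines, trained at a far larger scale and with richer features, is the kind of component the abstract describes for separating traffic-related tweets from the rest of the stream.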
APA, Harvard, Vancouver, ISO, and other styles
42

Vock, Dominik. "Automatic segmentation and reconstruction of traffic accident scenarios from mobile laser scanning data." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2014. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-141582.

Full text
Abstract:
Virtual reconstruction of historic sites, planning of restorations and attachments of new building parts, as well as forest inventory are a few examples of fields that benefit from the application of 3D surveying data. Compared with the original 2D photo-based documentation and manual distance measurements, the 3D information obtained from multi-camera and laser scanning systems noticeably improves surveying times and the amount of generated 3D information. The 3D data allow detailed post-processing and better visualization of all relevant spatial information. Yet, extracting the required information from the raw scan data and generating usable visual output still demands time-consuming, complex user-driven data processing with the commercially available 3D software tools. In this context, automatic object recognition from 3D point cloud and depth data has been discussed in many different works. The developed tools and methods, however, usually focus only on a certain kind of object or on the detection of learned invariant surface shapes. Although the resulting methods are applicable for certain data segmentation practices, they are not necessarily suitable for arbitrary tasks due to the varying requirements of the different fields of research. This thesis presents a more widely applicable solution for automatic scene reconstruction from 3D point clouds, targeting street scenarios, specifically the task of traffic accident scene analysis and documentation. The data, obtained by sampling the scene with a mobile scanning system, are evaluated, segmented, and finally used to generate detailed 3D information of the scanned environment. To realize this aim, this work adapts and validates various existing approaches to laser scan segmentation with regard to accident-relevant scene information, including road surfaces and markings, vehicles, walls, trees and other salient objects.
The approaches are evaluated regarding their suitability and limitations for the given tasks, as well as regarding possibilities for combined application together with other procedures. The knowledge obtained is used to develop new algorithms and procedures that allow a satisfactory segmentation and reconstruction of the scene, corresponding to the available sampling densities and precisions. Besides the segmentation of the point cloud data, this thesis presents different visualization and reconstruction methods to achieve a wider range of possible applications of the developed system for data export and utilization in different third-party software tools.
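As a toy illustration of the point cloud segmentation task the abstract describes, the sketch below labels ground versus object points by binning a cloud into an XY grid and thresholding heights against each cell's lowest point, a common baseline for road surface extraction. The function name and parameters are hypothetical and this is not the thesis's algorithm.

```python
from collections import defaultdict

def segment_ground(points, cell_size=1.0, height_tol=0.2):
    """Label each (x, y, z) point as 'ground' or 'object'.

    Points are binned into an XY grid; within each cell, points lying
    within height_tol of the cell's lowest point are treated as ground,
    everything higher (vehicles, walls, trees) as object.
    """
    cells = defaultdict(list)
    for i, (x, y, z) in enumerate(points):
        cells[(int(x // cell_size), int(y // cell_size))].append(i)
    labels = ["object"] * len(points)
    for idx in cells.values():
        z_min = min(points[i][2] for i in idx)
        for i in idx:
            if points[i][2] - z_min <= height_tol:
                labels[i] = "ground"
    return labels
```

Real mobile laser scans need far more care (slopes, curbs, noise), which is exactly the kind of limitation the thesis evaluates before combining segmentation procedures.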
APA, Harvard, Vancouver, ISO, and other styles
43

Rao, Shrisha. "Safety and hazard analysis in concurrent systems." Diss., University of Iowa, 2005. http://ir.uiowa.edu/etd/106.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Ahmed, Mohamed. "Multi-Level Safety Performance Functions for High Speed Facilities." Doctoral diss., University of Central Florida, 2012. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/5091.

Full text
Abstract:
High speed facilities are considered the backbone of any successful transportation system; Interstates, freeways, and expressways carry the majority of daily trips on the transportation network. Although these road types are considered relatively the safest, they still experience many crashes, many of them severe, which not only affect human lives but can also have tremendous economic and social impacts. These facts signify the necessity of enhancing the safety of these high speed facilities to ensure better and more efficient operation. Safety problems can be assessed through several approaches that help mitigate crash risk on both a long- and short-term basis. Therefore, the main focus of the research in this dissertation is to provide a risk assessment framework to promote safety and enhance mobility on freeways and expressways. Multi-level Safety Performance Functions (SPFs) were developed at the aggregate level using historical crash data and the corresponding exposure and risk factors to identify and rank sites with promise (hot-spots). Additionally, SPFs were developed at the disaggregate level utilizing real-time weather data collected from meteorological stations located along the freeway section, as well as traffic flow parameters collected from different detection systems such as Automatic Vehicle Identification (AVI) and Remote Traffic Microwave Sensors (RTMS). These disaggregate SPFs can identify real-time risks due to turbulent traffic conditions and their interactions with other risk factors. In this study, two main datasets were obtained from two different regions. These datasets comprise historical crash data, roadway geometric characteristics, aggregate weather and traffic parameters, as well as real-time weather and traffic data.
At the aggregate level, Bayesian hierarchical models with spatial and random effects were compared to Poisson models to examine the safety effects of roadway geometrics on crash occurrence along freeway sections featuring mountainous terrain and adverse weather. At the disaggregate level, a main framework for a proactive safety management system was provided, using traffic data collected from AVI and RTMS, real-time weather, and geometric characteristics. Different statistical techniques were implemented, ranging from classical frequentist classification approaches, which explain the relationship between an event (crash) occurring at a given time and a set of risk factors in real time, to more advanced models. A Bayesian updating approach, which combines prior knowledge with new observations to achieve more reliable parameter estimation, was implemented. A relatively recent and promising machine learning technique (Stochastic Gradient Boosting) was also utilized to calibrate several models on different datasets collected from mixed detection systems as well as real-time meteorological stations. The results of this study suggest that both levels of analysis are important: the aggregate level provides a good understanding of different safety problems and supports the development of policies and countermeasures to reduce the total number of crashes. At the disaggregate level, real-time safety functions support a more proactive traffic management system that will not only enhance the performance of high speed facilities and the whole traffic network but also provide safer mobility for people and goods. In general, the proposed multi-level analyses are useful in providing roadway authorities with detailed information on where countermeasures must be implemented and when resources should be devoted.
The study also proves that traffic data collected from different detection systems can be a useful asset, to be utilized appropriately not only to alleviate traffic congestion but also to mitigate increased safety risks. The overall proposed framework can maximize the benefit of the existing archived data for freeway authorities as well as for road users.
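One common way to turn aggregate SPF predictions into a hot-spot ranking is the Empirical Bayes method popularized by the Highway Safety Manual. The sketch below illustrates that general idea with invented site data; it is not necessarily the ranking procedure calibrated in the dissertation.

```python
def eb_expected_crashes(observed, predicted, overdispersion):
    """Empirical Bayes estimate of a site's long-term expected crash count.

    Combines the SPF prediction with the observed count, weighting the
    prediction more heavily when the SPF is less overdispersed:
        w  = 1 / (1 + k * predicted)
        EB = w * predicted + (1 - w) * observed
    where k is the SPF's overdispersion parameter.
    """
    w = 1.0 / (1.0 + overdispersion * predicted)
    return w * predicted + (1.0 - w) * observed

def rank_hotspots(sites, overdispersion=0.5):
    """Rank (name, observed, predicted) sites by excess crashes,
    i.e. the EB estimate minus the SPF prediction for similar sites."""
    scored = []
    for name, observed, predicted in sites:
        eb = eb_expected_crashes(observed, predicted, overdispersion)
        scored.append((name, eb - predicted))
    return sorted(scored, key=lambda t: t[1], reverse=True)

# Hypothetical sites: observed crashes vs. SPF-predicted crashes.
sites = [("A", 12, 5.0), ("B", 4, 5.0), ("C", 9, 8.0)]
```

Ranking by excess rather than raw counts is the usual rationale for "sites with promise": it flags locations performing worse than the SPF predicts for comparable exposure, rather than merely high-volume ones.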
ID: 031988164; System requirements: World Wide Web browser and PDF reader; Mode of access: World Wide Web; Thesis (Ph.D.)--University of Central Florida, 2012; Includes bibliographical references.
Ph.D.
Doctorate
Civil, Environmental, and Construction Engineering
Engineering and Computer Science
Civil Engineering
APA, Harvard, Vancouver, ISO, and other styles
45

Milluzzi, Anthony J. "An Avian Target Processing Algorithm to Mitigate Bird Strike Risk in Aviation." Ohio University / OhioLINK, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1556144831508548.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Samples, Agnes Mary Banks. "Validity of Self-Reported Data on Seat Belt Use: The Behavioral Risk Factor Surveillance System." [Johnson City, Tenn. : East Tennessee State University], 2004. http://etd-submit.etsu.edu/etd/theses/available/etd-0315104-172201/unrestricted/SamplesA032604f.pdf.

Full text
Abstract:
Thesis (Ed. D.)--East Tennessee State University, 2004.
Title from electronic submission form. ETSU ETD database URN: etd-0315104-172201. Includes bibliographical references. Also available via Internet at the UMI web site.
APA, Harvard, Vancouver, ISO, and other styles
47

Zapletal, Dominik. "Preemptivní bezpečnostní analýza dopravního chování z trajektorií." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2018. http://www.nusl.cz/ntk/nusl-385900.

Full text
Abstract:
This work deals with the problem of preemptive safety analysis of road user behaviour. The safety analysis is based on processing road user trajectories obtained from aerial videos captured by drones. A system for detecting traffic conflicts from spatio-temporal data is presented in this work. The standard approach to evaluating proactive traffic conflict indicators was extended by simulating the movement of traffic objects in the scene using Ackermann steering geometry in order to obtain more accurate results.
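The trajectory simulation idea can be sketched with a kinematic bicycle model under Ackermann steering geometry, together with a crude conflict proxy based on the minimum separation between two predicted paths. This is an illustrative sketch; the function names and parameters are assumptions, not the thesis code.

```python
import math

def turning_radius(wheelbase, steering_angle):
    """Turning radius of the rear-axle midpoint under Ackermann geometry:
    R = L / tan(delta). A straight path (delta = 0) gives an infinite radius."""
    if abs(steering_angle) < 1e-9:
        return math.inf
    return wheelbase / math.tan(steering_angle)

def predict_path(x, y, heading, speed, wheelbase, steering_angle,
                 dt=0.1, steps=50):
    """Forward-simulate a kinematic bicycle model, returning (x, y) samples."""
    path = [(x, y)]
    for _ in range(steps):
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
        heading += speed / wheelbase * math.tan(steering_angle) * dt
        path.append((x, y))
    return path

def min_separation(path_a, path_b):
    """Smallest simultaneous distance between two predicted trajectories,
    a crude proxy for conflict indicators such as time-to-collision."""
    return min(math.dist(p, q) for p, q in zip(path_a, path_b))
```

Extrapolating each observed trajectory with such a model, instead of a straight line, is the kind of refinement the abstract describes for sharpening proactive conflict indicators.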
APA, Harvard, Vancouver, ISO, and other styles
48

Ha, Wai On. "Empirical studies toward DRP constructs and a model for DRP development for information systems function." HKBU Institutional Repository, 2002. http://repository.hkbu.edu.hk/etd_ra/432.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Bastos, Jorge Tiago. "Road safety strategic analysis in Brazil: indicator and index research." Universidade de São Paulo, 2014. http://www.teses.usp.br/teses/disponiveis/18/18144/tde-08042015-103747/.

Full text
Abstract:
The intense economic growth that Brazil has experienced in recent decades, and the consequent explosive motorization process, have resulted in an undesirable impact: a continuously increasing trend in traffic fatality numbers. This study presents research on indicators and indexes with the objective of delivering both overall and disaggregated evidence about road safety performance and fatality reduction targets in Brazil at the state level, taking exposure into account. The intention is to support road safety strategic analysis in the country and to contribute to improving this critical scene. The methodological structure of this thesis consists of three main parts: (I) diagnosing the road safety situation at the state level using final-outcome-related information, in particular traffic fatality risk data; (II) setting a target number of traffic fatalities based on the relationship between the exposure level and the number of traffic fatalities in each state; and (III) suggesting domains for improvement based on research into safety performance indicators representing three domains (road user, environment and vehicle) across the states. From a benchmarking point of view, we divided the Brazilian states into three separate clusters in order to provide more realistic comparisons of state performance. After a data collection and indicator selection step, Data Envelopment Analysis (DEA) was the method used for executing the different steps, with the application of four different types of models specially developed for the identified research purposes. In addition, by bootstrapping the DEA scores we measured the sensitivity of the results to possible variations in the input data, for example concerning data quality and availability.
As a result, we provided a road safety diagnosis per state as well as traffic fatality targets according to different perspectives: the entire group of road users (motorized and non-motorized), motor vehicle occupants, and finally a disaggregated performance evaluation obtained by running four separate DEA models (for motorcycles, cars, trucks and buses). Moreover, the SPI research, comprising a hierarchy of 27 safety performance indicators, expressed the states' relative performance in the main road safety domains. Lastly, state profiles compiling all this information summarized the "per state" findings.
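In the special case of a single input and a single output, the constant-returns-to-scale (CCR) DEA model used for this kind of benchmarking collapses to a simple productivity ratio, which makes the idea easy to sketch. The state names and figures below are invented; the dissertation's models handle multiple inputs and outputs and are solved via linear programming.

```python
def dea_efficiency(units):
    """CCR (constant returns to scale) DEA efficiency for decision-making
    units described by a single input and a single output.

    With one input and one output the linear program collapses to a ratio:
    each unit's output/input productivity divided by the best productivity,
    so the frontier unit scores exactly 1.0.
    """
    ratios = {name: output / inp for name, inp, output in units}
    best = max(ratios.values())
    return {name: r / best for name, r in ratios.items()}

# Hypothetical states: input = traffic fatalities (to be minimized),
# output = exposure in billion vehicle-km, a common framing in road
# safety DEA where low fatalities per unit of exposure means efficiency.
scores = dea_efficiency([("A", 100, 10.0), ("B", 200, 15.0), ("C", 50, 6.0)])
```

Bootstrapping, as described in the abstract, would repeat this computation on resampled data to attach confidence intervals to each state's score.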
APA, Harvard, Vancouver, ISO, and other styles
50

Jeannesson, Clément. "Development of a methodology to exploit nuclear data in the unresolved resonance range and the impact on criticality safety and reactor applications." Thesis, université Paris-Saclay, 2020. http://www.theses.fr/2020UPASP074.

Full text
Abstract:
Neutronics computations are widely used in reactor physics and criticality calculations to ensure the safety and operation of nuclear facilities. They rely on nuclear data, which describe neutron-matter interactions. Among them, cross sections are fundamental data that express the probability for a particular reaction to occur as a function of the incident neutron energy. In the intermediate-to-high energy range, individual resonances in the cross sections can no longer be experimentally resolved, which defines the so-called unresolved resonance range. There, cross sections can only be computed as average values from average experimentally determined parameters, as well as in the form of probability tables. The latter are a discretized form of the cross section probability distributions, determined with a Monte-Carlo-based technique called the ladder method. This thesis aims at proposing a robust methodology to process cross sections in the unresolved resonance range. In particular, the work carried out deals with statistical sampling of resonances in the framework of the ladder method. Several issues are tackled, among them the impact of the number of sampled resonances on the cross section calculations, as well as the number of Monte-Carlo iterations performed. A relation is established between these quantities and the input resonance parameters, namely the ratio of the average resonance spacing to the average total reaction width. Calculations are done for the constituents of entire nuclear data libraries, which is a strength of this work. Then, random matrix theory is introduced to produce more physical sets of resonances that account for correlations between the resonance spacings. All calculations are compared to the outcomes of the Hauser-Feshbach formalism for the calculation of average cross sections. When the latter are computed using the Moldauer approximation, the results agree well. Several probability table construction methods are then studied.
Two innovative methods are introduced, based on k-clustering algorithms. Benchmark calculations using neutronics codes complete the results and make it possible to formulate a detailed methodology for nuclear data processing in the unresolved resonance range.
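The core sampling step of the ladder method, drawing resonance energies whose nearest-neighbour spacings follow the Wigner surmise from random matrix theory, can be sketched by inverse-transform sampling. This is the simplest i.i.d.-spacing baseline (it does not reproduce the long-range GOE correlations the thesis addresses with full random matrix theory), and the function names are assumptions, not the thesis implementation.

```python
import math
import random

def sample_wigner_spacing(rng, mean_spacing=1.0):
    """Draw one level spacing from the Wigner surmise,
    p(s) = (pi/2) s exp(-pi s^2 / 4)  (unit mean),
    via inverse-transform sampling of its CDF F(s) = 1 - exp(-pi s^2 / 4),
    then scale to the requested mean spacing."""
    u = rng.random()
    return mean_spacing * math.sqrt(-4.0 * math.log(1.0 - u) / math.pi)

def build_ladder(rng, e_start, n_resonances, mean_spacing):
    """Generate one 'ladder' of resonance energies above e_start by
    accumulating Wigner-distributed spacings."""
    energies = []
    e = e_start
    for _ in range(n_resonances):
        e += sample_wigner_spacing(rng, mean_spacing)
        energies.append(e)
    return energies

rng = random.Random(42)
ladder = build_ladder(rng, 0.0, 20000, 15.0)  # mean spacing D = 15 eV, say
```

In the full method, widths are sampled alongside the energies (typically from chi-squared distributions), cross sections are reconstructed on each ladder, and the process is repeated over many ladders to build the probability tables.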
APA, Harvard, Vancouver, ISO, and other styles