Follow this link to see other types of publications on the topic: Traffic engineering – Data processing.

Theses on the topic "Traffic engineering – Data processing"

Cite a source in APA, MLA, Chicago, Harvard, and many other styles

See the top 50 dissertations / theses (graduate or doctoral) for research on the topic "Traffic engineering – Data processing".

Next to every source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication in .pdf format and read its abstract online whenever it is available in the metadata.

Browse theses from many scientific fields and compile a correct bibliography.

1

賴翰笙 and Hon-seng Lai. "An effective methodology for visual traffic surveillance". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2000. http://hub.hku.hk/bib/B30456708.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Lam, Fung, and 林峰. "Internet inter-domain traffic engineering and optimization". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2001. http://hub.hku.hk/bib/B31224581.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Wolf, Jean Louise. "Using GPS data loggers to replace travel diaries in the collection of travel data". Diss., Georgia Institute of Technology, 2000. http://hdl.handle.net/1853/20203.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Mawji, Afzal. "Achieving Scalable, Exhaustive Network Data Processing by Exploiting Parallelism". Thesis, University of Waterloo, 2004. http://hdl.handle.net/10012/779.

Full text
Abstract:
Telecommunications companies (telcos) and Internet Service Providers (ISPs) monitor the traffic passing through their networks for the purposes of network evaluation and planning for future growth. Most monitoring techniques currently use a form of packet sampling. However, exhaustive monitoring is a preferable solution because it ensures accurate traffic characterization and also allows encoding operations, such as compression and encryption, to be performed. To overcome the very high computational cost of exhaustive monitoring and encoding of data, this thesis suggests exploiting parallelism. By utilizing a parallel cluster in conjunction with load balancing techniques, a simulation is created to distribute the load across the parallel processors. It is shown that a very scalable system, capable of supporting a fairly high data rate, can potentially be designed and implemented. A complete system is then implemented in the form of a transparent Ethernet bridge, ensuring that the system can be deployed into a network without any change to the network. The system focuses its encoding efforts on obtaining the maximum compression rate and, to that end, utilizes the concept of streams, which attempts to separate data packets into individual flows that are correlated and whose redundancy can be removed through compression. Experiments show that compression rates are favourable and confirm good throughput rates and high scalability.
APA, Harvard, Vancouver, ISO, and other styles
5

Wang, Hong Feng. "IGP traffic engineering : a comparison of computational optimization algorithms". Thesis, Stellenbosch : Stellenbosch University, 2008. http://hdl.handle.net/10019.1/20877.

Full text
Abstract:
Thesis (MSc)--Stellenbosch University, 2008.
ENGLISH ABSTRACT: Traffic Engineering (TE) is intended to be used in next generation IP networks to optimize the usage of network resources by effecting QoS agreements between the traffic offered to the network and the available network resources. TE is currently performed by the IP community using three methods: (1) IGP TE using connectionless routing optimization, (2) MPLS TE using connection-oriented routing optimization, and (3) hybrid TE combining IGP TE with MPLS TE. MPLS has won the battle of the core of the Internet and is making its way into metro, access and even some private networks. However, emerging provider practices are revealing the relevance of using IGP TE in hybrid TE models where IGP TE is combined with MPLS TE to optimize IP routing. This is done by either optimizing IGP routing while setting a small number of MPLS tunnels in the network, or optimizing the management of MPLS tunnels to allow growth for the IGP traffic, or optimizing both IGP and MPLS routing in a hybrid IGP+MPLS setting. The focus of this thesis is on IGP TE using heuristic algorithms borrowed from the computational intelligence research field. We present four classes of algorithms for Maximum Link Utilization (MLU) minimization. These include Genetic Algorithm (GA), Gene Expression Programming (GEP), Ant Colony Optimization (ACO), and Simulated Annealing (SA). We use these algorithms to compute a set of optimal link weights to achieve IGP TE in different settings where a set of test networks representing Europe, USA, Africa and China are used. Using NS simulation, we compare the performance of these algorithms on the test networks with various traffic profiles.
AFRIKAANSE OPSOMMING: Verkeersingenieurswese (VI) is aangedui vir gebruik in volgende generasie IP netwerke vir die gebruiksoptimering van netwerkbronne deur die daarstelling van kwaliteit van diens ooreenkomste tussen die verkeersaanbod vir die netwerk en die beskikbare netwerkbronne. VI word huidiglik algemeen bewerkstellig deur drie metodes, insluitend (1) IGP VI gebruikmakend van verbindingslose roete-optimering, (2) MPLS VI gebruikmakend van verbindingsvaste roete-optimering en (3) hibriede VI wat IGP VI en MPLS VI kombineer. MPLS is die mees algemene, en word ook aangewend in metro, toegang en selfs sommige privaatnetwerke. Nuwe verskaffer-praktyke toon egter die relevansie van die gebruik van IGP VI in hibriede VI modelle, waar IGP VI gekombineer word met MPLS VI om IP roetering te optimeer. Dit word gedoen deur `of optimering van IGP roetering terwyl ’n paar MPLS tonnels in die netwerk gestel word, `of optimering van die bestuur van MPLS tonnels om toe te laat vir groei in die IGP verkeer `of die optimering van beide IGP en MPLS roetering in ’n hibriede IGP en MPLS situasie. Die fokus van hierdie tesis is op IGP VI gebruikmakend van heuristieke algoritmes wat ontleen word vanuit die berekeningsintelligensie navorsingsveld. Ons beskou vier klasse van algoritmes vir Maksimum Verbindingsgebruik (MVG) minimering. Dit sluit in genetiese algoritmes, geen-uitdrukkingsprogrammering, mierkoloniemaksimering and gesimuleerde temperoptimering. Ons gebruik hierdie algoritmes om ’n versameling optimale verbindingsgewigte te bereken om IGP VI te bereik in verskillende situasies, waar ’n versameling toetsnetwerke gebruik is wat Europa, VSA, Afrika en China verteenwoordig. Gebruikmakende van NS simulasie, vergelyk ons die werkverrigting van hierdie algoritmes op die toetsnetwerke, met verskillende verkeersprofiele.
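To make the link-weight search concrete: below is a minimal, illustrative sketch (not code from the thesis) of one of the four heuristic classes it compares, simulated annealing, searching integer IGP link weights to reduce the Maximum Link Utilization of a small test network. The graph, the demand matrix, and all parameter values are placeholder assumptions, and routing is simplified to a single shortest path per demand.

import math
import random
import networkx as nx

def mlu(graph, weights, demands):
    # Route each demand on its shortest path under 'weights'; return the max link utilization.
    for (u, v), w in weights.items():
        graph[u][v]["weight"] = w
    load = {e: 0.0 for e in graph.edges}
    for (src, dst), volume in demands.items():
        path = nx.shortest_path(graph, src, dst, weight="weight")
        for u, v in zip(path, path[1:]):
            load[(u, v)] += volume
    return max(load[e] / graph[e[0]][e[1]]["capacity"] for e in graph.edges)

def anneal_weights(graph, demands, steps=2000, w_max=20, t0=1.0, cooling=0.995):
    # Perturb one link weight at a time; accept worse moves with a temperature-dependent probability.
    weights = {e: random.randint(1, w_max) for e in graph.edges}
    best = current = mlu(graph, weights, demands)
    best_weights = dict(weights)
    temp = t0
    for _ in range(steps):
        e = random.choice(list(graph.edges))
        old = weights[e]
        weights[e] = random.randint(1, w_max)
        candidate = mlu(graph, weights, demands)
        if candidate <= current or random.random() < math.exp((current - candidate) / temp):
            current = candidate
            if candidate < best:
                best, best_weights = candidate, dict(weights)
        else:
            weights[e] = old  # revert the rejected move
        temp *= cooling
    return best_weights, best

# Toy example: a small ring network with two demands.
g = nx.DiGraph()
for u, v in [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]:
    g.add_edge(u, v, capacity=10.0)
    g.add_edge(v, u, capacity=10.0)
best_w, best_util = anneal_weights(g, {(0, 2): 6.0, (1, 3): 4.0}, steps=500)
print("best MLU found:", round(best_util, 3))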
APA, Harvard, Vancouver, ISO, and other styles
6

Hwang, Kuo-Ping. "Applying heuristic traffic assignment in natural disaster evacuation: a decision support system". Diss., Virginia Polytechnic Institute and State University, 1986. http://hdl.handle.net/10919/54455.

Full text
Abstract:
The goal of this research is to develop a heuristic traffic assignment method to simulate the traffic flow of a transportation network at a real-time speed. The existing assignment methods are reviewed and a heuristic path-recording assignment method is proposed. Using the new heuristic assignment method, trips are loaded onto the network in a probabilistic approach for the first iteration; paths are recorded, and path impedance is computed as the basis for further assignment iteration. The real-time traffic assignment model developed with the new assignment method is called HEUPRAE. The difference in link traffic between this new assignment and Dial's multipath assignment ranges from 10 to 25 percent. Saving in computer time is about 55 percent. The proposed heuristic path-recording assignment is believed to be an efficient and reliable method. Successful development of this heuristic assignment method helps solve those transportation problems which need assignment results at a real-time speed, and for which the assignment process lasts a couple of hours. Evacuation planning and operation are well suited to the application of this real-time heuristic assignment method. Evacuation planning and operations are major activities in emergency management. Evacuation planning instructs people where to go, which route to take, and the time needed to accomplish an evacuation. Evacuation operations help the execution of an evacuation plan in response to the changing nature of a disaster. The Integrated Evacuation Decision Support System (IEDSS) is a computer system which employs the evacuation planning model, MASSVAC2, and the evacuation operation model, HEUPRAE, to deal with evacuations. The IEDSS uses computer graphics to prepare input and interpret output. It helps a decision maker analyze the evacuation system, review evacuation plans, and issue an evacuation order at a proper time. Users of the IEDSS can work on evacuation problems in a friendly interactive visual environment. The application of the IEDSS to the hurricane and flood problems for the city of Virginia Beach shows how IEDSS is practically implemented. It proves the usefulness of the IEDSS in coping with disasters.
Ph. D.
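For readers unfamiliar with the general idea, the following is a rough sketch of an iterative, path-recording style assignment in the spirit described above; it is not the HEUPRAE model. The BPR delay function, the candidate-path count, and the edge attributes (t0, cap) are my own assumptions.

from itertools import islice
import networkx as nx

def bpr(t0, flow, capacity, alpha=0.15, beta=4):
    # Standard BPR volume-delay function (an assumption; the thesis does not specify one).
    return t0 * (1 + alpha * (flow / capacity) ** beta)

def assign(graph, demands, k_paths=3, iterations=5):
    # Record k candidate paths per OD pair from free-flow times, then iterate on their impedance.
    recorded = {od: list(islice(nx.shortest_simple_paths(graph, *od, weight="t0"), k_paths))
                for od in demands}
    flows = {e: 0.0 for e in graph.edges}
    for _ in range(iterations):
        new_flows = {e: 0.0 for e in graph.edges}
        for od, volume in demands.items():
            # Path impedance from the previous iteration's congested link times.
            imps = [sum(bpr(graph[u][v]["t0"], flows[(u, v)], graph[u][v]["cap"])
                        for u, v in zip(p, p[1:])) for p in recorded[od]]
            shares = [1.0 / i for i in imps]  # probabilistic split: cheaper recorded paths get more trips
            total = sum(shares)
            for p, s in zip(recorded[od], shares):
                for u, v in zip(p, p[1:]):
                    new_flows[(u, v)] += volume * s / total
        flows = new_flows
    return flows

# Toy network: two routes from A to C.
g = nx.DiGraph()
for u, v, t0 in [("A", "B", 5), ("B", "C", 5), ("A", "C", 12)]:
    g.add_edge(u, v, t0=float(t0), cap=1000.0)
print(assign(g, {("A", "C"): 1500.0}))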
APA, Harvard, Vancouver, ISO, and other styles
7

Trinh, Viet. "Using voicexml to provide real-time traffic information". Honors in the Major Thesis, University of Central Florida, 2002. http://digital.library.ucf.edu/cdm/ref/collection/ETH/id/307.

Full text
Abstract:
This item is only available in print in the UCF Libraries. If this is your Honors Thesis, you can help us make it available online for use by researchers around the world by following the instructions on the distribution consent form at http://library.ucf.edu/Systems/DigitalInitiatives/DigitalCollections/InternetDistributionConsentAgreementForm.pdf You may also contact the project coordinator, Kerri Bottorff, at kerri.bottorff@ucf.edu for more information.
Bachelors
Engineering
Computer Engineering
APA, Harvard, Vancouver, ISO, and other styles
8

Smith, Katie S. "A profile of HOV lane vehicle characteristics on I-85 prior to HOV-to-HOT conversion". Thesis, Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/42923.

Full text
Abstract:
The conversion of high-occupancy vehicle (HOV) lanes to high-occupancy toll (HOT) lanes is currently being implemented in metro Atlanta on a demonstration basis and is under consideration for more widespread adoption throughout the metro region. Further conversion of HOV lanes to HOT lanes is a major policy decision that depends on knowledge of the likely impacts, including the equity of the new HOT lane. Rather than estimating these impacts using modeling or surveys, this study collects revealed preference data in the form of observed vehicle license plate data and vehicle occupancy data from users of the HOV corridor. Building on a methodology created in Spring 2011, researchers created a new methodology for matching license plate data to vehicle occupancy data that required extensive post-processing of the data. The new methodology also presented an opportunity to take an in-depth look at errors in both occupancy and license plate data (in terms of data collection efforts, processing, and the vehicle registration database). Characteristics of individual vehicles were determined from vehicle registration records associated with the license plate data collected during AM and PM peak periods immediately prior to the HOV lanes' conversion to HOT lanes. More than 70,000 individual vehicle license plates were collected for analysis, and over 3,500 records were matched to occupancy values. Analysis of these data has shown that government and commercial vehicles were more prevalent in the HOV lane, while hybrid and alternative fuel vehicles were much less common in either lane than expected. Vehicle occupancy data from the first four quarters of data collection were used to create the distribution of occupancy on the HOV and general purpose lanes, and then the matched occupancy and license plate data were examined. A sensitivity analysis of the occupancy data established that the current use of uncertain occupancy values is acceptable and that bus and vanpool occupancy should be considered when determining the average occupancy of all vehicles on the HOV lane. Using a bootstrap analysis, vehicle values were compared to vehicle occupancy values and the results found that there is no correlation between vehicle value and vehicle occupancy. A conclusions section suggests possible impacts of the findings on policy decisions as Georgia considers expanding the HOT network. Further research using these data, and additional data that will be collected after the HOT lane opens, will include emissions modeling and a study of changes in vehicle characteristics associated with the HOT lane conversion.
APA, Harvard, Vancouver, ISO, and other styles
9

Kumar, Saurabh. "Real-Time Road Traffic Events Detection and Geo-Parsing". Thesis, Purdue University, 2018. http://pqdtopen.proquest.com/#viewpdf?dispub=10842958.

Full text
Abstract:

In the 21st century, there is an increasing number of vehicles on the road as well as a limited road infrastructure. These aspects culminate in daily challenges for the average commuter due to congestion and slow moving traffic. In the United States alone, it costs an average US driver $1200 every year in the form of fuel and time. Some positive steps, including (a) introduction of the push notification system and (b) deploying more law enforcement troops, have been taken for better traffic management. However, these methods have limitations and require extensive planning. Another method to deal with traffic problems is to track the congested area in a city using social media. Next, law enforcement resources can be re-routed to these areas on a real-time basis.

Given the ever-increasing number of smartphone devices, social media can be used as a source of information to track the traffic-related incidents.

Social media sites allow users to share their opinions and information. Platforms like Twitter, Facebook, and Instagram are very popular among users. These platforms enable users to share whatever they want in the form of text and images. Facebook users generate millions of posts in a minute. On these platforms, abundant data, including news, trends, events, opinions, product reviews, etc. are generated on a daily basis.

Worldwide, organizations are using social media for marketing purposes. This data can also be used to analyze traffic-related events like congestion, construction work, slow-moving traffic, etc. Thus the motivation behind this research is to use social media posts to extract information relevant to traffic, with effective and proactive traffic administration as the primary focus. I propose an intuitive two-step process to utilize Twitter users' posts for retrieving traffic-related information on a real-time basis. It uses a text classifier to filter the data so that only traffic-related posts are kept. This is followed by a Part-Of-Speech (POS) tagger to find the geolocation information. A prototype of the proposed system is implemented using a distributed microservices architecture.
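As a toy illustration of the two-step idea (a traffic / non-traffic text classifier followed by POS-based extraction of candidate place names), the sketch below uses scikit-learn and NLTK with a few placeholder training posts; it is not the author's microservice prototype, and the training data and thresholds are invented.

import nltk
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tagger and tokenizer resources (resource names vary slightly across NLTK versions).
for resource in ("punkt", "punkt_tab", "averaged_perceptron_tagger", "averaged_perceptron_tagger_eng"):
    nltk.download(resource, quiet=True)

train_texts = [
    "Heavy congestion on I-65 near downtown",      # traffic
    "Accident blocking two lanes on Main Street",  # traffic
    "Great coffee at the new cafe today",          # not traffic
    "Watching the game tonight with friends",      # not traffic
]
train_labels = [1, 1, 0, 0]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(train_texts, train_labels)

def traffic_locations(post):
    # Step 1: keep only posts classified as traffic-related.
    if clf.predict([post])[0] != 1:
        return []
    # Step 2: POS-tag the post and return proper nouns as candidate place names.
    tagged = nltk.pos_tag(nltk.word_tokenize(post))
    return [word for word, tag in tagged if tag == "NNP"]

print(traffic_locations("Slow moving traffic on Meridian Street after a crash"))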

APA, Harvard, Vancouver, ISO, and other styles
10

Glick, Travis Bradley. "Utilizing High-Resolution Archived Transit Data to Study Before-and-After Travel-Speed and Travel-Time Conditions". PDXScholar, 2017. https://pdxscholar.library.pdx.edu/open_access_etds/4065.

Full text
Abstract:
Travel times, operating speeds, and service reliability influence costs and service attractiveness. This paper outlines an approach to quantify how these metrics change after a modification of roadway design or transit routes using archived transit data. The Tri-County Metropolitan Transportation District of Oregon (TriMet), Portland's public transportation provider, archives automatic vehicle location (AVL) data for all buses as part of their bus dispatch system (BDS). This research combines three types of AVL data (stop event, stop disturbance, and high-resolution) to create a detailed account of transit behavior; this probe data gives insights into the behavior of transit as well as general traffic. The methodology also includes an updated approach for confidence interval estimates that more accurately represents the range of speed and travel-time percentile estimates. This methodology is applied to three test cases using a month of AVL data collected before and after the implementation of each roadway change. The results of the test cases highlight the broad applicability of this approach to before-and-after studies.
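One common way to attach a confidence interval to a travel-time percentile estimated from archived AVL observations is a simple bootstrap; the sketch below shows that generic approach on synthetic travel times and is not the specific interval method developed in the thesis.

import numpy as np

def percentile_ci(samples, q=85, n_boot=5000, alpha=0.05, seed=0):
    # Bootstrap confidence interval for the q-th percentile of observed travel times.
    rng = np.random.default_rng(seed)
    samples = np.asarray(samples, dtype=float)
    boot = np.array([
        np.percentile(rng.choice(samples, size=samples.size, replace=True), q)
        for _ in range(n_boot)
    ])
    return np.percentile(boot, 100 * alpha / 2), np.percentile(boot, 100 * (1 - alpha / 2))

# Example: synthetic segment travel times (seconds) from one month of observations.
before = np.random.default_rng(1).normal(210, 25, size=400)
low, high = percentile_ci(before, q=85)
print(f"85th percentile travel time, 95% CI: [{low:.1f}, {high:.1f}] s")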
APA, Harvard, Vancouver, ISO, and other styles
11

Ni, Daiheng. "Extension and generalization of Newell's simplified theory of kinematic waves". Diss., Georgia Institute of Technology, 2004. http://etd.gatech.edu/theses/available/etd-11112004-112805/unrestricted/ni%5Fdaiheng%5F200412%5Fphd.pdf.

Full text
Abstract:
Thesis (Ph. D.)--Civil and Environmental Engineering, Georgia Institute of Technology, 2005.
Leonard, John D., Committee Chair ; Goldsman, Dave, Committee Member ; Amekudzi, Adjo, Committee Member ; Hunter, Michael, Committee Member ; Dixon, Karen, Committee Member. Vita. Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles
12

Duarte, David. "A profile of changes in vehicle characteristics following the I-85 HOV-to-HOT conversion". Thesis, Georgia Institute of Technology, 2013. http://hdl.handle.net/1853/47689.

Full text
Abstract:
A 15.5-mile portion of the I-85 high-occupancy vehicle (HOV) lane in the metropolitan area of Atlanta, GA was converted to a high-occupancy toll (HOT) lane as part of a federal demonstration project designed to provide a reliable travel option through this congested corridor. Results from the I-85 demonstration project provided insight into the results that may follow the Georgia Department of Transportation's planned implementation of a $16 billion HOT lane network along metropolitan Atlanta's other major roadways [2]. To evaluate the impacts of the conversion, it was necessary to measure changes in corridor travel speed, reliability, vehicle throughput, passenger throughput, lane weaving, and user demographics. To measure such performance, a monitoring project, led by the Georgia Institute of Technology collected various forms of data through on-site field deployments, GDOT video, and cooperation from the State Road and Toll Authority (SRTA). Changes in the HOT lane's speed, reliability or other performance measure can affect the demographic and vehicle characteristics of those who utilize the corridor. The purpose of this particular study was to analyze the changes to the vehicle characteristics by comparing vehicle occupancy, vehicle classifications, and vehicle registration data to their counterparts from before the HOV-to-HOT conversion. As part of the monitoring project, the Georgia Tech research team organized a two-year deployment effort to collect data along the corridor during morning and afternoon peak hours. One year of data collection occurred before the conversion date to establish a control and a basis from which to compare any changes. The second year of data collection occurred after the conversion to track those changes and observe the progress of the lane's performance. While on-site, researchers collected data elements including visually-observed vehicle occupancy, license plate numbers, and vehicle classification [25]. The research team obtained vehicle records by submitting the license plate tag entries to a registration database [26]. In previous work, vehicle occupancy data were collected independently of license plate records used to establish the commuter shed. For the analyses reported in this thesis, license plate data and occupancy data were collected concurrently, providing a link between occupancy records of specific vehicles and relevant demographic characteristics based upon census data. The vehicle records also provided characteristics of the users' vehicles (light-duty vehicle vs. sport utility vehicle, model year, etc.) that the researchers aggregated to identify general trends in fleet characteristics. The analysis reported in this thesis focuses on identifying changes in vehicle characteristics that resulted from the HOV-to-HOT conversion. The data collected from post-conversion are compared to pre-conversion data, revealing changes in vehicle characteristics and occupancy distributions that most likely resulted from the implementation of the HOT lane. Plausible reasons affecting the vehicle characteristics alterations will be identified and further demographic research will enhance the data currently available to better pinpoint the cause and effect relationship between implementation and the current status of the I-85 corridor. Preliminary data collection outliers were identified by using vehicle occupancy data. However, future analysis will reveal the degree of their impact on the project as a whole. 
Matched occupancy and license plate data revealed vehicle characteristics for HOT lane users as well as indications that the tested data collectors were largely synchronized when concurrently collecting data, supporting the validity of the data collection methods. Chapter two provides reasons why HOT lanes were sought out to replace I-85's HOV lanes. Chapter two will also provide many details regarding how the HOT lanes function and it will describe the role the Georgia Institute of Technology played in the assessment of the HOV-to-HOT conversion. Chapter three includes the methodologies used to complete this document while chapter four provides results and analysis for the one year period before the conversion and the one year period after the conversion.
APA, Harvard, Vancouver, ISO, and other styles
13

Dickinson, Keith William. "Traffic data capture and analysis using video image processing". Thesis, University of Sheffield, 1986. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.306374.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

So, Wai-ki, and 蘇慧琪. "Shadow identification in traffic video sequences". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2005. http://hub.hku.hk/bib/B32045967.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Craig, David W. (David William). "Light traffic loss of random hard real-time tasks in a network". Dissertation (Electrical Engineering), Carleton University, Ottawa, 1988.

Search full text
APA, Harvard, Vancouver, ISO, and other styles
16

Jafar, Fatmeh Nazmi Ahmad. "Simulating traditional traffic data from satellite data-preparing for real satellite data test". The Ohio State University, 2000. http://rave.ohiolink.edu/etdc/view?acc_num=osu1488193665235894.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Agafonov, Evgeny. "Fuzzy and multi-resolution data processing for advanced traffic and travel information". Thesis, Nottingham Trent University, 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.271790.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Fernandez, Noemi. "Statistical information processing for data classification". FIU Digital Commons, 1996. http://digitalcommons.fiu.edu/etd/3297.

Full text
Abstract:
This thesis introduces new algorithms for analysis and classification of multivariate data. Statistical approaches are devised for the objectives of data clustering, data classification and object recognition. An initial investigation begins with the application of fundamental pattern recognition principles. Where such fundamental principles meet their limitations, statistical and neural algorithms are integrated to augment the overall approach for an enhanced solution. This thesis provides a new dimension to the problem of classification of data as a result of the following developments: (1) application of algorithms for object classification and recognition; (2) integration of a neural network algorithm which determines the decision functions associated with the task of classification; (3) determination and use of the eigensystem using newly developed methods with the objectives of achieving optimized data clustering and data classification, and dynamic monitoring of time-varying data; and (4) use of the principal component transform to exploit the eigensystem in order to perform the important tasks of orientation-independent object recognition, and dimensionality reduction of the data so as to optimize the processing time without compromising accuracy in the analysis of this data.
APA, Harvard, Vancouver, ISO, and other styles
19

Chiu, Cheng-Jung. "Data processing in nanoscale profilometry". Thesis, Massachusetts Institute of Technology, 1995. http://hdl.handle.net/1721.1/36677.

Full text
Abstract:
Thesis (M.S.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 1995.
Includes bibliographical references (p. 176-177).
New developments on the nanoscale are taking place rapidly in many fields. Instrumentation used to measure and understand the geometry and properties of small-scale structures is therefore essential. One of the most promising devices for taking measurement science into the nanoscale is the scanning probe microscope. A prototype of a nanoscale profilometer based on the scanning probe microscope has been built in the Laboratory for Manufacturing and Productivity at MIT. A sample is placed on a precision flip stage and different sides of the sample are scanned under the SPM to acquire its separate surface topography. To reconstruct the original three dimensional profile, many techniques like digital filtering, edge identification, and image matching are investigated and implemented in the computer programs to post-process the data, with greater emphasis placed on the nanoscale application. The important programming issues are addressed, too. Finally, this system's error sources are discussed and analyzed.
by Cheng-Jung Chiu.
M.S.
APA, Harvard, Vancouver, ISO, and other styles
20

Lin, Joyce C. (Joyce Chaisin) 1979. "VisualFlight : the air traffic control data analysis system". Thesis, Massachusetts Institute of Technology, 2002. http://hdl.handle.net/1721.1/87266.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Marupudi, Surendra Brahma. "Framework for Semantic Integration and Scalable Processing of City Traffic Events". Wright State University / OhioLINK, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=wright1472505847.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Nadella, Sunita. "Effect of machine vision based traffic data collection accuracy on traffic noise". Ohio : Ohio University, 2002. http://www.ohiolink.edu/etd/view.cgi?ohiou1174681979.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Gunnar, Anders. "Towards Robust Traffic Engineering in IP Networks". Licentiate thesis, Stockholm : Elektriska energisystem, Kungliga Tekniska högskolan, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-4557.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Avadhani, Umesh D. "Data processing in a small transit company using an automatic passenger counter". Thesis, Virginia Tech, 1986. http://hdl.handle.net/10919/45669.

Full text
Abstract:

This thesis describes the work done in the second stage of the implementation of the Automatic Passenger Counter (APC) system at the Roanoke Valley - Metro Transit Company. This second stage deals with the preparation of a few reports and plots that would help the transit managers in efficiently managing the transit system. The reports and plots give an evaluation of the system and service operations by which the decision makers can support their decisions.

For efficient management of the transit system, data on ridership activity, running times, schedule information, and fare revenue are required. From these data it is possible to produce management information reports and summary statistics.

The present data collection program at Roanoke Valley-Metro is carried out using checkers and supervisors to collect ridership and schedule-adherence information by manual methods. The information needed for efficient management of transit operations is both difficult and expensive to obtain. The new APC system offers management a new and powerful tool that will enhance its capability to make better decisions when allocating service. The data from the APC are essential for the transit property's ongoing planning and scheduling activities. Management can easily quantify the service demands on a route or for the whole system as desired by the user.


Master of Science
APA, Harvard, Vancouver, ISO, and other styles
25

Derksen, Timothy J. (Timothy John). "Processing of outliers and missing data in multivariate manufacturing data". Thesis, Massachusetts Institute of Technology, 1996. http://hdl.handle.net/1721.1/38800.

Full text
Abstract:
Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1996.
Includes bibliographical references (leaf 64).
by Timothy J. Derksen.
M.Eng.
APA, Harvard, Vancouver, ISO, and other styles
26

Mumtaz, Ali. "Window-based stream data mining for classification of Internet traffic". Thesis, University of Ottawa (Canada), 2008. http://hdl.handle.net/10393/27601.

Full text
Abstract:
Accurate classification of Internet applications is a fundamental requirement for network provisioning, network security, maintaining quality of services and network management. Increasingly, new applications are being introduced on the Internet. The traffic volume and patterns of some of the new applications such as Peer-to-Peer (P2P) file sharing put pressure on service providers' networks in terms of congestion and delay, to the point that maintaining the Quality of Service (QoS) planned in the access network requires the provisioning of additional bandwidth sooner than planned. Peer-to-Peer applications enable users to communicate directly over the Internet, thus bypassing central server control implemented by service providers; this poses threats in terms of network congestion and creates an environment for malicious attacks on networks. One key challenge in this area is to adapt to the dynamic nature of Internet traffic. With the growth in Internet traffic, in terms of number and type of applications, traditional classification techniques such as port matching, protocol decoding or packet payload analysis are no longer effective. For instance, P2P applications may use randomly selected non-standard ports to communicate, which makes it difficult to distinguish them from other types of traffic by inspecting port numbers alone. The present research introduces two new techniques to classify stream (online) data using K-means clustering and a Fast Decision Tree (FDT). In the first technique, we first generate micro-clusters using k-means clustering with different values of k. Micro-clusters are then merged into two clusters based on weighted averages of P2P and NonP2P population. This technique generates two merged clusters, each representing P2P or NonP2P traffic. The simulation results confirm that the two final clusters represent P2P and NonP2P traffic, each with good accuracy. The second technique employs a two-stage architecture for classification of P2P traffic, where in the first stage, the traffic is filtered using standard port numbers and layer 4 port matching to label well-known P2P and NonP2P traffic, leaving the rest of the traffic as "Unknown". The labeled traffic generated in the first stage is used to train a Fast Decision Tree (FDT) classifier with high accuracy. The Unknown traffic is then applied to the FDT model, which classifies the traffic into P2P and NonP2P with high accuracy. The two-stage architecture, therefore, not only classifies well-known P2P applications, it also classifies applications that use random or private (non-standard) port numbers and cannot be classified otherwise. We performed various experiments where we captured Internet traffic at a main gateway router, pre-processed the data and selected the three most significant attributes, namely Packet Length, Source IP address and Destination IP address. We then applied the proposed technique to three different windows of records. Accuracy, Specificity and Sensitivity of the model are calculated. Our simulation results confirm that the predicted output represents P2P and NonP2P traffic with accuracy higher than 90%.
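A much-simplified sketch of the first technique's shape (micro-clustering followed by merging clusters into P2P / non-P2P groups by majority of labeled points) follows; the feature values, cluster count, and labels are placeholders, and this is not the thesis implementation.

import numpy as np
from sklearn.cluster import KMeans

def fit_microclusters(X_train, y_train, k=20):
    # Cluster the window into k micro-clusters, then merge them into two groups
    # (1 = P2P, 0 = non-P2P) by the weighted majority of labeled points in each cluster.
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X_train)
    cluster_label = {}
    for c in range(k):
        members = y_train[km.labels_ == c]
        cluster_label[c] = 1 if members.size and members.mean() >= 0.5 else 0
    return km, cluster_label

def predict(km, cluster_label, X):
    return np.array([cluster_label[c] for c in km.predict(X)])

# Toy example: three numeric features per flow record (e.g. packet length plus hashed src/dst IPs).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = (X[:, 0] > 0).astype(int)
km, labels = fit_microclusters(X, y)
print("training accuracy:", (predict(km, labels, X) == y).mean())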
APA, Harvard, Vancouver, ISO, and other styles
27

Collin, Sofie. "Synthetic Data for Training and Evaluation of Critical Traffic Scenarios". Thesis, Linköpings universitet, Medie- och Informationsteknik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-177779.

Full text
Abstract:
Modern camera-based vehicle safety systems heavily rely on machine learning and consequently require large amounts of training data to perform reliably. However, collecting and annotating the needed data is an extremely expensive and time-consuming process. In addition, it is exceptionally difficult to collect data that covers critical scenarios. This thesis investigates to what extent synthetic data can replace real-world data for these scenarios. Since only a limited amount of data consisting of such real-world scenarios is available, this thesis instead makes use of proxy scenarios, e.g. situations when pedestrians are located closely in front of the vehicle (for example at a crosswalk). The presented approach involves training a detector on real-world data where all samples of these proxy scenarios have been removed and comparing it to other detectors trained on data where the removed samples have been replaced with various degrees of synthetic data. A method for generating, and automatically and accurately annotating, synthetic data using features in the CARLA simulator is presented. Also, the domain gap between the synthetic and real-world data is analyzed, and methods in domain adaptation and data augmentation are reviewed. The presented experiments show that aligning statistical properties between the synthetic and real-world datasets distinctly mitigates the domain gap. There are also clear indications that synthetic data can help detect pedestrians in critical traffic situations.

The thesis work was carried out at the Department of Science and Technology (ITN), Faculty of Science and Engineering, Linköping University.
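As a concrete (and deliberately simple) example of aligning statistical properties between synthetic and real images, the sketch below matches per-channel mean and standard deviation; it only illustrates the general idea and does not reproduce the thesis's domain-adaptation methods.

import numpy as np

def channel_stats(images):
    # images: array of shape (N, H, W, 3) with float pixel values.
    flat = images.reshape(-1, images.shape[-1])
    return flat.mean(axis=0), flat.std(axis=0) + 1e-8

def align(synthetic, real_mean, real_std):
    # Shift and scale the synthetic images so their per-channel statistics match the real data.
    syn_mean, syn_std = channel_stats(synthetic)
    return (synthetic - syn_mean) / syn_std * real_std + real_mean

real = np.random.default_rng(0).uniform(0, 255, size=(10, 64, 64, 3))
synthetic = np.random.default_rng(1).uniform(30, 200, size=(10, 64, 64, 3))
aligned = align(synthetic, *channel_stats(real))
print(channel_stats(aligned)[0], channel_stats(real)[0])  # per-channel means now match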

APA, Harvard, Vancouver, ISO, and other styles
28

Nyström, Simon, and Joakim Lönnegren. "Processing data sources with big data frameworks". Thesis, KTH, Data- och elektroteknik, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-188204.

Full text
Abstract:
Big data is a concept that is expanding rapidly. As more and more data is generated and garnered, there is an increasing need for efficient solutions that can be utilized to process all this data in attempts to gain value from it. The purpose of this thesis is to find an efficient way to quickly process a large number of relatively small files. More specifically, the purpose is to test two frameworks that can be used for processing big data. The frameworks that are tested against each other are Apache NiFi and Apache Storm. A method is devised in order to, firstly, construct a data flow and secondly, construct a method for testing the performance and scalability of the frameworks running this data flow. The results reveal that Apache Storm is faster than Apache NiFi at the sort of task that was tested. As the number of nodes included in the tests went up, the performance did not always do the same. This indicates that adding more nodes to a big data processing pipeline does not always result in a better performing setup and that, sometimes, other measures must be taken to heighten the performance.
Big data är ett koncept som växer snabbt. När mer och mer data genereras och samlas in finns det ett ökande behov av effektiva lösningar som kan användas föratt behandla all denna data, i försök att utvinna värde från den. Syftet med detta examensarbete är att hitta ett effektivt sätt att snabbt behandla ett stort antal filer, av relativt liten storlek. Mer specifikt så är det för att testa två ramverk som kan användas vid big data-behandling. De två ramverken som testas mot varandra är Apache NiFi och Apache Storm. En metod beskrivs för att, för det första, konstruera ett dataflöde och, för det andra, konstruera en metod för att testa prestandan och skalbarheten av de ramverk som kör dataflödet. Resultaten avslöjar att Apache Storm är snabbare än NiFi, på den typen av test som gjordes. När antalet noder som var med i testerna ökades, så ökade inte alltid prestandan. Detta visar att en ökning av antalet noder, i en big data-behandlingskedja, inte alltid leder till bättre prestanda och att det ibland krävs andra åtgärder för att öka prestandan.
APA, Harvard, Vancouver, ISO, and other styles
29

Moody, Kacen Paul. "FPGA-Accelerated Digital Signal Processing for UAV Traffic Control Radar". BYU ScholarsArchive, 2021. https://scholarsarchive.byu.edu/etd/8941.

Full text
Abstract:
As an extension of previous work done by Luke Newmeyer in his master's thesis (Newmeyer, 2018), this report presents an improved signal processing chain for efficient, real-time processing of radar data for small-scale UAV traffic control systems. The HDL design described is for a 16-channel, 2-dimensional phased array feed processing chain and includes mean subtraction, windowing, FIR filtering, decimation, spectral estimation via FFT, cross-correlation, and averaging, as well as a significant amount of control and configuration logic. The design runs near the max allowable memory bus frequency at 300MHz, and using AXI DMA engines can achieve throughput of 38.3 Gb/s (~0.25% below the theoretical 38.4 Gb/s), transferring 2MB of correlation data in about 440us. This allows for a pulse repetition frequency of nearly 2kHz, in contrast to 454Hz from the previous design. The design targets the Avnet UltraZed-EV MPSoC board, which boots custom PetaLinux images. API code and post-processing algorithms run in this environment to interface with the FPGA control registers and further process frames of data. Primary configuration options include variable sample rate, window coefficients, FIR filter coefficients, chirp length, pulse repetition interval, decimation factor, number of averaged frames, error monitoring, three DMA sampling points, and DMA ring buffer transfers. The result is a dynamic, high-speed, small-scale design which can process 16 parallel channels of data in real time for 3-dimensional detection of local UAV traffic at a range of 1000m.
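For orientation, here is a single-channel NumPy/SciPy sketch of the processing steps named above (mean subtraction, windowing, FIR filtering, decimation, FFT); the filter length, window, and decimation factor are placeholders, not the FPGA design's actual configuration.

import numpy as np
from scipy import signal

def process_pulse(samples, fir_taps, decimate_by=4):
    x = samples - samples.mean()          # mean subtraction
    x = x * np.hanning(x.size)            # windowing
    x = signal.lfilter(fir_taps, 1.0, x)  # FIR filtering
    x = x[::decimate_by]                  # decimation
    return np.fft.rfft(x)                 # spectral estimation

taps = signal.firwin(63, 0.2)             # placeholder low-pass FIR (cutoff relative to Nyquist)
pulse = np.random.default_rng(0).normal(size=4096)
spectrum = process_pulse(pulse, taps)

# Cross-correlating spectra from channel pairs and averaging over pulses would follow
# the same pattern, e.g. np.mean([a * np.conj(b) for a, b in pairs], axis=0).
print(spectrum.shape)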
APA, Harvard, Vancouver, ISO, and other styles
30

徐順通 and Sung-thong Andrew Chee. "Computerisation in Hong Kong professional engineering firms". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1985. http://hub.hku.hk/bib/B31263124.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Box, Stephanie. "Arterial roadway traffic data collection using bluetooth technology". Thesis, Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/42778.

Full text
Abstract:
The use of Bluetooth technology for gathering traffic data is becoming increasingly popular due to the large volume of data that can be gathered at a relatively low cost. The limited number of devices in discoverable mode and the potentially long discovery time of Bluetooth devices create an opportunity for evaluating the sensor array setup that can maximize the sample of devices identified. This thesis investigates several factors that have a significant impact on the quality of the data obtained using Bluetooth, including the number of Bluetooth readers, orientation of the Bluetooth antennas, position of the readers relative to one another, and the location of the Bluetooth stations. The thesis begins with an overview of Bluetooth technology and a literature review on the use of Bluetooth in previous traffic studies. Next, the methodology for the setup of the Bluetooth system and the four tests performed to evaluate the factors affecting the quality of the data are described. Through the results of these tests, it was observed that a "flat" antenna orientation allows for the greatest detection range and that the walls of buildings can prevent detection of Bluetooth devices inside the buildings. In addition, using multiple Bluetooth readers per sensor array resulted in statistically significant increases in the number of detections compared to single-reader sensors, and horizontally separated sensor arrays were observed to be more effective than vertically separated sensor arrays. Finally, the thesis concludes with a summary of findings and a discussion of further research needs.
APA, Harvard, Vancouver, ISO, and other styles
32

Wang, Yi. "Data Management and Data Processing Support on Array-Based Scientific Data". The Ohio State University, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=osu1436157356.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Zhang, Tong. "Improving the performance of a traffic data management system". Ohio : Ohio University, 1999. http://www.ohiolink.edu/etd/view.cgi?ohiou1175198741.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Bostanudin, Nurul Jihan Farhah. "Computational methods for processing ground penetrating radar data". Thesis, University of Portsmouth, 2013. https://researchportal.port.ac.uk/portal/en/theses/computational-methods-for-processing-ground-penetrating-radar-data(d519f94f-04eb-42af-a504-a4c4275d51ae).html.

Full text
Abstract:
The aim of this work was to investigate signal processing and analysis techniques for Ground Penetrating Radar (GPR) and its use in the civil engineering and construction industry. GPR is the general term applied to techniques which employ radio waves, typically in the Megahertz and Gigahertz range, to map structures and features buried in the ground or in manmade structures. GPR measurements can suffer from large amounts of noise. This is primarily caused by interference from other radio-wave-emitting devices (e.g., cell phones, radios, etc.) that are present in the surrounding area of the GPR system during data collection. In addition to noise, the presence of clutter – reflections from other non-target objects buried underground in the vicinity of the target – can make GPR measurements difficult to understand and interpret, even for skilled human GPR analysts. This thesis is concerned with the improvements and processes that can be applied to GPR data in order to enhance the target detection and characterisation process, particularly with multivariate signal processing techniques. Those primarily include Principal Component Analysis (PCA) and Independent Component Analysis (ICA). Both techniques have been investigated, implemented and compared regarding their abilities to separate the target-originating signals from the noise and clutter-type signals present in the data. Combination of PCA and ICA (SVDPICA) and two-dimensional PCA (2DPCA) are the specific approaches adopted and further developed in this work. The ability of those methods to reduce the amount of clutter and unwanted signals present in GPR data has been investigated and reported in this thesis, suggesting that their use in automated analysis of GPR images is a possibility. Further analysis carried out in this work concentrated on analysing the performance of the developed multivariate signal processing techniques and at the same time investigating the possibility of identifying and characterising the features of interest in pre-processed GPR images. The driving idea behind this part of the work was to extract the resonant modes present in the individual traces of each GPR image and to use properties of those poles to characterise the target. Three related but different methods have been implemented and applied in this work – Extended Prony, Linear Prediction Singular Value Decomposition and Matrix Pencil methods. In addition to these approaches, the PCA technique has been used to reduce the dimensionality of extracted traces and to compare signals measured in various experimental setups. Performance analysis shows that Matrix Pencil offers the best results.
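To illustrate the PCA-style clutter suppression mentioned above, here is a minimal SVD-based sketch on a synthetic B-scan; it shows the general idea only and is not the SVDPICA or 2DPCA method developed in the thesis.

import numpy as np

def remove_clutter(bscan, n_components=2):
    # bscan: 2-D array (n_traces, n_samples). Subtract the mean trace, then remove the
    # leading singular components, which capture reflections common to all traces
    # (ground bounce, antenna ringing); what remains emphasises localized targets.
    centered = bscan - bscan.mean(axis=0)
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    clutter = (u[:, :n_components] * s[:n_components]) @ vt[:n_components]
    return centered - clutter

# Synthetic example: a common sinusoidal reflection plus one weak localized target.
rng = np.random.default_rng(0)
common = np.tile(np.sin(np.linspace(0, 20, 512)), (64, 1))
target = np.zeros((64, 512)); target[30:34, 200:220] = 0.1
cleaned = remove_clutter(common + target + 0.01 * rng.normal(size=(64, 512)))
print("target region mean:", cleaned[30:34, 200:220].mean(),
      "background mean:", cleaned[:20, 200:220].mean())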
APA, Harvard, Vancouver, ISO, and other styles
35

Mohammed, Amin Rasti Jameel. "Using Associative Processing to Simplify Current Air Traffic Control". Kent State University / OhioLINK, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=kent1452616856.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Metarko, Jeffrey Craig. "A procedure to catalog and access traffic data from pre-recorded surveillance camera videotapes". Thesis, Georgia Institute of Technology, 1997. http://hdl.handle.net/1853/19066.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Grinman, Alex J. "Natural language processing on encrypted patient data". Thesis, Massachusetts Institute of Technology, 2016. http://hdl.handle.net/1721.1/113438.

Full text
Abstract:
Thesis: M. Eng., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2016.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 85-86).
While many industries can benefit from machine learning techniques for data analysis, they often do not have the technical expertise nor computational power to do so. Therefore, many organizations would benefit from outsourcing their data analysis. Yet, stringent data privacy policies prevent outsourcing sensitive data and may stop the delegation of data analysis in its tracks. In this thesis, we put forth a two-party system where one party capable of powerful computation can run certain machine learning algorithms from the natural language processing domain on the second party's data, where the first party is limited to learning only specific functions of the second party's data and nothing else. Our system provides simple cryptographic schemes for locating keywords, matching approximate regular expressions, and computing frequency analysis on encrypted data. We present a full implementation of this system in the form of an extendible software library and a command line interface. Finally, we discuss a medical case study where we used our system to run a suite of unmodified machine learning algorithms on encrypted free text patient notes.
by Alex J. Grinman.
M. Eng.
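As a very rough illustration of keyword location over encrypted records (not the schemes from the thesis), the sketch below stores keyed HMAC tokens next to each ciphertext so the computing party can match a query token without seeing plaintext; it assumes the third-party "cryptography" package, and note that token frequencies are still leaked.

import hmac, hashlib
from cryptography.fernet import Fernet

index_key = b"0" * 32                 # keyword-token key, held by the data owner (placeholder)
fernet = Fernet(Fernet.generate_key())  # record-encryption key

def token(word):
    # Deterministic keyed token for exact keyword matching.
    return hmac.new(index_key, word.lower().encode(), hashlib.sha256).hexdigest()

def encrypt_record(text):
    return {"ciphertext": fernet.encrypt(text.encode()),
            "tokens": {token(w) for w in text.split()}}

records = [encrypt_record("patient reports chest pain"),
           encrypt_record("follow up in two weeks")]

query = token("pain")                 # computed by the data owner, sent to the server
matches = [i for i, r in enumerate(records) if query in r["tokens"]]
print(matches)                        # the server learns which records match, not their contents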
APA, Harvard, Vancouver, ISO, and other styles
38

Westlund, Kenneth P. (Kenneth Peter). "Recording and processing data from transient events". Thesis, Massachusetts Institute of Technology, 1988. https://hdl.handle.net/1721.1/129961.

Full text
Abstract:
Thesis (B.S.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1988.
Includes bibliographical references.
by Kenneth P. Westlund Jr.
APA, Harvard, Vancouver, ISO, and other styles
39

Setiowijoso, Liono. "Data Allocation for Distributed Programs". PDXScholar, 1995. https://pdxscholar.library.pdx.edu/open_access_etds/5102.

Full text
Abstract:
This thesis shows that both data and code must be efficiently distributed to achieve good performance in a distributed system. Most previous research has either tried to distribute code structures to improve parallelism or to distribute data to reduce communication costs. Code distribution (exploiting functional parallelism) is an effort to distribute or to duplicate function codes to optimize parallel performance. On the other hand, data distribution tries to place data structures as close as possible to the function codes that use them, so that communication cost can be reduced. In particular, dataflow researchers have primarily focused on code partitioning and assignment. We have adapted existing data allocation algorithms for use with an existing dataflow-based system, ParPlum. ParPlum allows the execution of dataflow graphs on networks of workstations. To evaluate the impact of data allocation, we extended ParPlum to more effectively handle data structures. We then implemented tools to extract from dataflow graphs information that is relevant to the mapping algorithms and fed this information to our version of a data distribution algorithm. To see the relation between code and data parallelism we added optimization to optimize the distribution of the loop function components and the data structure access components. All of these are done automatically without programmer or user involvement. We ran a number of experiments using matrix multiplication as our workload. We used different numbers of processors and different existing partitioning and allocation algorithms. Our results show that automatic data distribution greatly improves the performance of distributed dataflow applications. For example, with 15 x 15 matrices, applying data distribution speeds up execution by about 80% on 7 machines. Using data distribution and our code optimizations on 7 machines speeds up execution over the base case by 800%. Our work shows that it is possible to make efficient use of distributed networks with compiler support and shows that both code mapping and data mapping must be considered to achieve optimal performance.
APA, Harvard, Vancouver, ISO, and other styles
40

Villegas, Ruben M. M. "Statistical processing for telecommunication networks applied to ATM traffic monitoring". Thesis, Loughborough University, 1997. https://dspace.lboro.ac.uk/2134/6760.

Full text
Abstract:
Within the fields of network operation and performance measurement, it is a common requirement that the technologies involved must provide the basis for an effective, reliable, measurable and controllable service. In order to comply with the service performance criteria, the constraints often lead to very complex techniques and methodologies for the simulation, control, test, and measurement processes. This thesis addresses some of the factors that contribute to the overall spectrum of statistical performance measurements in telecommunication services. Specifically, it is concerned with the development of three low complexity and effective techniques for real-time traffic generation, control and measurement. These techniques have proved to be accurate and near optimum. In the three cases the work starts with a literature survey of known methodologies, and later new techniques are proposed and investigated by simulating the processes involved. The work is based on the use of high-speed Asynchronous Transfer Mode (ATM) networks. The problem of developing a fast traffic generation technique for the simulation of Variable Bit Rate traffic sources is considered in the first part of this thesis. For this purpose, statistical measures are obtained from the analysis of different traffic profiles or from the literature. With the aid of these measures, a model for the fast generation of Variable Bit Rate traffic at different time resolutions is developed. The simulated traffic is then analysed in order to obtain the equivalent set of statistical measures and these are compared against those observed in real traffic traces. The subject of traffic control comprises a very wide area in communication networks. It refers to the generalised classification of actions such as Connection Admission and Flow Control, Traffic Policing and Shaping. In the second part of this thesis, a method to modify the instantaneous traffic profile of a variable rate source is developed. It is particularly useful for services which have a hard bound on the cell loss probability, but a soft bound on the admissible delay, matching the characteristics of some of the services provided by ATM networks. Finally, this thesis is also concerned with a particular aspect of the operation and management of high speed networks, or OAM functions plane, namely with the monitoring of network resources. A monitoring technique based on numerical approximation and statistical sampling methods is developed and later used to characterise a particular traffic stream, or a particular connection, within a high speed network. The resulting algorithms are simple and computationally inexpensive, but effective and accurate at the same time, and are suitable for real-time processing.
APA, Harvard, Vancouver, ISO, and other styles
41

Jakovljevic, Sasa. "Data collecting and processing for substation integration enhancement". Texas A&M University, 2003. http://hdl.handle.net/1969/93.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Cacciola, Stephen J. "Fusion of Laser Range-Finding and Computer Vision Data for Traffic Detection by Autonomous Vehicles". Thesis, Virginia Tech, 2007. http://hdl.handle.net/10919/36126.

Full text
Abstract:
The DARPA Challenges were created in response to a Congressional and Department of Defense (DoD) mandate that one-third of US operational ground combat vehicles be unmanned by the year 2015. The Urban Challenge is the latest competition that tasks industry, academia, and inventors with designing an autonomous vehicle that can safely operate in an urban environment. A basic and important capability needed in a successful competition vehicle is the ability to detect and classify objects. The most important objects to classify are other vehicles on the road. Navigating traffic, which includes other autonomous vehicles, is critical in the obstacle avoidance and decision making processes. This thesis provides an overview of the algorithms and software designed to detect and locate these vehicles. By combining the individual strengths of laser range-finding and vision processing, the two sensors are able to more accurately detect and locate vehicles than either sensor acting alone. The range-finding module uses the built-in object detection capabilities of IBEO Alasca laser rangefinders to detect the location, size, and velocity of nearby objects. The Alasca units are designed for automotive use, and so they alone are able to identify nearby obstacles as vehicles with a high level of certainty. After some basic filtering, an object detected by the Alasca scanner is given an initial classification based on its location, size, and velocity. The vision module uses the location of these objects as determined by the ranger finder to extract regions of interest from large images through perspective transformation. These regions of the image are then examined for distinct characteristics common to all vehicles such as tail lights and tires. Checking multiple characteristics helps reduce the number of false-negative detections. Since the entire image is never processed, the image size and resolution can be maximized to ensure the characteristics are as clear as possible. The existence of these characteristics is then used to modify the certainty level from the IBEO and determine if a given object is a vehicle.
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
43

Aygar, Alper. "Doppler Radar Data Processing And Classification". Master's thesis, METU, 2008. http://etd.lib.metu.edu.tr/upload/12609890/index.pdf.

Full text
Abstract:
In this thesis, improving the performance of the automatic recognition of the Doppler radar targets is studied. The radar used in this study is a ground-surveillance doppler radar. Target types are car, truck, bus, tank, helicopter, moving man and running man. The input of this thesis is the output of the real doppler radar signals which are normalized and preprocessed (TRP vectors: Target Recognition Pattern vectors) in the doctorate thesis by Erdogan (2002). TRP vectors are normalized and homogenized doppler radar target signals with respect to target speed, target aspect angle and target range. Some target classes have repetitions in time in their TRPs. By the use of these repetitions, improvement of the target type classification performance is studied. K-Nearest Neighbor (KNN) and Support Vector Machine (SVM) algorithms are used for doppler radar target classification and the results are evaluated. Before classification, PCA (Principal Component Analysis), LDA (Linear Discriminant Analysis), NMF (Nonnegative Matrix Factorization) and ICA (Independent Component Analysis) are implemented and applied to normalized doppler radar signals for feature extraction and dimension reduction in an efficient way. These techniques transform the input vectors, which are the normalized doppler radar signals, to another space. The effects of the implementation of these feature extraction algorithms and the use of the repetitions in doppler radar target signals on the doppler radar target classification performance are studied.
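A generic scikit-learn sketch of the pipeline shape the abstract describes (dimension reduction followed by a KNN classifier) is shown below; the feature vectors and labels are random placeholders rather than TRP vectors, and the component and neighbor counts are arbitrary.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(700, 128))       # placeholder feature vectors, one row per target signature
y = rng.integers(0, 7, size=700)      # 7 target classes, as in the abstract

# PCA for dimension reduction, then KNN for classification; SVM could be swapped in the same way.
model = make_pipeline(PCA(n_components=20), KNeighborsClassifier(n_neighbors=5))
scores = cross_val_score(model, X, y, cv=5)
print("cross-validated accuracy:", scores.mean())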
Gli stili APA, Harvard, Vancouver, ISO e altri
44

Lu, Feng. "Big data scalability for high throughput processing and analysis of vehicle engineering data". Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-207084.

Testo completo
Abstract (sommario):
"Sympathy for Data" is a platform that is utilized for Big Data automation analytics. It is based on visual interface and workflow configurations. The main purpose of the platform is to reuse parts of code for structured analysis of vehicle engineering data. However, there are some performance issues on a single machine for processing a large amount of data in Sympathy for Data. There are also disk and CPU IO intensive issues when the data is oversized and the platform need fits comfortably in memory. In addition, for data over the TB or PB level, the Sympathy for data needs separate functionality for efficient processing simultaneously and scalable for distributed computation functionality. This paper focuses on exploring the possibilities and limitations in using the Sympathy for Data platform in various data analytic scenarios within the Volvo Cars vision and strategy. This project re-writes the CDE workflow for over 300 nodes into pure Python script code and make it executable on the Apache Spark and Dask infrastructure. We explore and compare both distributed computing frameworks implemented on Amazon Web Service EC2 used for 4 machine with a 4x type for distributed cluster measurement. However, the benchmark results show that Spark is superior to Dask from performance perspective. Apache Spark and Dask will combine with Sympathy for Data products for a Big Data processing engine to optimize the system disk and CPU IO utilization. There are several challenges when using Spark and Dask to analyze large-scale scientific data on systems. For instance, parallel file systems are shared among all computing machines, in contrast to shared-nothing architectures. Moreover, accessing data stored in commonly used scientific data formats, such as HDF5 is not tentatively supported in Spark. This report presents research carried out on the next generation of Big Data platforms in the automotive industry called "Sympathy for Data". The research questions focusing on improving the I/O performance and scalable distributed function to promote Big Data analytics. During this project, we used the Dask.Array parallelism features for interpretation the data sources as a raster shows in table format, and Apache Spark used as data processing engine for parallelism to load data sources to memory for improving the big data computation capacity. The experiments chapter will demonstrate 640GB of engineering data benchmark for single node and distributed computation mode to evaluate the Sympathy for Data Disk CPU and memory metrics. Finally, the outcome of this project improved the six times performance of the original Sympathy for data by developing a middleware SparkImporter. It is used in Sympathy for Data for distributed computation and connected to the Apache Spark for data processing through the maximum utilization of the system resources. This improves its throughput, scalability, and performance. It also increases the capacity of the Sympathy for data to process Big Data and avoids big data cluster infrastructures.
Gli stili APA, Harvard, Vancouver, ISO e altri
45

Abrahamsson, Henrik. "Network overload avoidance by traffic engineering and content caching". Doctoral thesis, SICS, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:ri:diva-24104.

Testo completo
Abstract (sommario):
The Internet traffic volume continues to grow at a great rate, now driven by video and TV distribution. For network operators it is important to avoid congestion in the network and to meet service level agreements with their customers. This thesis presents work on two methods operators can use to reduce link loads in their networks: traffic engineering and content caching. The thesis studies access patterns for TV and video and the potential for caching. The investigation is done both by simulation and by analysis of logs from a large TV-on-Demand system over four months. The results show that a small set of programs accounts for a large fraction of the requests and that a comparatively small local cache can significantly reduce peak link loads during prime time. The investigation also demonstrates how the popularity of programs changes over time and shows that the access pattern in a TV-on-Demand system depends strongly on the content type. For traffic engineering, the objective is to avoid congestion in the network and to make better use of available resources by adapting the routing to the current traffic situation. The main challenge for traffic engineering in IP networks is to cope with the dynamics of Internet traffic demands. This thesis proposes L-balanced routings, which route the traffic on the shortest paths possible while making sure that no link is utilised to more than a given level L. L-balanced routing gives efficient routing of traffic and controlled spare capacity to handle unpredictable changes in traffic. We present an L-balanced routing algorithm and a heuristic search method for finding L-balanced weight settings for the legacy routing protocols OSPF and IS-IS. We show that the search and the resulting weight settings work well in real network scenarios.
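
A minimal sketch of the utilization check behind the L-balanced idea: route demands on weight-based shortest paths and verify that no link exceeds level L. The topology, weights, capacities, demands, and value of L below are invented, and the sketch does not include the heuristic weight search itself.

```python
import networkx as nx

# Toy topology: each edge carries an IGP weight and a capacity (Mbit/s); values are illustrative.
G = nx.DiGraph()
edges = [("A", "B", 1, 100), ("B", "C", 1, 100), ("A", "C", 3, 100),
         ("C", "D", 1, 100), ("B", "D", 2, 100)]
for u, v, w, cap in edges:
    G.add_edge(u, v, weight=w, capacity=cap, load=0.0)

# Demand matrix: (source, destination, Mbit/s).
demands = [("A", "D", 40), ("A", "C", 30), ("B", "D", 50)]

# Route every demand on its weight-based shortest path and accumulate link loads.
for s, t, volume in demands:
    path = nx.shortest_path(G, s, t, weight="weight")
    for u, v in zip(path, path[1:]):
        G[u][v]["load"] += volume

# A weight setting is L-balanced if every link stays below utilization L (here L = 0.7).
L = 0.7
worst = max(d["load"] / d["capacity"] for _, _, d in G.edges(data=True))
print("max utilization:", worst, "L-balanced:", worst <= L)
```

A weight-search heuristic would repeat this evaluation for candidate OSPF/IS-IS weight settings and keep those that pass the check.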
Gli stili APA, Harvard, Vancouver, ISO e altri
46

Chen, Jiawen (Jiawen Kevin). "Efficient data structures for piecewise-smooth video processing". Thesis, Massachusetts Institute of Technology, 2011. http://hdl.handle.net/1721.1/66003.

Testo completo
Abstract (sommario):
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2011.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 95-102).
A number of useful image and video processing techniques, ranging from low level operations such as denoising and detail enhancement to higher level methods such as object manipulation and special effects, rely on piecewise-smooth functions computed from the input data. In this thesis, we present two computationally efficient data structures for representing piecewise-smooth visual information and demonstrate how they can dramatically simplify and accelerate a variety of video processing algorithms. We start by introducing the bilateral grid, an image representation that explicitly accounts for intensity edges. By interpreting brightness values as Euclidean coordinates, the bilateral grid enables simple expressions for edge-aware filters. Smooth functions defined on the bilateral grid are piecewise-smooth in image space. Within this framework, we derive efficient reinterpretations of a number of edge-aware filters commonly used in computational photography as operations on the bilateral grid, including the bilateral filter, edge-aware scattered data interpolation, and local histogram equalization. We also show how these techniques can be easily parallelized onto modern graphics hardware for real-time processing of high definition video. The second data structure we introduce is the video mesh, designed as a flexible central data structure for general-purpose video editing. It represents objects in a video sequence as 2.5D "paper cutouts" and allows interactive editing of moving objects and modeling of depth, which enables 3D effects and post-exposure camera control. In our representation, we assume that motion and depth are piecewise-smooth, and encode them sparsely as a set of points tracked over time. The video mesh is a triangulation over this point set and per-pixel information is obtained by interpolation. To handle occlusions and detailed object boundaries, we rely on the user to rotoscope the scene at a sparse set of frames using spline curves. We introduce an algorithm to robustly and automatically cut the mesh into local layers with proper occlusion topology, and propagate the splines to the remaining frames. Object boundaries are refined with per-pixel alpha mattes. At its core, the video mesh is a collection of texture-mapped triangles, which we can edit and render interactively using graphics hardware. We demonstrate the effectiveness of our representation with special effects such as 3D viewpoint changes, object insertion, depth-of-field manipulation, and 2D to 3D video conversion.
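
A minimal sketch of the bilateral grid idea as summarized above: splat pixels into a coarse (x, y, intensity) grid, blur the grid, then slice it back at the original pixels. The grid resolution, sigmas, and the use of NumPy/SciPy are assumptions for illustration rather than the thesis implementation, which targets graphics hardware.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def bilateral_grid_filter(img, sigma_s=16, sigma_r=0.1):
    """Approximate bilateral filtering of a grayscale image in [0, 1] via a bilateral grid."""
    h, w = img.shape
    gh = int(np.ceil(h / sigma_s)) + 3
    gw = int(np.ceil(w / sigma_s)) + 3
    gr = int(np.ceil(1.0 / sigma_r)) + 3

    data = np.zeros((gh, gw, gr))    # accumulated intensities
    weight = np.zeros((gh, gw, gr))  # homogeneous coordinate (pixel counts)

    ys, xs = np.mgrid[0:h, 0:w]
    gy = np.round(ys / sigma_s).astype(int) + 1
    gx = np.round(xs / sigma_s).astype(int) + 1
    gz = np.round(img / sigma_r).astype(int) + 1

    # Splat: every pixel contributes to one grid cell indexed by position and intensity.
    np.add.at(data, (gy, gx, gz), img)
    np.add.at(weight, (gy, gx, gz), 1.0)

    # Blur on the grid stays cheap because the grid is much smaller than the image.
    data = gaussian_filter(data, sigma=1.0)
    weight = gaussian_filter(weight, sigma=1.0)

    # Slice: trilinear interpolation of the blurred grid at each pixel's (y, x, intensity).
    coords = np.stack([ys / sigma_s + 1, xs / sigma_s + 1, img / sigma_r + 1])
    num = map_coordinates(data, coords, order=1)
    den = map_coordinates(weight, coords, order=1)
    return num / np.maximum(den, 1e-8)

# Usage on a synthetic grayscale image.
img = np.clip(np.random.default_rng(0).normal(0.5, 0.1, (128, 128)), 0, 1)
smoothed = bilateral_grid_filter(img)
```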
by Jiawen Chen.
Ph.D.
Gli stili APA, Harvard, Vancouver, ISO e altri
47

Jakubiuk, Wiktor. "High performance data processing pipeline for connectome segmentation". Thesis, Massachusetts Institute of Technology, 2015. http://hdl.handle.net/1721.1/106122.

Testo completo
Abstract (sommario):
Thesis: M. Eng. in Computer Science and Engineering, Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, February 2016.
"December 2015." Cataloged from PDF version of thesis.
Includes bibliographical references (pages 83-88).
By investigating neural connections, neuroscientists try to understand the brain and reconstruct its connectome. Automated connectome reconstruction from high-resolution electron microscopy is a challenging problem, as all neurons and synapses in a volume have to be detected. A cubic millimetre of high-resolution brain tissue takes roughly a petabyte of space, which state-of-the-art pipelines are unable to process to date. A high-performance, fully automated image processing pipeline is proposed. Using a combination of image processing and machine learning algorithms (convolutional neural networks and random forests), the pipeline constructs a 3-dimensional connectome from 2-dimensional cross-sections of a mammal's brain. The proposed system achieves a low error rate (comparable with the state of the art) and is capable of processing volumes hundreds of gigabytes in size. The main contributions of this thesis are multiple algorithmic techniques for 2-dimensional pixel classification with varying accuracy and speed trade-offs, as well as a fast object segmentation algorithm. The majority of the system is parallelized for multi-core machines and, with minor additional modification, is expected to work in a distributed setting.
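
A minimal sketch of one ingredient named in the abstract, per-pixel classification with a random forest; the hand-crafted features, synthetic image, and stand-in labels are assumptions for illustration, and the actual pipeline combines this with convolutional networks and a dedicated segmentation stage.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, sobel
from sklearn.ensemble import RandomForestClassifier

def pixel_features(img):
    """Simple per-pixel feature stack: raw intensity, two Gaussian blurs, gradient magnitude."""
    feats = [img, gaussian_filter(img, 1), gaussian_filter(img, 2)]
    feats.append(np.hypot(sobel(img, axis=0), sobel(img, axis=1)))
    return np.stack(feats, axis=-1).reshape(-1, len(feats))

# Synthetic EM-like section and made-up "membrane" labels, purely for illustration.
rng = np.random.default_rng(0)
img = rng.random((64, 64))
labels = (img > 0.7).astype(int).ravel()

clf = RandomForestClassifier(n_estimators=50, n_jobs=-1, random_state=0)
clf.fit(pixel_features(img), labels)

# Membrane-probability map for a new section; object segmentation would follow downstream.
new_img = rng.random((64, 64))
prob = clf.predict_proba(pixel_features(new_img))[:, 1].reshape(64, 64)
```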
by Wiktor Jakubiuk.
M. Eng. in Computer Science and Engineering
Gli stili APA, Harvard, Vancouver, ISO e altri
48

Nguyen, Qui T. "Robust data partitioning for ad-hoc query processing". Thesis, Massachusetts Institute of Technology, 2015. http://hdl.handle.net/1721.1/106004.

Testo completo
Abstract (sommario):
Thesis: M. Eng., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2015.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 59-62).
Data partitioning can significantly improve query performance in distributed database systems. Most proposed data partitioning techniques choose the partitioning based on a particular expected query workload or use a simple upfront scheme, such as uniform range partitioning or hash partitioning on a key. However, these techniques do not adequately address the case where the query workload is ad-hoc and unpredictable, as in many analytic applications. The HYPER-PARTITIONING system aims to fill that gap, by using a novel space-partitioning tree on the space of possible attribute values to define partitions incorporating all attributes of a dataset. The system creates a robust upfront partitioning tree, designed to benefit all possible queries, and then adapts it over time in response to the actual workload. This thesis evaluates the robustness of the upfront hyper-partitioning algorithm, describes the implementation of the overall HYPER-PARTITIONING system, and shows how hyper-partitioning improves the performance of both selection and join queries.
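
A minimal sketch of a workload-agnostic space-partitioning tree in the spirit of the abstract: split round-robin over all attributes so that a selection on any attribute can prune partitions. The split rule, leaf size, and pruning logic are simplified assumptions, not the HYPER-PARTITIONING algorithm itself.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Node:
    """One node of a kd-tree-style partitioning tree over all attributes of a dataset."""
    attribute: Optional[int] = None    # attribute index to split on (None for a leaf)
    threshold: Optional[float] = None  # split value
    left: Optional["Node"] = None
    right: Optional["Node"] = None
    rows: Optional[List[Tuple[float, ...]]] = None  # rows held by a leaf (a partition)

def build(rows, depth=0, leaf_size=4):
    """Round-robin over attributes and split at the median, so every attribute
    contributes to the partitioning even without a known query workload."""
    if len(rows) <= leaf_size:
        return Node(rows=rows)
    attr = depth % len(rows[0])
    rows = sorted(rows, key=lambda r: r[attr])
    mid = len(rows) // 2
    return Node(attribute=attr, threshold=rows[mid][attr],
                left=build(rows[:mid], depth + 1, leaf_size),
                right=build(rows[mid:], depth + 1, leaf_size))

def partitions_for_range(node, attr, lo, hi):
    """Return only the leaves (partitions) that can contain rows with attr in [lo, hi]."""
    if node.rows is not None:
        return [node]
    out = []
    if node.attribute != attr or lo < node.threshold:
        out += partitions_for_range(node.left, attr, lo, hi)
    if node.attribute != attr or hi >= node.threshold:
        out += partitions_for_range(node.right, attr, lo, hi)
    return out

# Toy dataset with two attributes; a selection on attribute 0 skips irrelevant partitions.
data = [(float(i), float(100 - i)) for i in range(32)]
tree = build(data)
hit_leaves = partitions_for_range(tree, attr=0, lo=5.0, hi=9.0)
print(len(hit_leaves), "of", 32 // 4, "partitions touched")
```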
by Qui T. Nguyen.
M. Eng.
Gli stili APA, Harvard, Vancouver, ISO e altri
49

Wanichworanant, Noppadol. "A traffic engineering approach employing genetic algorithms over MPLS networks". online access from Digital Dissertation Consortium access full-text, 2003. http://libweb.cityu.edu.hk/cgi-bin/er/db/ddcdiss.pl?3134009.

Testo completo
Gli stili APA, Harvard, Vancouver, ISO e altri
50

Moskaluk, John. "Arterial priority option for the TRANSYT-7F traffic-signal-timing program". Diss., Georgia Institute of Technology, 1987. http://hdl.handle.net/1853/19428.

Testo completo
Gli stili APA, Harvard, Vancouver, ISO e altri
