Dissertations / Theses on the topic 'Data requirements'

Consult the top 50 dissertations / theses for your research on the topic 'Data requirements.'

1

Meyer, Harald, and Dominik Kuropka. "Requirements for service composition." Universität Potsdam, 2005. http://opus.kobv.de/ubp/volltexte/2009/3309/.

2

Beckman, Joseph M. "Legal requirements of secure systems." Thesis, Kansas State University, 1985. http://hdl.handle.net/2097/9822.

3

Fuller, Sean P. "Satisfying naval low data mobile communication requirements." Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1997. http://handle.dtic.mil/100.2/ADA339164.

4

Holma, Erik. "Data Requirements for a Look-Ahead System." Thesis, Linköping University, Department of Electrical Engineering, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-10197.

Abstract:

Look-ahead cruise control uses recorded topographic road data combined with GPS positioning to control vehicle speed. The purpose is to save fuel without increasing travel time for a given road. This thesis explores the sensitivity of look-ahead systems to different disturbances. Two systems are investigated: one using a simple precalculated speed trajectory without feedback, and one based on a model predictive control scheme with dynamic programming as the optimization algorithm.

Defective input data such as inaccurate positioning, disturbed angle data, faults in mass estimation and wrong wheel radius are discussed, and errors in the environmental model of the systems are also investigated. Simulations over real road profiles are performed with two different quantizations of the road slope data. The results from quantizing the angle data are important, since quantization will be unavoidable in any implementation of a topographic road map.

The simulation results show that disturbing the fictive road profiles leads to quite large deviations from the optimal case, whereas for the recorded real road sections the differences are close to zero. Finally, conclusions are drawn on how large deviations from real-world data a look-ahead system can tolerate.
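To illustrate the quantization issue this abstract raises, the sketch below uniformly quantizes a synthetic road-slope profile and measures the altitude error accumulated over a 5 km section. The profile and step sizes are invented for illustration, not data from the thesis:

```python
# Minimal sketch: effect of quantizing road-slope data on reconstructed altitude.
# The slope profile and quantization step sizes are invented for illustration.
import math

def quantize(value, step):
    """Uniform quantization to the nearest multiple of `step`."""
    return round(value / step) * step

segment_m = 10.0                                                   # sample spacing
slopes = [0.002 * math.sin(i / 20.0) + 0.01 for i in range(500)]   # slope in radians

for step in (0.01, 0.005, 0.001):   # coarse to fine quantization steps (rad)
    alt_true = sum(math.tan(s) * segment_m for s in slopes)
    alt_q = sum(math.tan(quantize(s, step)) * segment_m for s in slopes)
    print(f"step={step:<6} altitude error over 5 km: {abs(alt_true - alt_q):.2f} m")
```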

5

Dulaney, D. R., Kurt J. Maier, and Phillip R. Scheuerman. "Data Requirements for Developing Effective Pathogen TMDLs." Digital Commons @ East Tennessee State University, 2005. https://dc.etsu.edu/etsu-works/2938.

6

Keerthi, Thomas. "Distilling mobile privacy requirements from qualitative data." Thesis, Open University, 2014. http://oro.open.ac.uk/40121/.

Abstract:
As mobile computing applications have become commonplace, it is increasingly important for them to address end-users' privacy requirements. Mobile privacy requirements depend on a number of contextual socio-cultural factors, to which mobility adds another level of contextual variation. However, traditional requirements elicitation methods do not sufficiently account for contextual factors and therefore cannot be used effectively to represent and analyse the privacy requirements of mobile end users. On the other hand, methods that investigate contextual factors tend to produce data which can be difficult to use for requirements modelling. To address this problem, we have developed a Distillation approach that employs a problem analysis model to extract and refine privacy requirements for mobile applications from raw data gathered through empirical studies involving real users. Our aim was to enable the extraction of mobile privacy requirements that account for relevant contextual factors while contributing to the software design and implementation process. A key feature of the distillation approach is a problem structuring framework called privacy facets (PriF). The facets in the PriF framework support the identification of privacy requirements from different contextual perspectives, namely actors, information, information-flows and places. The PriF framework also aids in uncovering privacy determinants and threats that a system must take into account in order to support the end-user's privacy. In this work, we first show the working of distillation using qualitative data taken from an empirical study of the social-networking practices of mobile users. As a means of validating distillation, a distinctly separate qualitative dataset from a location-tracking study is used; in both cases, the empirical studies relate to privacy issues faced by real users observed in their mobile environment.
7

Korziuk, Kamil, and Tomasz Podbielski. "Engineering Requirements for platform, integrating health data." Thesis, Blekinge Tekniska Högskola, Institutionen för tillämpad signalbehandling, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-16089.

Abstract:
In a world where people are increasingly on the move and the population is ageing significantly, new technologies try to meet human expectations as well as they can. The results of a survey conducted with elderly participants during a technology conference at Blekinge Institute of Technology showed that none of them had any kind of assistance in their home, although they would need it. This Master's thesis presents human health state monitoring with a focus on fall detection. Health care systems will not completely eliminate falls, but studying their causes can help prevent them. The thesis presents the integration of sensors for measuring vital parameters and human position, and the evaluation of the measured data. It is based on technologies compatible with the Arduino Uno and Arduino Mega microcontrollers, measurement sensors, and data exchange between a database, MATLAB/Simulink and a web page. Sensors integrated into one common system make it possible to examine the patient's health state and call for assistance in case of health decline or serious risk of injury. System efficiency was evaluated over many series of measurements. In the first phase, different filters were compared to choose the one with the best performance. Kalman filtering and a trim parameter for the accelerometer were used to obtain satisfying results and the final human fall detection algorithm. The acquired measurements and data evaluation showed that Kalman filtering reaches high performance and gives the most reliable results. In the second phase, sensor placement was tested. The collected data showed that human falls are correctly recognized by the system with high accuracy. The designed system can, as a result, measure human health and vital state: temperature, heartbeat, position and activity. Additionally, the system provides an online overview with the current health state, historical data and an IP camera preview when an alarm has been raised due to a poor health state.
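Since the abstract singles out Kalman filtering of accelerometer data as the basis of the fall detection algorithm, a generic sketch may help readers unfamiliar with the technique. The noise parameters, signal values and threshold below are invented for illustration; this is not the thesis's implementation:

```python
# Minimal 1-D Kalman filter smoothing an accelerometer magnitude signal,
# followed by a naive threshold-based fall check. All constants are
# illustrative assumptions, not values from the thesis.

def kalman_1d(measurements, q=0.05, r=0.8):
    """q: process noise variance, r: measurement noise variance."""
    x, p = measurements[0], 1.0   # initial state estimate and covariance
    estimates = []
    for z in measurements:
        p += q                    # predict: covariance grows by process noise
        k = p / (p + r)           # Kalman gain
        x += k * (z - x)          # update state with the innovation
        p *= (1 - k)              # shrink covariance after the update
        estimates.append(x)
    return estimates

accel = [9.8, 9.7, 9.9, 9.8, 25.0, 3.1, 9.6, 9.8]  # m/s^2, spike = possible fall
smoothed = kalman_1d(accel)
FALL_THRESHOLD = 12.0  # hypothetical
if max(smoothed) > FALL_THRESHOLD:
    print("possible fall detected, raising alarm")
```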
8

Mize, Dennis. "A Study of Requirements Volatility and Footprint Visualization Properties in Evolving Use Case Data Sets." NSUWorks, 2012. http://nsuworks.nova.edu/gscis_etd/251.

Abstract:
Current Requirements Engineering (RE) mechanisms for measuring Requirements Volatility (RV) track changes to software requirements through textual artifacts, primarily detailed requirements documents. These documents are difficult for most software system stakeholders to understand, making it almost impossible for them to gain a clear picture of how a change to a requirement will impact the system overall. Research in the area of RE visualizations has shown that graphically representing software information can communicate complex information about requirements to system stakeholders without requiring in-depth knowledge of RE technical documentation. This research used Footprint Visualizations (FVs) to graphically represent software requirements as they evolved over time and analyzed these FV image artifacts to determine RV ratings, successfully demonstrating the use of FV analysis to measure RV. A qualitative study compared the RV ratings determined with the FV-based analysis methods proposed in this work to RV ratings determined with traditional non-visual methods that relied on subject-matter-expert evaluation of a common requirements use case data set. The results expand the body of knowledge in Requirements Engineering Visualization by demonstrating new analysis methods for measuring volatility in requirements use cases as they evolve over the software development life cycle, helping system stakeholders understand the effects of requirement changes regardless of their level of training in technical requirements documentation.
9

Hedman, Per. "Requirement specification Editor : REQUIREMENTS EDITOR BASED ON CONTRACT THEORY." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-177130.

Abstract:
When developing new heavy vehicles today, increasingly advanced features are demanded. Much of the new functionality concerns machines performing tasks automatically to assist the driver. This leads to new risks, and as a result new functional safety standards have been created. ISO 26262 is a functional safety standard that today exists for ordinary passenger cars but has not yet come into force for trucks. According to the ISO 26262 standard, requirements must be traceable to other requirements as well as to the system architecture. At present there are several tools on the market that support the user when writing specifications, but our survey of these tools led us to conclude that all of them lacked something; in particular, none had good support for mapping between requirements and system architecture. In this thesis work, functionality for a tool that supports the user in various ways when writing requirements specifications was examined. Based on contract theory and the concept of ports that link requirements to the system architecture, the application ensures that there is a formal connection between the two. To test the suggested functionality, a prototype was developed. The result was successful: based on contract theory, we implemented and validated that ports can be used to create links between requirements and system architecture, as well as between requirements and requirements. We also validated that the chosen storage format, JSON, provides the implementer with strong enough support to save the requirements so that the data in the files can be decomposed and stored in the Neo4J database.
10

Birkes, Angela Yvette. "Multimedia data definition and requirements for construction applications." Diss., Georgia Institute of Technology, 1999. http://hdl.handle.net/1853/20930.

11

Madore, Bruno. "Reduced data requirements for phase-contrast MR angiography." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1997. http://www.collectionscanada.ca/obj/s4/f2/dsk2/tape16/PQDD_0014/NQ28005.pdf.

12

Hoffman, Beth Huhn. "Creating a data dictionary from a requirements specification." Thesis, Kansas State University, 1985. http://hdl.handle.net/2097/9850.

13

Parviainen, A. (Antti). "Product portfolio management requirements for product data management." Master's thesis, University of Oulu, 2014. http://urn.fi/URN:NBN:fi:oulu-201409021800.

Abstract:
In large organisations today the number of products is large, and it is challenging for senior management to maintain proper control and understanding of all of them. As the product is the most important aspect for an organisation to consider, senior management must be able to manage investments in products and follow the development of product-related indicators. Managing products as investments at portfolio level, where products are divided into a limited number of portfolios, is a way to achieve adequate control over product investments at senior management level. Product portfolio management is decision-making oriented: the goal is to make the best possible strategic and financial decisions when allocating constrained resources across the entire product portfolio. Product portfolio management aims to increase the strategic fit of chosen new product projects, balance the product portfolio, and maximize the value of the products. It is a constantly ongoing, cross-functional decision-making function that is present in all lifecycle states of the portfolios. In this research the product portfolios are seen as investments, mainly for internal use in a decision-making process. The product portfolios are items that are embodied in the case company's product data management system, and the product portfolios have their own lifecycle states. The approach in this research is constructive: a current state analysis of the case company is made, and based on this analysis and a literature review a construction is established. The research questions are: 1) What product structures are required in product data management systems to support product portfolio management practices? 2) What are the information elements and their lifecycle states, and what should they be in product data management systems to support product portfolio decisions? The results of this research are the current state analysis conducted in the case company and the construction of a product portfolio management structure and lifecycle states. In the construction a portfolio package is defined: the item used for embodying portfolios in the information systems. An information model for implementing the portfolio packages in the product data management system is introduced, together with a product structure for implementing the portfolio package. The relation between the lifecycle states of the portfolio package and those of other items in a product hierarchy is assessed in a nested lifecycle model. Two models, required and recommended, are suggested for the company to consider for managing the lifecycle of the portfolio package item. All results are validated from several perspectives.
14

Mahakala, Kavya Reddy. "Identifying Security Requirements using Meta-Data and Dependency Heuristics." University of Cincinnati / OhioLINK, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1543995518151544.

15

Uhland, Greg. "Synchronous Data Pathing: Synchronous Data Bandwidth Requirements and Its Impact on Telemetry Systems." International Foundation for Telemetering, 2014. http://hdl.handle.net/10150/577479.

Abstract:
ITC/USA 2014 Conference Proceedings / The Fiftieth Annual International Telemetering Conference and Technical Exhibition / October 20-23, 2014 / Town and Country Resort & Convention Center, San Diego, CA
With industry standard synchronous data, the clock is effectively over twice the rate of the data (Figure 1.). The resultant problem is increased synchronous infrastructure bandwidth requirements and/or costly system architectures designed to avoid transport of synchronous data. This paper will discuss a potential solution.
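To make the bandwidth claim concrete, here is a small arithmetic sketch (my illustration; the 10 Mbps rate is an assumed example, not a figure from the paper). For NRZ data the worst-case toggle pattern is alternating 1-0-1-0, one transition per bit, while the bit clock makes a full cycle every bit, so the clock's fundamental sits at twice the highest data fundamental, and the odd harmonics needed for clean clock edges push the requirement even higher:

```python
# Illustrative toggle-rate comparison for a synchronous NRZ link.
# The 10 Mbps figure is an assumption for the example.
bit_rate = 10e6                     # bits per second
data_fundamental = bit_rate / 2     # Hz, worst case: alternating 1010... pattern
clock_fundamental = bit_rate        # Hz, the clock completes one cycle per bit

print(f"data fundamental : {data_fundamental/1e6:.1f} MHz")
print(f"clock fundamental: {clock_fundamental/1e6:.1f} MHz "
      f"({clock_fundamental/data_fundamental:.0f}x the data)")
```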
16

Uhland, Greg. "Synchronous Data Pathing: Synchronous Data Bandwidth Requirements and Its Impact on Telemetry Systems." International Foundation for Telemetering, 2013. http://hdl.handle.net/10150/579513.

Abstract:
ITC/USA 2013 Conference Proceedings / The Forty-Ninth Annual International Telemetering Conference and Technical Exhibition / October 21-24, 2013 / Bally's Hotel & Convention Center, Las Vegas, NV
With industry standard synchronous data, the clock is effectively over twice the rate of the data (Figure 1.). The resultant problem is increased synchronous infrastructure bandwidth requirements and/or costly system architectures designed to avoid transport of synchronous data. This paper will discuss a potential solution.
17

Matulevičius, Raimundas. "Process Support for Requirements Engineering : A Requirements Engineering Tool Evaluation Approach." Doctoral thesis, Norwegian University of Science and Technology, Faculty of Information Technology, Mathematics and Electrical Engineering, 2005. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-576.

Abstract:

Requirements engineering (RE) tools are software tools which provide automated assistance during the RE process. However, RE practice relies on office tools rather than the RE-tools provided by various companies. Reasons for not using RE-tools include financial causes, but part of the problem also lies in the difficulty of evaluating such tools before acquisition. Hence, to support the completeness and effectiveness of RE-tool evaluation, a sound framework providing methodological guidelines is needed.

This work proposes an RE-tool evaluation approach (R-TEA), which provides a systematic way of assessing RE-tools using two evaluation frameworks. The framework for the functional RE-tool requirements consists of three dimensions: representation, agreement, and specification. The representation dimension deals with the degree of formality, where requirements are described using informal, semiformal and formal languages. The agreement dimension deals with the degree of agreement among project participants through communication means. The specification dimension deals with the degree of requirements understanding and completeness at a given moment. The second framework categorises the non-functional RE-tool features into process, product, and external requirements. Process requirements characterise constraints placed upon the user's work practice. Product requirements specify the desired qualitative characteristics of RE-tools. External requirements are derived from the user's internal and external environment.

Both frameworks are applied to a specification exemplar, the application of which initiates preparation of the requirements specification for RE-tool selection. The compatibility of the RE-tools with the specified RE-tool requirements is assessed using different evaluation techniques. The decision about RE-tool selection is made after summarising all the assessment results.

A prototype tool supporting the frameworks and R-TEA was developed. The R-TEA method was tested in a number of case studies. The findings report positive trends for the frameworks, the prototype and the R-TEA method.

18

Allegretti, Benjamin P. "Situational awareness data requirements for a combat identification network." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2000. http://handle.dtic.mil/100.2/ADA384312.

Abstract:
Thesis (M.S. in Information Technology Management)--Naval Postgraduate School, September 2000. Thesis advisor(s): Osmundson, John; Brinkley, Douglas E. Includes bibliographical references (p. 161). Also available online.
19

Taylor, Kevin. "Data requirements for the establishment of protected area networks." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape4/PQDD_0031/MQ64464.pdf.

20

CARVALHO, ELAINE ALVES DE. "HEURISTICS FOR DATA WAREHOUSE REQUIREMENTS ELICITATION USING PERFORMANCE INDICATORS." PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 2009. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=15136@1.

Abstract:
Organizations need to change and evolve, but for that they must make the right decisions, and companies are using Information Technology (IT) as a fundamental support for this decision making. An essential IT component for improving the decision-making process is the data warehouse. To fulfil its role well, the data warehouse must be well defined. Various approaches try to improve the task of identifying data warehouse requirements, but few explore the contributions of Business Process Engineering (BPE) to requirements gathering. This dissertation studies how to improve data warehouse requirements elicitation by using performance indicators allied to business processes. A set of heuristics is suggested, designed to guide the identification of performance measures and the discovery of data warehouse requirements. The heuristics are applied to a case to facilitate understanding of the approach suggested in this work.
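To give a flavour of the idea, the sketch below derives candidate data warehouse elements from a performance indicator tied to a business process. The indicator, the two rules and all names are hypothetical illustrations, not the dissertation's actual heuristics:

```python
# Hypothetical illustration: map a performance indicator to candidate
# data-warehouse facts and dimensions. All names are invented for the example.
indicator = {
    "name": "on-time delivery rate",
    "process": "order fulfilment",
    "formula": "deliveries_on_time / total_deliveries",
    "analysed_by": ["month", "region", "product line"],
}

# Rule 1: each measure appearing in the formula becomes a candidate fact.
facts = ["deliveries_on_time", "total_deliveries"]
# Rule 2: each analysis perspective becomes a candidate dimension.
dimensions = indicator["analysed_by"]

print(f"Indicator '{indicator['name']}' ({indicator['process']} process)")
print("candidate facts     :", facts)
print("candidate dimensions:", dimensions)
```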
21

Challa, Harshitha. "Multivariate Time-Series Data Requirements in Deep Learning Models." University of Cincinnati / OhioLINK, 2021. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1626356774254081.

22

Olthuis, Jorrit. "Verification of Formal Requirements through Tracing." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-289947.

Abstract:
Software development in the railway domain is governed by strict standards which aim to ensure safety. It is, for example, highly recommended to use formal methods when specifying requirements, and it is mandatory to have certain roles fulfilled by different people. A common technique is developing software tests for the requirements, but making sure that software requirements are properly described, interpreted and implemented by different people is a major challenge, and tests depend fully on the tester to cover all scenarios. Having more methods that simplify requirement tracing and depend less on the thoroughness of the tester would provide many benefits. This thesis investigates whether and how software tracing can be used to validate formal requirements of software. The goal is to perform trace validation such that it can complement more traditional verification techniques. When verifying formal requirements on traces, the detection of errors depends on the events in the traces; as a consequence, more traces provide a higher chance of detecting errors, eliminating the risk of the tester missing important cases. The presented verification approach first specifies requirements in linear temporal logic and converts this specification to a non-deterministic Büchi automaton, or to a finite state machine, which is also evaluated. Second, the approach describes several alternatives for collecting traces and how to link them to the formal specification. Last, the verification approach proposes an algorithm which takes the Büchi automaton and a trace and detects violations of the requirement. The validation approach is implemented in the form of multiple tools, and its operation is shown by means of a toy example that models a railway application whose requirements can be verified using the tools. The results are then used to show how these tools can be used in an actual railway application. Using these research outcomes and the stand-alone tool, an implementation in Trace Compass is created which can, just like the stand-alone tool, decide for each pair of trace and requirement whether the trace violates the requirement.
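For readers unfamiliar with the approach, here is a deliberately simplified sketch of checking a temporal requirement against a recorded trace. It uses finite-trace semantics for the LTL-style property G(request -> F grant) with invented events, rather than the thesis's Büchi-automaton construction, which handles infinite behaviours:

```python
# Simplified finite-trace check of the LTL-style property
# G(request -> F grant): every "request" event must be followed,
# somewhere later in the trace, by a "grant" event.
# Illustration only; the thesis's algorithm runs a Büchi automaton instead.

def holds_globally_eventually(trace, trigger, response):
    for i, event in enumerate(trace):
        if event == trigger and response not in trace[i + 1:]:
            return False   # a trigger with no later response violates the property
    return True

good_trace = ["idle", "request", "busy", "grant", "request", "grant"]
bad_trace = ["idle", "request", "busy", "grant", "request", "timeout"]

print(holds_globally_eventually(good_trace, "request", "grant"))  # True
print(holds_globally_eventually(bad_trace, "request", "grant"))   # False
```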
23

Levin, Lukas, and Christoffer Stjernlöf. "Automated Testing Toolkit Service : Software Requirements Specification." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-227859.

Abstract:
Frequent automated testing of software services is vital to speed up the development cycle and ensure upgrades do not break existing features. With a centralised testing service, it is also possible to catch errors at customer sites before they become severe enough that the customers (or in the end – regular people) start suffering from them. It also gives the customers an insight into how well their services are working at a predictable cost. When developing a larger software system such as an automated testing service toolkit, a requirements specification can drastically cut development costs at the expense of a larger up-front investment. We discover some of the immediately important requirements for the first version of such an automated testing toolkit.
24

Whelan, Peter Timothy. "CAD/CAM data base management systems requirements for mechanical parts." Diss., Georgia Institute of Technology, 1989. http://hdl.handle.net/1853/17692.

25

Penrose, Craig B. "Data requirements for availability based sparing the U.S. Marine Corps." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1998. http://handle.dtic.mil/100.2/ADA356861.

Abstract:
Thesis (M.S. in Information Technology Management)--Naval Postgraduate School, September 1998. Thesis advisor(s): Kevin R. Gue, Mark E. Nissen. Includes bibliographical references (p. 49-50). Also available online.
26

Ahlström, Daniel. "Minimizing memory requirements for deterministic test data in embedded testing." Thesis, Linköping University, Linköping University, Department of Computer and Information Science, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-54655.

Abstract:

Embedded and automated tests reduce maintenance costs for embedded systems installed in remote locations. Testing multiple components of an embedded system, connected on a scan chain, using deterministic test patterns stored in the system provides high fault coverage but requires large system memory. This thesis presents an approach to reduce test data memory requirements by the use of a test controller program, exploiting the observation that a system often contains multiple components of the same type. The program uses deterministic test patterns specific to each component type, stored in system memory, to create fully defined test patterns when needed. By storing deterministic test patterns per component type, the program can use the same patterns for multiple tests and several times within the same test. The program can also test parts of a system without affecting the normal functional operation of the remaining components and without increasing test data memory requirements. Two experiments were conducted to determine how much the approach reduces test data memory requirements. The results show up to 26.4% reduction of test data memory requirements for the ITC'02 SOC test benchmarks and, on average, 60% reduction for designs generated to gather statistical data. A toy sketch of the per-type pattern idea follows.
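The core idea, storing one deterministic pattern per component type and expanding it on demand, can be sketched in a few lines. The chain layout, patterns and don't-care convention below are invented for illustration and are not taken from the thesis:

```python
# Toy illustration: store one deterministic test pattern per component TYPE
# and assemble full scan-chain patterns on demand, instead of storing a
# fully defined pattern per component INSTANCE. All data is invented.
type_patterns = {
    "uart": "1101",
    "timer": "0011",
    "adc": "100110",
}

# A scan chain with repeated component types; only the type list is stored.
scan_chain = ["uart", "timer", "uart", "adc", "uart"]

def assemble_pattern(chain, patterns, under_test):
    """Concatenate per-type patterns; components not under test get
    don't-care bits ('X') so functional operation elsewhere is untouched."""
    return "".join(
        patterns[c] if c in under_test else "X" * len(patterns[c])
        for c in chain
    )

full = assemble_pattern(scan_chain, type_patterns, under_test={"uart"})
print(full)  # 1101XXXX1101XXXXXX1101

stored = sum(len(p) for p in type_patterns.values())
naive = sum(len(type_patterns[c]) for c in scan_chain)
print(f"stored bits: {stored}, naive per-instance storage: {naive}")
```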

27

Grohsschmiedt, Steffen. "Making Big Data Smaller : Reducing the storage requirements for big data with erasure coding for Hadoop." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-177201.

Abstract:
The amount of data stored in modern data centres is growing rapidly. Large-scale distributed file systems, which maintain the massive data sets in data centres, are designed to work with commodity hardware. Due to the quality and quantity of the hardware components in such systems, failures are considered normal events, and distributed file systems are therefore designed to be highly fault-tolerant. A common approach to achieving fault tolerance is redundancy: storing three copies of a file across different storage nodes, thereby increasing the storage requirements by a factor of three and further aggravating the storage problem. A concrete implementation of such a file system is the Hadoop Distributed File System (HDFS). This thesis explores the use of RAID-like mechanisms to decrease the storage requirements for big data. We designed and implemented a prototype that extends HDFS with a simple but powerful erasure coding API. Compared to existing approaches, we located the erasure-coding management logic in the HDFS NameNode, as this allows us to use internal HDFS APIs and state; because of that, we can repair failures associated with erasure-coded files more quickly and at lower cost. We evaluate our prototype, and we also show that the use of erasure coding instead of replication can greatly decrease the storage requirements of big data without sacrificing reliability and availability. Finally, we argue that our API can support a large range of custom encoding strategies, while adding the erasure coding logic to the NameNode can significantly improve the management of the encoded files.
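The storage saving behind this argument is simple arithmetic. The sketch below compares triple replication with a Reed-Solomon code; RS(10,4) is a common parameter choice picked here for illustration, not necessarily the thesis's configuration:

```python
# Storage overhead: 3x replication vs. a Reed-Solomon erasure code.
# RS(10, 4) (10 data blocks, 4 parity blocks) is an illustrative choice.
file_gb = 100

replication_factor = 3
replicated_gb = file_gb * replication_factor                        # 300 GB on disk

data_blocks, parity_blocks = 10, 4
encoded_gb = file_gb * (data_blocks + parity_blocks) / data_blocks  # 140 GB on disk

print(f"replication : {replicated_gb:.0f} GB (200% overhead)")
print(f"RS(10,4)    : {encoded_gb:.0f} GB (40% overhead), "
      f"tolerates any {parity_blocks} lost blocks per stripe")
```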
28

Pong, Lih, and 龐立. "Formal data flow diagrams (FDFD): a petri-netbased requirements specification language." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1985. http://hub.hku.hk/bib/B31207406.

29

Els, Zelda. "Data availability and requirements for flood hazard mapping in South Africa." Thesis, Stellenbosch : Stellenbosch University, 2011. http://hdl.handle.net/10019.1/17803.

Abstract:
Thesis (MSc)--Stellenbosch University, 2011.
Floods have been identified as one of the major natural hazards occurring in South Africa. A disaster risk assessment forms the first phase in planning for effective disaster risk management, identifying and assessing all hazards that occur within a geographical area, as required by the Disaster Management Act (Act No. 57 of 2002). The National Water Act (Act No. 36 of 1998) requires that flood lines be determined for areas where high-risk dams exist and where new town developments occur. However, very few flood hazard maps exist for rural areas in South Africa, and the data required for flood modelling analysis is very limited, particularly in rural areas. This study investigated whether flood hazard maps can be created using the existing data sources. A literature review of flood modelling methodologies, data requirements and flood hazard mapping was carried out, and all available flood-related data sources in South Africa were assessed. The most appropriate data sources were identified and used to assess an evaluation site. By combining GIS and hydraulic modelling, results were obtained that indicate the likely extent, frequency and depth of predicted flood events. The results indicate that hydraulic modelling can be performed using the existing data sources but that not enough data is available for calibrating and validating the model. The limitations of the available data are discussed and recommendations for the collection of better data are provided.
30

Rahman, Tasnim. "Optimization of Cross-Layer Network Data based on Multimedia Application Requirements." Digital WPI, 2019. https://digitalcommons.wpi.edu/etd-theses/1348.

Abstract:
This thesis proposes a convex network utility maximization (NUM) problem that can be solved to optimize a cross-layer network based on user- and system-defined requirements for the quality and link capacity of multimedia applications. The problem can also be solved in a distributed fashion using dual decomposition. Current techniques do not address the system's changing requirements for the network in addition to the user's requirements for an application when optimizing a cross-layer network, but rather focus on optimizing a dynamic network to conform to a real-time application or to a specific performance target. Optimizing the cross-layer network for the changing system and user requirements allows a more accurate optimization of the overall cross-layer network of any given multi-node, ad-hoc wireless application, so that data transmission quality and link capacity meet overall mission demands.
31

Copeland, David J., and Roberto M. Aleman. "USER RF REQUIREMENTS FOR A Ka-BAND DATA RELAY SATELLITE LINK." International Foundation for Telemetering, 1989. http://hdl.handle.net/10150/614542.

Abstract:
International Telemetering Conference Proceedings / October 30-November 02, 1989 / Town & Country Hotel & Convention Center, San Diego, California
The user G/T and EIRP requirements were determined for a data relay satellite link consisting of a forward link at up to 360 Mbps at 23 GHz and a return link at up to 2 Gbps at 26.5 GHz. Hardware for this data link would be a modular expansion of the NASA Data Link Module. Calculations were based on a data relay satellite model of predetermined characteristics patterned after the NASA Tracking and Data Relay Satellite (TDRS). The desired data rates could be achieved with a G/T of 21.7 dB/K (forward link) and an EIRP of 68.2 dBW (return link) for the user satellite. Hardware configurations meeting these requirements are discussed in terms of RF performance, efficiency, reliability, and modular flexibility. A planar array configuration emerges as the logical candidate for most NASA missions. Pertinent Ka-band technology and certain ongoing research efforts are reviewed. Areas of particular interest include new power device families, 0.25 µm low-noise HEMT technology, and fiber optic distribution and control of RF arrays.
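For context on how G/T and EIRP figures like these enter a link budget, here is a back-of-the-envelope C/N0 calculation. The slant range is assumed, losses are ignored, and the abstract's forward-link G/T is reused alongside the return-link EIRP purely for illustration; this is not the paper's actual budget:

```python
# Back-of-the-envelope link budget using the abstract's figures.
# Slant range is an assumed user-to-relay distance; all losses are ignored.
import math

eirp_dbw = 68.2          # user EIRP (return link), from the abstract
g_over_t = 21.7          # G/T in dB/K (the abstract's forward-link user value,
                         # reused here purely for illustration)
freq_hz = 26.5e9         # return-link frequency
range_m = 42e6           # assumed slant range, ~42,000 km
boltzmann_dbw = -228.6   # 10*log10(Boltzmann constant), dBW/K/Hz

fspl_db = 20 * math.log10(4 * math.pi * range_m * freq_hz / 3e8)
cn0_dbhz = eirp_dbw - fspl_db + g_over_t - boltzmann_dbw

print(f"free-space path loss: {fspl_db:.1f} dB")
print(f"C/N0: {cn0_dbhz:.1f} dB-Hz")
print(f"Eb/N0 at 2 Gbps: {cn0_dbhz - 10*math.log10(2e9):.1f} dB")
```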
32

Williams, Gbolahan. "Architecting tacit information in conceptual data models for requirements process improvement." Thesis, King's College London (University of London), 2013. https://kclpure.kcl.ac.uk/portal/en/theses/architecting-tacit-information-in-conceptual-data-models-for-requirements-process-improvement(2d3369c6-4387-4b69-b625-c9d36705bfac).html.

Abstract:
Despite extensive work in the field of Requirements Engineering, ineffective requirements remain a major antecedent to the failure of projects. Requirements Engineering (RE) refers to the body of methods associated with elucidating the needs of a client when considering the development of a new system or product. In the literature, challenges in RE have been mainly attributed to insufficient client input, incomplete requirements, evolving requirements and lack of understanding of the domain. Accordingly, this has raised the need for methods of effectively eliciting, analysing and recording requirements. Promising methods have been proposed for using ethnography to improve elicitation, because of its strong qualitative and quantitative qualities in understanding human activities. There has also been success with the use of Model Driven Engineering techniques for analysing, recording and communicating requirements through Conceptual Data Models (CDMs), which provide a shared understanding of the domain of a system. However, little work has attempted to integrate these two areas, either from an empirical or a theoretical perspective. In this thesis, we investigate how ethnographic research methods contribute to a method for data analysis in RE. Specifically, we consider the proposition that a CDM based on explicit and implicit information derived from ethnographic elicitation will lead to design solutions that more closely match the expectations of clients. As a result of our investigation, this thesis presents the following key contributions: (i) the introduction of an ethnographic approach to RE for elicitation and verification; (ii) a rich CDM metamodel and modeling language necessary for defining and recording ethnographic analyses based on implicit and explicit information; (iii) a method for mapping CDMs to high-level architectural abstractions called ecologies. To complement this work, an evaluation case study demonstrates a real-world application of this work.
33

Unterkalmsteiner, Michael. "Coordinating requirements engineering and software testing." Doctoral thesis, Blekinge Tekniska Högskola, Institutionen för programvaruteknik, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-663.

Abstract:
The development of large, software-intensive systems is a complex undertaking that is generally tackled by a divide-and-conquer strategy. Organizations thereby face the challenge of coordinating the resources which enable the individual aspects of software development, commonly solved by adopting a particular process model. The alignment between requirements engineering (RE) and software testing (ST) activities is of particular interest, as these two aspects are intrinsically connected: requirements are an expression of user/customer needs, while testing increases the likelihood that those needs are actually satisfied. The work in this thesis is driven by empirical problem identification, analysis and solution development towards two main objectives. The first is to develop an understanding of RE and ST alignment challenges and characteristics. Building this foundation is a necessary step that facilitates the second objective, the development of solutions relevant and scalable to industry practice that improve REST alignment. The research methods employed to work towards these objectives are primarily empirical. Case study research is used to elicit data from practitioners, while technical action research and field experiments are conducted to validate the developed solutions in practice. This thesis contains four main contributions: (1) an in-depth study of REST alignment challenges and practices encountered in industry; (2) a conceptual framework in the form of a taxonomy providing constructs that further our understanding of REST alignment; (3) REST-bench, an assessment framework that operationalizes the taxonomy and was designed to be lightweight and applicable as a postmortem when closing development projects; (4) an extensive investigation into the potential of information retrieval techniques to improve test coverage, a common REST alignment challenge, resulting in a solution prototype, risk-based testing supported by topic models (RiTTM). REST-bench has been validated in five cases and has shown to be efficient and effective in identifying improvement opportunities in the coordination of RE and ST. Most of the concepts operationalized from the REST taxonomy were found to be useful, validating the conceptual framework. RiTTM, on the other hand, was validated in a single case experiment where it showed great potential, in particular by identifying test cases that were originally overlooked by expert test engineers, effectively improving test coverage.
34

Killander, Christoffer, and Jonas Modling. "Requirements for Aconex Map Interface." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-186420.

Abstract:
This thesis describes the process of identifying key requirements for a map-based interface for document management software. Existing software platforms suitable for implementing such a system were identified, evaluated and tested. The purpose of this thesis was to provide Aconex, a document management company, with a recommendation for implementing a map-based interface to its software for its customer Bechtel. Requirements gathering was done in stages: identifying stakeholders, understanding context, defining requirements and finally creating software specifications. Suitable platforms were identified through literature study and consultation, and were selected based on a few critical criteria. The selected platforms were used to implement a prototype showcasing each platform's ability to satisfy the technical requirements. It turned out that Google Maps provided the best platform for Aconex's needs. OpenLayers was a strong alternative and offered the ability to use an unlimited number of KML layers, but at the disadvantage of requiring more code.
35

Emerson, Glen D. "Projected performance requirements for personnel entering information processing jobs for the federal government /." Full-text version available from OU Domain via ProQuest Digital Dissertations, 1985.

36

Mitchell, Mark W. "Evaluation of the agricultural field scale irrigation requirement simulation (AFSIRS) in predicting golf course irrigation requirements with site-specific data." [Gainesville, Fla.] : University of Florida, 2004. http://purl.fcla.edu/fcla/etd/UFE0007360.

37

Rush, David, F. W. (Bill) Hafner, and Patsy Humphrey. "DEVELOPMENT OF A REQUIREMENTS REPOSITORY FOR THE ADVANCED DATA ACQUISITION AND PROCESSING SYSTEM (ADAPS)." International Foundation for Telemetering, 1999. http://hdl.handle.net/10150/607313.

Abstract:
International Telemetering Conference Proceedings / October 25-28, 1999 / Riviera Hotel and Convention Center, Las Vegas, Nevada
Standards lead to the creation of requirements listings, and test verification matrices allow developer and acquirer to assure themselves and each other that the requested system is actually what is being constructed. Further, within the intricacy of the software test description, traceability of the test process to the requirement under test is mandated so that the acceptance test process can be accomplished in an efficient manner. In the view of the logistician, the maintainability of the software and the repair of found faults is primary, while these statistics can be gathered by the producer to ultimately enhance the Capability Maturity Model (CMM) rating of the vendor.
38

Pong, Lih. "Formal data flow diagrams (FDFD) : a petri-net based requirements specification language /." [Hong Kong : University of Hong Kong], 1985. http://sunzi.lib.hku.hk/hkuto/record.jsp?B12323019.

39

Bew, M. D. "Engineering better social outcomes through requirements management & integrated asset data processing." Thesis, University of Salford, 2017. http://usir.salford.ac.uk/42341/.

Abstract:
The needs of society are no longer serviceable using the traditional methods of infrastructure providers and operators. Urbanisation, pressure on global resources, population growth and migration across borders are placing new demands which traditional methods can no longer adequately serve. The emergence of data and digital technology has enabled new skills to emerge, offered new possibilities, and set much higher expectations for a younger generation who have only known life in the digital age. The data describing the physical properties of built assets is well understood, and digital methods such as Building Information Modelling are providing levels of access and quality historically unknown. The concepts of human perception are not so well understood, with research only being documented over the last forty years or so, but the understanding of human needs and the impact of poor infrastructure and services has now been linked to poor perception and social outcomes. This research has developed and instantiated a methodology which uses data from the delivery and operational phases of a built asset and, with the aid of an understanding of the user community's perceptions, creates intelligence that can optimise the asset's performance for the benefit of its users. The instantiation was accomplished by experiment in an educational environment, using the "Test Bench" to gather physical asset data and social perception data, and using analytics to implement comparative measurements and double-loop feedback to identify actionable interventions. The scientific contributions of this research are the identification of methods which provide valuable and effective relationships between physical and social data to yield "actionable" interventions for performance improvement, and the instantiation of this discovery through the development and application of the "Test Bench". The major implication has been to develop a testable relationship between social outcomes and physical assets which, with further development, could provide a valid challenge to the least-cost build option taken by the vast number of asset owners, by better understanding the full implications for people's perceptions and social outcomes. The cost of operational staff and resources rapidly outweighs the cost of assets, and the right environment can motivate staff effectively and improve, rather than inhibit, performance and social outcomes.
40

Poolla, Venkata Sai Abhishek, and Bhargav Krishna Mandava. "Understanding the Challenges and Needs of Requirements Engineering for Data-centric Systems." Thesis, Blekinge Tekniska Högskola, Institutionen för programvaruteknik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-21108.

Abstract:
Background: As technology advances day by day, people produce enormous volumes of data. This exceptional growth in data is leading to an increase in the development of intelligent systems that make use of the huge amount of data available. We group such intelligent software systems under the term "Data-Centric Systems (DCS)". Such systems include the AI/ML components in a self-driving car, recommender systems, and many more. Developing DCS is complex within the software development life cycle; one of the main reasons behind this complexity is the ineffective handling of requirements. Moreover, the literature suggests that a large percentage (48%) of development problems begin during the requirements phase, and fixing requirements-related problems consumes a high cost of rework in later stages of software development. To design DCS effectively, RE techniques are considered one of the essential factors, since they must reconcile a system's functional and implementation expectations from two entirely different perspectives, those of customers and developers. Although RE frequently plays a critical role in DCS, little is known about how RE can effectively be incorporated into the development of such systems. Objectives: This thesis aims to understand industry experiences in the development of DCS with the main emphasis on RE, to investigate the techniques/approaches used in designing DCS during the RE process, and to identify the challenges practitioners face during development. Methods: Two workshop-style interviews were conducted to identify the design process of RE and the challenges practitioners face during DCS development. To complement the results from the workshops and to scale up the target population, an online survey was conducted. Results: From the workshops, we identified that no single stakeholder is explicitly responsible for the RE phase of DCS; in agile development, decisions are taken collectively, and the role varies depending on the type of project the stakeholder is involved in. Four categories of requirements were identified, namely regulatory, infrastructure, data, and safety-and-security requirements. Techniques/approaches used to elicit, document, analyse and validate the requirements were identified. Based on the data received, we identified ten challenges faced by practitioners during DCS development. The categorisation and the techniques/approaches used for RE were prioritised by the number of responses recorded in the survey. A total of 15 themes were generated for the challenges, based on the responses received from participants. Conclusions: To conclude, a specific RE architecture needs to be designed to help practitioners during the development of DCS. We believe that the analysis of these insights provides the industry with a structured overview of the DCS development process in RE. Besides, this thesis allows the academic community to steer future research based on an understanding of industry needs in RE for DCS.
APA, Harvard, Vancouver, ISO, and other styles
41

Panditpautra, Rishi Ashwin. "Requirements Engineering and Software Development Process of an A-SMGCS Earth Magnetic Field Sensor Data Playback and Basic Analysis Tool." Master's thesis, Universitätsbibliothek Chemnitz, 2017. http://nbn-resolving.de/urn:nbn:de:bsz:ch1-qucosa-225243.

Full text
Abstract:
Advanced Surface Movement Guidance and Control Systems (A-SMGCS) help to further improve the safety and efficiency of traffic on the aerodrome surface. The current A-SMGCS sensor technologies have certain operational and functional limitations. A new and unprecedented sensor technology is being tested as a pilot project. This unique sensor is called MagSense®. It works on the principle of detecting the influence of ferromagnetic materials on the Earth's magnetic field. For applications in the aviation environment, learning processes are necessary; these are generally based on the graphical depiction of stored sensor data and on features for analyzing the graphs. For this purpose a visualization and analysis tool is needed. In order to create an adequate tool for depicting stored sensor data and the peaks caused by ferromagnetic objects in aircraft and vehicles, a requirements engineering process will be conducted wherein the requirements of the various stakeholders will be identified and harmonized. In general, the appropriate RE approach will ensure mutual agreement among the stakeholders and a set of requirements for the first edition of the tool without contradictions. The harmonized package of requirements will then be used as the starting point for a software development process, after which the tool will be produced as specified and validated as part of this Master's thesis. This Master's thesis puts a special focus on the choice of a suitable method for requirements engineering and requirements management, adequately adapted to the size and quality of the project. The selection of appropriate elements from the methodology, as well as the outcomes of applying them to a specific software production project, are at the core.
APA, Harvard, Vancouver, ISO, and other styles
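The analysis task sketched in this abstract, spotting the peaks that ferromagnetic objects leave in a stored magnetometer trace, can be illustrated in a few lines. The sketch below is hypothetical: the trace, field values, and thresholds are invented, and it is neither MagSense® data nor the tool's actual algorithm.

```python
import numpy as np
from scipy.signal import find_peaks

# Hypothetical magnetometer trace: a quiet geomagnetic baseline plus two
# disturbances as ferromagnetic objects pass the sensor. Sampling rate,
# field values, and thresholds are all invented for illustration.
rng = np.random.default_rng(seed=1)
t = np.arange(0.0, 10.0, 0.01)                   # 10 s sampled at 100 Hz
baseline = 48_000.0                              # ambient field in nanotesla
trace = baseline + rng.normal(0.0, 5.0, t.size)  # sensor noise
for centre in (3.0, 7.0):                        # two simulated passages
    trace += 200.0 * np.exp(-((t - centre) ** 2) / 0.05)

# Detect passages as peaks rising well above the noise floor; `distance`
# keeps two samples of the same passage from counting twice.
peaks, props = find_peaks(trace, height=baseline + 50.0, distance=100)
for idx, height in zip(peaks, props["peak_heights"]):
    print(f"object detected at t = {t[idx]:.2f} s, field near {height:.0f} nT")
```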
42

Ahmed, Saqib, and Bilal Ahmad. "Transforming Requirements to Ontologies." Thesis, Jönköping University, Tekniska Högskolan, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-50048.

Full text
Abstract:
Capturing a client's needs and expectations for a product or service is an important problem in software development. Software requirements are normally captured in natural language and are mostly unstructured, which makes it difficult to automate the process of going from software requirements to executable code. A big hurdle in this process is the lack of consistency and standardization in the representation of software requirements. Thus, the aim of the thesis is to present a method for transforming natural language requirement text into an ontology. It is easy to store and retrieve information from an ontology, as it is a semantic model, and it is also easy to infer new knowledge from it. As is clear from the aim of this work, the main component of our research was software requirements, so there was a need to investigate and decide on the types of requirements in order to define the scope of this research. We selected the INCOSE guidelines as a benchmark to scrutinize the properties we desired in the natural language requirements. These natural language requirements were used in the form of user stories as the input to the transformation process. We selected a combination of two methods for our research, i.e. literature review and design science research. The reason for selecting these methods was to obtain a good grip on the existing work in this field and then to combine the knowledge to propose new rules for the requirements-to-ontology transformation. We studied different domains during the literature review, such as requirements engineering, ontologies, natural language processing, and information extraction. The gathered knowledge was then used to propose the rules and the flow of their implementation. This proposed system was named "Reqtology". Reqtology defines the process from taking the requirements in the form of user stories, to extracting the useful information based on the rules, to classifying that information so that it can be used to form ontologies. The workflow consists of a six-step process which starts from input text in the form of user stories and at the end provides entities which can be used to form ontologies.
APA, Harvard, Vancouver, ISO, and other styles
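The extract-then-classify step that Reqtology's abstract describes can be illustrated with a minimal sketch. It assumes the common "As a <role>, I want <goal> so that <benefit>" user-story template; the single regex rule and the entity labels are invented for illustration and are not the thesis's actual rule set.

```python
import re

# One illustrative extraction rule: the common "As a <role>, I want <goal>
# so that <benefit>" user-story template. Reqtology's actual rules are more
# elaborate; this only sketches the extract-then-classify idea.
STORY_PATTERN = re.compile(
    r"As an? (?P<role>.+?), I want(?: to)? (?P<goal>.+?)"
    r"(?: so that (?P<benefit>.+?))?\.?$",
    re.IGNORECASE,
)

def extract_entities(user_story: str) -> dict:
    """Extract candidate ontology entities from a single user story."""
    match = STORY_PATTERN.match(user_story.strip())
    if match is None:
        return {}
    # Classify the extracted spans into labels that could later become
    # ontology classes and relations.
    return {
        "Actor": match.group("role"),
        "Action": match.group("goal"),
        "Rationale": match.group("benefit"),
    }

story = "As a librarian, I want to search the catalogue so that I can locate a thesis."
print(extract_entities(story))
# {'Actor': 'librarian', 'Action': 'search the catalogue',
#  'Rationale': 'I can locate a thesis'}
```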
43

Majidi, Arghavan. "Design and Implementation of Requirements Handler over the farkle and Optima." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-147545.

Full text
Abstract:
This Master's thesis is part of an ongoing project called industrial Framework for Embedded Systems Tools, iFEST. iFEST is an EU-funded project for developing a tool integration framework in order to facilitate hardware and software co-design as well as life-cycle aspects for the development of embedded systems. This reduces engineering life-cycle costs and time-to-market for complex embedded system projects. The Test Manager is part of the testing framework that invokes test cases for testing functionalities of other components of the system, namely "Optima" and "Farkle". When a test is done, the Test Manager gets the test results from the system and returns them to the tester. The final implementation and integration of the Test Manager is not within the scope of this thesis work. However, a pilot version of the Test Manager was implemented as a working prototype to get stakeholders' feedback and validate the initial requirements and design. After iterating on requirement factors and finding criteria for the optimum design, different design alternatives went through an AHP (Analytic Hierarchy Process) decision-making process to arrive at an ideal design model. The aforementioned process was applied to four different aspects of the design model: the integration model, the choice of programming language, the choice between a web and a desktop user interface, and the choice of database system. For each of these four choices, different options are presented in the literature study. The final design model is the outcome of the AHP analysis.
This Master's thesis is part of an ongoing project called industrial Framework for Embedded Systems Tools: iFEST. iFEST is an EU-funded project for developing a framework for tool integration in order to facilitate simultaneous design of both hardware and software, as well as life-cycle aspects for the development of embedded systems. This leads to reduced engineering life-cycle costs and time-to-market in complex embedded system projects. The Test Manager is part of a testing framework that invokes tests to test the functions of other components of the system, such as "Optima" and "Farkle". When the test is done, the Test Manager receives the test results from the system and returns them to the human tester. The final implementation and integration of the Test Manager is not within the scope of this thesis. However, a pilot version of the Test Manager was implemented as a working prototype to obtain stakeholders' feedback and validate the initial requirements and design. After iterating on requirement factors and searching for criteria for the optimal design, different design alternatives went through an AHP-based decision process to arrive at an ideal design model. The aforementioned process was applied to four different aspects of the design model: integration models, the choice of programming language, the choice between a web or a dedicated user interface, and the choice of database system. For each of these four aspects, different options are presented in the literature study. The final design model is the outcome of the AHP analysis.
APA, Harvard, Vancouver, ISO, and other styles
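The AHP step mentioned in this abstract follows a standard recipe: pairwise comparisons on Saaty's 1-9 scale are collected into a matrix whose principal eigenvector, normalised to sum to one, gives the priority weights. A minimal sketch follows; the comparison values are invented for illustration and are not the thesis's actual judgments.

```python
import numpy as np

# Hypothetical pairwise comparison matrix on Saaty's 1-9 scale for three
# candidate designs; A[i][j] > 1 means design i is preferred over design j.
# The judgments are invented for illustration, not the thesis's own.
A = np.array([
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
])

# Priority weights are the principal eigenvector of A, normalised to sum to 1.
eigenvalues, eigenvectors = np.linalg.eig(A)
k = np.argmax(eigenvalues.real)
weights = eigenvectors[:, k].real
weights = weights / weights.sum()

# Saaty's consistency check: a ratio well below 0.1 means the pairwise
# judgments do not contradict each other too badly.
n = A.shape[0]
lambda_max = eigenvalues.real[k]
consistency_index = (lambda_max - n) / (n - 1)
random_index = 0.58  # Saaty's tabulated value for n = 3
consistency_ratio = consistency_index / random_index

print("weights:", np.round(weights, 3))   # roughly [0.648, 0.230, 0.122]
print("consistency ratio:", round(consistency_ratio, 3))
```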
44

Fry, Andrew J. "Aspects of measurement validation." Thesis, University of Oxford, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.343358.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Welmers, Laura Hazel. "The implementation of an input/output consistency checker for a requirements specification document." Thesis, Kansas State University, 1985. http://hdl.handle.net/2097/9889.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Gustavii, Christer. "MODELLING WITH COMPETENCE REQUIREMENTS IN ENTERPRISING AND ORGANISATIONS." Thesis, University of Skövde, Department of Computer Science, 1997. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-253.

Full text
Abstract:

The purpose of this report is to give some suggestions for how to carry out competence analysis in enterprises and organisations by using principles and techniques from requirements engineering. The basic idea of this report is that competence development has much in common with information systems development, and I have tried to give some good examples of such similarities. The Enterprise Modelling methodology, developed in the ESPRIT project 6612 From Fuzzy to Formal (F3), constitutes the base on which my discussion stands. The model is based on a method for business modelling developed by the Swedish Institute for Systems Development (SISU); see further (Persson, 1997). The focus of the report is on modelling with the components of IT competence and the elicitation of education goals. I give considerable weight to the experiences from the practical project I carried out within the frame of this examination paper. The project was commissioned by Chalmers tekniska högskola; its purpose was to stipulate the IT-competence requirements for the employees of the university's administration department.

The conclusion from the project is that Enterprise Modelling can be used to carry out competence analysis. The way of using Enterprise Modelling and the changes needed are discussed in the report. Furthermore, the concept of competence is discussed from both a general perspective and an IT perspective.

APA, Harvard, Vancouver, ISO, and other styles
47

Güratan, Işıl Aytaç Sıtkı. "The Design and development of a data warehouse using sales database and requirements of a retail group/." [s.l.]: [s.n.], 2005. http://library.iyte.edu.tr/tezler/master/bilgisayaryazilimi/T000414.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Morrow, Katrina. "Photoacclimation in phytoplankton : the requirements of models and the limitations of the data." Thesis, University of Essex, 2010. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.528856.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Kim, Suduck. "The difference in BIM component data requirements between prescriptive representations and actual practices." Thesis, Virginia Tech, 2015. http://hdl.handle.net/10919/56476.

Full text
Abstract:
Utilizing Building Information Modeling (BIM) for Facility Management (FM) can reduce interoperability costs during the Operations and Maintenance (O&M) phase by improving data management. However, there are technological, process-related, and organizational barriers to successful implementation of BIM-integrated FM (BIM-FM), and the process-related barriers might be addressed by BIM-FM guidelines. The guidelines, in turn, need to be updated with lessons learned from actual practices in order to maintain their validity. In order to diagnose current practices and identify key differences between prescriptive representations and actual practices, this exploratory research compares BIM component data requirements between guidelines and actual practices at public higher education institutions in Virginia. A gap in BIM component data requirements between the guidelines and actual practices may prevent successful implementation of BIM-FM. This research is composed of three parts: a synthesis of prescriptive representations, a determination of actual data requirements in practice, and a comparison of the differences between guidelines and practices. Data were collected through document analysis and a case study conducted via document analysis and in-person interviews. A direct comparison was then conducted to test the research question. Though the researcher rejected the established hypothesis of 'There would be some differences in BIM component data requirements between prescriptive representations and actual practices', owing to the difference in the level of information and detail between prescriptive representations and actual practices, this exploratory research provides useful information.
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
50

Ström, Fabian. "Quantitative Requirements Testing for Autonomous Systems-of-Systems : Case Studies in Platooning." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-251659.

Full text
Abstract:
A platoon, or road train, has many benefits, such as safety, fuel efficiency, and road utilisation. However, it must also be safe to use and must therefore be thoroughly tested. This report looks at the specific use case of emergency braking, using three different emergency braking strategies and two models of communication. A cooperative emergency braking strategy, a naive emergency braking strategy, and the combination of both are tested. It is found that the combination of both performs best.
A platoon, or road train, has many benefits such as fuel efficiency and road utilisation. However, it must also be safe to use and therefore thoroughly tested. This report considers the specific use case of emergency braking using three different emergency braking strategies and two models of communication. A cooperative emergency braking strategy, a naive emergency braking strategy, and the combination of the two are tested. The result is that the combination of the two performs best.
APA, Harvard, Vancouver, ISO, and other styles
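The gap between the naive and cooperative strategies in this abstract comes down to how reaction delays propagate: sensing a predecessor's brake lights cascades the delay vehicle by vehicle, while a broadcast V2V message reaches every follower at once. The back-of-envelope sketch below illustrates that effect; all speeds, delays, and decelerations are assumptions, not the thesis's simulation parameters, and the combined strategy the thesis finds best is not modelled.

```python
# Back-of-envelope comparison of naive vs cooperative emergency braking in a
# platoon. All numbers are assumptions for illustration only.

V = 25.0           # initial speed of every vehicle, m/s (90 km/h)
A_BRAKE = 6.0      # braking deceleration, m/s^2
SENSE_DELAY = 0.5  # naive: s to react to the predecessor's brake lights
COMM_DELAY = 0.1   # cooperative: latency of one broadcast V2V message

def brake_onset(position: int, cascaded: bool) -> float:
    """Seconds after the leader brakes until this vehicle starts braking."""
    if position == 0:
        return 0.0  # the lead vehicle initiates the emergency stop
    if cascaded:
        # Naive: each follower reacts to the vehicle directly ahead,
        # so the reaction delay accumulates down the platoon.
        return position * SENSE_DELAY
    # Cooperative: a single broadcast reaches every follower at once.
    return COMM_DELAY

def stopping_distance(onset: float) -> float:
    """Distance travelled from the leader's brake onset to standstill."""
    return V * onset + V * V / (2 * A_BRAKE)

for i in range(4):  # positions 0 (leader) to 3
    naive = stopping_distance(brake_onset(i, cascaded=True))
    coop = stopping_distance(brake_onset(i, cascaded=False))
    print(f"vehicle {i}: naive {naive:6.1f} m, cooperative {coop:6.1f} m")
```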