Dissertations / Theses on the topic 'Software engineering Research Methodology'

Consult the top 50 dissertations / theses for your research on the topic 'Software engineering Research Methodology.'

Next to every source in the list of references there is an 'Add to bibliography' button. Press it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Rönkkö, Kari. "Making Methods Work in Software Engineering : Method Deployment - as a Social Achievement." Doctoral thesis, Ronneby : Blekinge Institute of Technology, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-00264.

Full text
Abstract:
The software engineering community is concerned with improving existing methods and developing new and better ones. The research approaches applied to this challenge have hitherto focused heavily on the formal and specifying aspects of methods, and for good reasons, because formalizations are the means by which software projects predict, plan, and regulate development efforts. As formalizations have been successfully developed, new challenges have been recognized. The human and social role in software development has been identified as the next area that needs to be addressed; organizational problems need to be solved if continued progress is to be made in the field. The social element is today a little explored area in software engineering. With the increased interest in the social element, a need has been identified for new research approaches suitable for the study of human behaviour. The one-sided focus on formalizations has had the consequence that the concepts and explanation models available in the community are equally one-sided in method discourses. The definition of method is little explored in the software engineering community, and in relation to the identified definitions of method the social appears blurred. Today the software engineering community lacks powerful concepts and explanation models for the social element. This thesis approaches the understanding of the social element in software engineering by applying ethnomethodologically informed ethnography and ethnography, and demonstrates how ethnographic inquiry contributes to software engineering. Ethnography is also combined with an industrial cooperative method development approach. The results presented demonstrate how external and internal socio-political contingencies in industry both hindered a method implementation and accomplished what the method was targeted to do. It is also shown how project members' method deployment, as a social achievement, plays out in practice. In relation to this latter contribution, a conceptual apparatus and explanation model borrowed from social science is provided: the documentary method of interpretation. This model addresses core features of the social element from a natural-language point of view that are important in method engineering, and it provides a coherent complement to an existing method definition that emphasizes formalizations. This explanation model also constituted the methodological underpinning that made the concrete study results possible.
APA, Harvard, Vancouver, ISO, and other styles
2

Eliote, Yvssa Carneiro Desmots. "Implantação e análise do framework scrum no desenvolvimento da plataforma aberta Nosso Exercício." UFVJM, 2018. http://acervo.ufvjm.edu.br/jspui/handle/1/1817.

Full text
Abstract:
This study proposes the implementation and analysis of the Scrum framework in the development of new functionality for the Nosso Exercício (Our Exercise) website. This web application is one of the projects of the Tutorial Education Program, PET-UFVJM/Mucuri Campus, located in the city of Teófilo Otoni, Minas Gerais, and its purpose is the open sharing of teaching exercises from several knowledge areas. At the request of its creators, new functionality was demanded for Nosso Exercício; however, the project had no clear goals or work plan, the requirements were not formalised, and the available team was not mature in the technologies used. Considering the challenge of producing good-quality software and the time available for this research, the use of a software engineering process was proposed in order to gain greater control over, and quality in, the final product to be developed. The agile method Scrum was chosen to manage the development activities for this software. The general goal of the research was therefore to answer the following question: which benefits and/or difficulties can be obtained through the application of the Scrum framework in the continued development of the open platform Nosso Exercício? Following the model suggested by Coughlan and Coghlan (2002), the action research method was used to describe the dynamics of this work. The implementation of Scrum in Nosso Exercício took place in two stages: the first provided training on Scrum and on the technological tools used in the development of Nosso Exercício, and the second dealt with the development of the new website functionality. The analysis of the results showed several benefits from adopting Scrum in this project: the focus and commitment of the Team during its work, the attention to the real needs of the customer (Product Owner), the flexibility of the framework in adjusting to the environment and working conditions of each Team created, and the continuous learning about the software and the process resulting from the discussions held in the Planning, Review and Retrospective meetings. The critical point of the work was the team's immaturity with the technologies used for the website development. The lessons learned from this research indicate that it is possible to obtain benefits from adopting the Scrum framework that outweigh its difficulties, provided that a proper analysis is made of the environment in which it is adopted.
Dissertation (Professional Master's), Programa de Pós-Graduação em Tecnologia, Saúde e Sociedade, Universidade Federal dos Vales do Jequitinhonha e Mucuri, 2018.
APA, Harvard, Vancouver, ISO, and other styles
3

Vũ, John Huân. "Software Internationalization: A Framework Validated Against Industry Requirements for Computer Science and Software Engineering Programs." DigitalCommons@CalPoly, 2010. https://digitalcommons.calpoly.edu/theses/248.

Full text
Abstract:
View John Huân Vũ's thesis presentation at http://youtu.be/y3bzNmkTr-c. In 2001, the ACM and IEEE Computing Curriculum stated that it was necessary to address "the need to develop implementation models that are international in scope and could be practiced in universities around the world." With increasing connectivity through the internet, the move towards a global economy and growing use of technology places software internationalization as a more important concern for developers. However, there has been a "clear shortage in terms of numbers of trained persons applying for entry-level positions" in this area. Eric Brechner, Director of Microsoft Development Training, suggested five new courses to add to the computer science curriculum due to the growing "gap between what college graduates in any field are taught and what they need to know to work in industry." He concludes that "globalization and accessibility should be part of any course of introductory programming," stating: A course on globalization and accessibility is long overdue on college campuses. It is embarrassing to take graduates from a college with a diverse student population and have to teach them how to write software for a diverse set of customers. This should be part of introductory software development. Anything less is insulting to students, their family, and the peoples of the world. There is very little research into how the subject of software internationalization should be taught to meet the major requirements of the industry. The research question of the thesis is thus, "Is there a framework for software internationalization that has been validated against industry requirements?" The answer is no. The framework "would promote communication between academia and industry ... that could serve as a common reference point in discussions." Since no such framework for software internationalization currently exists, one will be developed here. The contribution of this thesis includes a provisional framework to prepare graduates to internationalize software and a validation of the framework against industry requirements. The requirement of this framework is to provide a portable and standardized set of requirements for computer science and software engineering programs to teach future graduates.
APA, Harvard, Vancouver, ISO, and other styles
4

Vat, Kam Hou. "REALSpace AKE : an appreciative knowledge environment architected through soft systems methodology and scenario-based design." Thesis, University of Macau, 2011. http://umaclib3.umac.mo/record=b2492481.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Perez, Luis G. "Development of a Methodology that Couples Satellite Remote Sensing Measurements to Spatial-Temporal Distribution of Soil Moisture in the Vadose Zone of the Everglades National Park." FIU Digital Commons, 2014. http://digitalcommons.fiu.edu/etd/1663.

Full text
Abstract:
Spatial-temporal distribution of soil moisture in the vadose zone is an important aspect of the hydrological cycle that plays a fundamental role in water resources management, including modeling of water flow and mass transport. The vadose zone is a critical transfer and storage compartment, which controls the partitioning of energy and mass linked to surface runoff, evapotranspiration and infiltration. This dissertation focuses on integrating hydraulic characterization methods with remote sensing technologies to estimate the soil moisture distribution by modeling the spatial coverage of soil moisture in the horizontal and vertical dimensions with high temporal resolution. The methodology consists of using satellite images with an ultrafine 3-m resolution to estimate soil surface moisture content, which is used as a top boundary condition in the hydrologic model SWAP to simulate transport of water in the vadose zone. To demonstrate the methodology developed herein, a number of model simulations were performed to forecast a range of possible moisture distributions in the Everglades National Park (ENP) vadose zone. Intensive field and laboratory experiments were necessary to prepare an area of interest (AOI) and characterize the soils, and a framework was developed on the ArcGIS platform for organizing and processing data, applying a simple sequential data approach in conjunction with SWAP. An error difference of 3.6% was achieved when comparing the radar backscatter coefficient (σ0) to surface Volumetric Water Content (VWC); this result was superior to the 6.1% obtained by Piles during a 2009 NASA SPAM campaign. A registration error (RMSE) of 4% was obtained between model and observations. These results confirmed the potential use of SWAP to simulate transport of water in the vadose zone of the ENP. Future work in the ENP must incorporate the use of preferential flow given the great impact of macropores on water and solute transport through the vadose zone. Among other recommendations, there is a need to develop procedures for measuring the ENP peat shrinkage characteristics due to changes in moisture content in support of the enhanced modeling of soil moisture distribution.
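A minimal sketch (Python with NumPy, hypothetical numbers) of the kind of agreement metric reported above, computing an RMSE between SWAP-simulated and observed volumetric water content:

```python
import numpy as np

# Hypothetical observed vs. SWAP-simulated volumetric water content (fractions).
observed = np.array([0.31, 0.28, 0.35, 0.40, 0.33])
simulated = np.array([0.29, 0.30, 0.33, 0.44, 0.36])

# Root-mean-square error, expressed in percentage points of VWC.
rmse = np.sqrt(np.mean((simulated - observed) ** 2)) * 100
print(f"RMSE: {rmse:.1f}% VWC")
```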
APA, Harvard, Vancouver, ISO, and other styles
6

Naseem, Junaid, and Wasim Tahir. "Study and analysis of the challenges and guidelines of transitioning from waterfall development model to Scrum." Thesis, Blekinge Tekniska Högskola, Avdelningen för programvarusystem, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-2679.

Full text
Abstract:
Software engineering practices have experienced significant changes over the past two decades. In view of current competitive market trends, it is high time for many organizations to shift from traditional waterfall models to more agile technologies like Scrum [22][23]. A change of this magnitude is often not easy to undertake. Because the two software engineering approaches differ in many respects, organizations require a considerable amount of analysis of the whole transitioning process and of the possible scenarios that may occur along the way. Small and medium organizations are normally very skeptical about a change of this magnitude. The scale of change is not limited to software processes; in fact, the difficult part is dealing with old attitudes and thinking processes and moulding them for the new agile, Scrum-based development. The process of change therefore needs to be understood in the first place and then carefully carried forward to the implementation phase.
APA, Harvard, Vancouver, ISO, and other styles
7

Redfearn, Brady Edwin. "User Experience Engineering Adoption and Practice: A Longitudinal Case Study." BYU ScholarsArchive, 2013. https://scholarsarchive.byu.edu/etd/3762.

Full text
Abstract:
User Experience Engineering (UxE) incorporates subject areas like usability, HCI, interaction experience, interaction design, human factors, ergonomics, cognitive psychology, behavioral psychology, psychometrics, systems engineering, and computer science (Hartson, 1998). It has been suggested that UxE will be the main success factor in organizations as we enter the "loyalty decade" of software development, where the repeat usage of a product by a single customer will be the metric of product success (Alghamdi, 2010; Law & van Schaik, 2010, p. 313; Nielsen, 2008; Van Schaik & Ling, 2011). What is relatively unknown in the current academic literature is whether existing UxE methodologies are effective or not when placed in a longitudinal research context (Law & van Schaik, 2010). There is room for the exploration of the effects of long-term UxE practices in a real-world case study scenario. The problem addressed in this study is that a lack of UxE-related processes and practices at an industrial partner had resulted in customer dissatisfaction and a loss of market share. A three-year case study was performed during which 10 UxE-related metrics were gathered and analyzed to measure the improvements in the design of the customer's experience that long-term UxE practices could bring to a small corporate enterprise. The changes that occurred from the corporate and customer's points of view were analyzed as the customer's experience evolved throughout this long-term UxE study. Finally, an analysis of the problems and issues that arose in the implementation of UxE principles during the application of long-term UxE processes was performed. First-hand training between the research team and company employees proved essential to the success of this project. Although a long-term UxE process was difficult to implement within the existing development practices of the industrial partner, a dramatic increase in customer satisfaction and customer engagement with the company system was found. UxE processes led to increased sales rates and decreased development costs in the long term. All 10 metrics gathered throughout this study showed measurable improvements after long-term UxE processes and practices were adopted by the industrial partner.
APA, Harvard, Vancouver, ISO, and other styles
8

McMeekin, David Andrew. "A software inspection methodology for cognitive improvement in software engineering." Thesis, Curtin University, 2010. http://hdl.handle.net/20.500.11937/400.

Full text
Abstract:
This thesis examines a non-traditional application of software inspections by examining the cognitive levels developers demonstrate while carrying out software inspection tasks. These levels are examined in order to help increase developers' ability to understand, maintain and evolve software systems. The results from several empirical studies are presented. These indicate several important findings: student software developers find structured reading techniques more helpful as an aid than less structured reading techniques, while professional developers find the more structured techniques do not allow their experience to be applied to the problem at hand; there is a correlation between the effectiveness of a software inspection and an inspector's ability to successfully add new functionality to the inspected software artefact; and the cognitive levels at which student developers functioned while carrying out software inspection tasks were at higher orders of thinking when structured inspection techniques were implemented than when unstructured techniques were applied. From the empirical results a mapping has been created of several software inspection techniques to the cognitive process models they support and the cognitive levels, measured using Bloom's Taxonomy, that they facilitate. This mapping helps in understanding the impact that carrying out a software inspection has upon a developer's cognitive understanding of the inspected system. The knowledge and understanding gained from these findings have culminated in the creation of a code reading methodology to increase the cognitive level at which software developers operate while reading software code. The reading methodology distinguishes where in undergraduate and software developer training courses different software inspection reading techniques are to be introduced in order to maximise a software developer's code reading ability according to their experience level.
APA, Harvard, Vancouver, ISO, and other styles
9

King, Myron Decker. "A methodology for hardware-software codesign." Thesis, Massachusetts Institute of Technology, 2013. http://hdl.handle.net/1721.1/84891.

Full text
Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2013.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 150-156).
Special purpose hardware is vital to embedded systems as it can simultaneously improve performance while reducing power consumption. The integration of special purpose hardware into applications running in software is difficult for a number of reasons. Some of the difficulty is due to the difference between the models used to program hardware and software, but great effort is also required to coordinate the simultaneous execution of the application running on the microprocessor with the accelerated kernel(s) running in hardware. To further compound the problem, current design methodologies for embedded applications require an early determination of the design partitioning which allows hardware and software to be developed simultaneously, each adhering to a rigid interface contract. This approach is problematic because often a good hardware-software decomposition is not known until deep into the design process. Fixed interfaces and the burden of reimplementation prevent the migration of functionality motivated by repartitioning. This thesis presents a two-part solution to the integration of special purpose hardware into applications running in software. The first part addresses the problem of generating infrastructure for hardware-accelerated applications. We present a methodology in which the application is represented as a dataflow graph and the computation at each node is specified for execution either in software or as specialized hardware using the programmer's language of choice. An interface compiler has been implemented which takes as input the FIFO edges of the graph and generates code to connect all the different parts of the program, including those which communicate across the hardware/software boundary. This methodology, which we demonstrate on an FPGA platform, enables programmers to effectively exploit hardware acceleration without ever leaving the application space. The second part of this thesis presents an implementation of the Bluespec Codesign Language (BCL) to address the difficulty of experimenting with hardware/software partitioning alternatives. Based on guarded atomic actions, BCL can be used to specify both hardware and low-level software. BCL is based on Bluespec SystemVerilog (BSV), for which a hardware compiler by Bluespec Inc. is commercially available, and has been augmented with extensions to support more efficient software generation. In BCL, the programmer specifies the entire design, including the partitioning, allowing the compiler to synthesize efficient software and hardware, along with transactors for communication between the partitions. The benefit of using a single language to express the entire design is that a programmer can easily experiment with many different hardware/software decompositions without needing to re-write the application code. Used together, the BCL and interface compilers represent a comprehensive solution to the task of integrating specialized hardware into an application.
by Myron King.
Ph.D.
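The partition-and-connect idea described in the abstract can be illustrated with a small sketch (Python, hypothetical node and edge names; the actual BCL and interface compiler tooling is not shown): nodes of a dataflow graph are assigned to hardware or software, and any FIFO edge that crosses the boundary is where a hardware/software transactor would be generated.

```python
# Sketch of an application as a dataflow graph whose nodes are assigned to
# "hw" or "sw"; edges that cross the partition need a hardware/software
# transactor, the rest become ordinary in-partition FIFOs.
nodes = {"capture": "sw", "filter": "hw", "encode": "hw", "store": "sw"}
edges = [("capture", "filter"), ("filter", "encode"), ("encode", "store")]

for src, dst in edges:
    crosses = nodes[src] != nodes[dst]
    kind = "hw/sw transactor FIFO" if crosses else f"{nodes[src]}-side FIFO"
    print(f"{src} -> {dst}: generate {kind}")
```

Under this view, repartitioning amounts to changing the node-to-partition map rather than rewriting application code, which is the benefit the abstract attributes to describing the whole design in a single language.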
APA, Harvard, Vancouver, ISO, and other styles
10

Moradian, Esmiralda. "Integrating Security in Software Engineering Process: The CSEP Methodology." Doctoral thesis, KTH, Programvaruteknik och Datorsystem, SCS, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-95393.

Full text
Abstract:
In today's organizations, a vast amount of existing software systems is insecure, which results in compromised valuable assets and has negative consequences for the organizations. Throughout the years, many attempts have been made to build secure software systems, but the solutions proposed were limited to a few add-on fixes made after implementation and installation of the system. The contribution of the research in this thesis is a software security engineering methodology, called the Controlled Security Engineering Process, which provides support to developers when developing more secure software systems by integrating the software lifecycle and the security lifecycle, and by enhancing control in the engineering process. The proposed methodology implements security in every phase of general software system engineering, i.e., requirements, design, implementation, and testing, as well as operation and maintenance, to certify that software systems are built with security in mind. The Controlled Security Engineering Process methodology addresses security problems in the development lifecycle. Construction of a secure software system involves specific steps and activities, which include security requirements specifications of system behavior, secure software design, an analysis of the design, implementation with secure coding and integration, and operating and maintenance procedures. The methodology incorporates software security patterns and control of the engineering process. The software security patterns can be used as security controls and information sources to demonstrate how a specific security task should be performed or a specific security problem solved. Many patterns can be implemented in an automated way, which can facilitate the work of software engineers. The control of the engineering process provides visibility over the development process. The control assures that authorised developers access legitimate and necessary information and project documents by using authentication and authorization. To support the implementation of automated patterns and provide control over the engineering process, a design of a multi-agent system is provided. The multi-agent system supports the implementation of patterns and the extraction of security information, and provides traceability in the engineering process. The security information comprises requirements, threats and security mechanisms obtained by matching project documents, and traceability is achieved by monitoring and logging services. The Controlled Security Engineering Process methodology has been evaluated through interviews with developers, security professionals, and decision makers in different types of organizations, and through a case study carried out in an organization.

APA, Harvard, Vancouver, ISO, and other styles
11

Ramsin, Raman. "The engineering of an object-oriented software development methodology." Thesis, University of York, 2006. http://etheses.whiterose.ac.uk/9898/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Sadrieh, Afshin. "Applied novel software development methodology for process engineering application." PhD thesis, Murdoch University, 2017. https://researchrepository.murdoch.edu.au/id/eprint/38742/.

Full text
Abstract:
Chemical processes are nonlinear continuous/discrete dynamic systems that are subject to considerable uncertainties and variations during their design and operation. These systems are designed to operate at an economically optimal steady state. However, minor changes in process parameter values might cause deviations and elicit dynamic responses from processes. Controllability, defined as the ability to hold a process within a specified operating regime, and the controllability assessment of each given process system should be taken into account during the system design phase. This emphasises the necessity of effective software tools that could assist process engineers in their controllability evaluation. Although there are a few multipurpose tools available for this task, developing software tools for controllability analysis is a tedious and sophisticated undertaking. It involves elaboration from multiple disciplines, and the requirements of controllability assessments are so vast that it is almost impossible to create general software that covers all controllability measures and cases. This thesis aims to systematically tackle the challenge of developing practical and high-quality software tools for controllability problems while reducing the required time and effort, regardless of the size and scale of the controllability problem. Domain-specific language (DSL) methodology is proposed for this purpose. DSLs are programming languages designed to address the programming problems of a specific domain. Therefore, well-designed DSLs are simple, easy to use and capable of solving any problem defined in their domains. Based on DSL methodology, this study proposes a four-element framework to partition the software system into decoupled elements, and discusses the design and implementation steps of each element as well as communication between elements. The DSL-based methodology is compared with traditional programming techniques for the controllability assessment of various case studies. Essentially, the major advantage of the proposed methodology is the performance of the software product. Performance measures used in this study are the total time to develop (TD) the software tool and its modifiability. The total time and effort to implement and use the resulting products show up to a five-fold improvement. Moreover, the resulting products' modifiability is assessed by applying modifications, which also demonstrates up to a five-fold improvement. All measures are tested on continuous stirred-tank reactor (CSTR) and forced-circulation evaporator (FCE) case studies. In conclusion, this study significantly contributes to two fields. The first is DSL, since this thesis studies different types of DSLs and evaluates their applications in controllability analysis. The second is controllability evaluation, since this study examines a new methodology for software development in controllability assessment.
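As a rough illustration of the DSL idea, the sketch below (Python, entirely hypothetical names; not the author's tooling) shows how a thin domain layer could expose process-engineering vocabulary while hiding the general-purpose code behind it:

```python
# Hypothetical embedded DSL: the domain layer speaks in controlled and
# manipulated variables, and a trivial structural check stands in for a
# real controllability measure.
class Process:
    def __init__(self, name):
        self.name, self.controlled, self.manipulated = name, [], []

    def controls(self, *variables):
        self.controlled += variables
        return self

    def manipulates(self, *variables):
        self.manipulated += variables
        return self

    def is_square(self):
        # Square systems (equal numbers of controlled and manipulated
        # variables) are the usual starting point for pairing analysis.
        return len(self.controlled) == len(self.manipulated)

cstr = (Process("CSTR")
        .controls("temperature", "level")
        .manipulates("coolant_flow", "outlet_flow"))
print(cstr.name, "square pairing possible:", cstr.is_square())
```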
APA, Harvard, Vancouver, ISO, and other styles
13

Collins, Tamar L. "A methodology for engineering neural network systems." Thesis, University of Exeter, 1997. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.284620.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

Moritz, Evan Alexander. "Tracelab: Reproducing Empirical Software Engineering Research." W&M ScholarWorks, 2013. https://scholarworks.wm.edu/etd/1539626949.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Al-Azawi, Rula K. "Agent Oriented Software Engineering (AOSE) approach to game development methodology." Thesis, De Montfort University, 2015. http://hdl.handle.net/2086/11120.

Full text
Abstract:
This thesis investigates existing game development methodologies through the process of researching game and system development models. The results indicate that these methodologies are engineered to solve specific problems, and most are suitable only for specific game genres. Different approaches to building games have been proposed in recent years; however, most of these methodologies focus on the design and implementation phase. This research aims to enhance game development methodologies by proposing a novel game development methodology with the ability to function across generic game genres, thereby guiding game developers and designers from the start of the game development phase to the end of the implementation and testing phase. On a positive note, aligning development practice with universal standards makes it far easier to incorporate extra team members at short notice and increases confidence when working in the same environment as super developers. In the gaming industry, most game development proceeds directly from game design to the implementation phase, and the researcher observes that this is the only industry in which this occurs; it is a consequence of the game industry's failure to integrate with modern development techniques. The ultimate aim of this research is to apply a new game development methodology using most game elements to enhance success. This development model aligns with different game genres and resolves the gap between industry and the research community, so that game developers can focus on the important business of creating games. The primary aim of the Agent Oriented Agile Base (AOAB) game development methodology is to present game development techniques in sequential steps to facilitate game creation and close the gap in existing game development methodologies. Agent technology is used in complex domains such as e-commerce, health, manufacturing, games, etc. In this thesis we are interested in the game domain, which comprises a unique set of characteristics such as automata, collaboration, etc. AOAB is based on a predictive approach, adapted from the MaSE methodology, and an adaptive approach using Agile methodology. To ensure proof of concept, AOAB game development methodology is evaluated against industry principles, providing an industry case study to create a driving test game, which was the problem motivating this research. Furthermore, we conducted two workshops to introduce our methodology to both academic and industry participants. Finally, we prepared an academic experiment to use AOAB in the academic sector. We have analyzed the feedback and comments and concluded the strengths and weaknesses of the AOAB methodology. The research achievements are summarized and proposals for future work outlined.
APA, Harvard, Vancouver, ISO, and other styles
16

Tran, Quynh Nhu. "MOBMAS - A methodology for ontology-based multi-agent systems development." Awarded by: University of New South Wales, School of Information Systems, Technology and Management, 2005. http://handle.unsw.edu.au/1959.4/24254.

Full text
Abstract:
"Agent-based systems are one of the most vibrant and important areas of research and development to have emerged in information technology in the 1990s" (Luck et al. 2003). The use of agents as a metaphor for designing and constructing software systems represents an innovative movement in the field of software engineering: "Agent-Oriented Software Engineering (AOSE)" (Lind 2000; Luck et al. 2003). This research contributes to the evolution of AOSE by proposing a comprehensive ontology-based methodology for the analysis and design of Multi-Agent Systems (MAS). The methodology is named MOBMAS, which stands for "Methodology for Ontology-Based MASs". A major improvement of MOBMAS over the existing agent-oriented MAS development methodologies is its explicit and extensive support for ontology-based MAS development. Ontologies have been widely acknowledged for their significant benefits to interoperability, reusability, MAS development activities (such as system analysis and agent knowledge modelling) and MAS operation (such as agent communication and reasoning). Recognising these desirable benefits of ontologies, MOBMAS endeavours to identify and implement the various ways in which ontologies can be used in the MAS development process and integrated into the MAS model definitions. In so doing, MOBMAS has exploited ontologies to enhance its MAS development process and MAS development product with various strengths. These strengths include the benefits of ontologies listed above, and additional benefits uncovered by MOBMAS, e.g. support for verification and validation, extendibility, maintainability and reliability. Compared to the numerous existing agent-oriented methodologies, MOBMAS is the first that explicitly and extensively investigates the diverse potential advantages of ontologies in MAS development, and which is able to implement these potential advantages via an ontology-based MAS development process and a set of ontology-based MAS model definitions. Another major contribution of MOBMAS to the field of AOSE is its ability to address all key concerns of MAS development in one methodological framework. The methodology provides support for a comprehensive list of methodological requirements, which are important to agent-oriented analysis and design, but which may not be well-supported by the current methodologies. These methodological requirements were identified and validated by this research from three sources: the existing agent-oriented methodologies, the existing evaluation frameworks for agent-oriented methodologies and conventional system development methodologies, and a survey of practitioners and researchers in the field of AOSE. MOBMAS supports the identified methodological requirements by combining the strengths of the existing agent-oriented methodologies (i.e. by reusing and enhancing the various strong techniques and model definitions of the existing methodologies where appropriate), and by proposing new techniques and model definitions where necessary. The process of developing MOBMAS consisted of three sequential research activities. The first activity identified and validated a list of methodological requirements for an Agent Oriented Software Engineering methodology as mentioned above. The second research activity developed MOBMAS by specifying a development process, a set of techniques and a set of model definitions for supporting the identified methodological requirements. The final research activity evaluated and refined MOBMAS by collecting expert reviews on the methodology, using the methodology on an application and conducting a feature analysis of the methodology.
APA, Harvard, Vancouver, ISO, and other styles
17

Galal, Galal Hassan. "An interpretive approach to information systems engineering (using the grounded systems engineering methodology)." Thesis, Brunel University, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.285051.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Chen, Xin. "Towards an integrated methodology for the development of hybrid information systems." Thesis, University of Sunderland, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.263000.

Full text
Abstract:
Our modern information society has produced many sophisticated requirements for the development of information systems. A new challenge is the study of hybrid information systems that combine traditional information systems with knowledge-based systems. This new generation of information systems is considerably more powerful than a simple extrapolation of existing system concepts. It is easy to imagine the advantages of powerful knowledge-based systems with efficient access to several large databases, and of large traditional information systems with added intelligence. Due to the complex nature of hybrid information systems, it is unrealistic to expect that they can be developed using one standard method. The use of several independently developed methods has a number of drawbacks, such as inconsistency, redundancy, the amount of effort required and possible loss of information. In an attempt to provide at least a partial solution to this problem, this thesis describes a new integrated methodology for developing hybrid information systems. This methodology combines the method for developing traditional information systems with the method for developing knowledge-based systems. The new methodology provides a hybrid lifecycle process model to combine the conventional waterfall process with rapid prototyping and model-based approaches. The proposed methodology integrates four existing methods using two integration approaches: intra-process and inter-process. In the requirements analysis phase, a structured method is applied to function analysis, an information modelling method is applied to data analysis, and a knowledge acquisition method is applied to knowledge analysis. An intra-process approach is then used to integrate these techniques using consistency rules. In the design phase, the new methodology uses an inter-process approach to transform the requirements analysis into an object-oriented design by means of a transformation algorithm. Finally, an object-oriented method is applied to the design and implementation of hybrid information systems. Using the new methodology, a hybrid medical information system for dizziness (HMISD) was developed, which combines components of traditional medical information systems with components of medical expert systems. The construction and development of this software are described in detail. The system can support activities in hospitals including registration, diagnosis, investigations, drug management and clinical research. It provides assistance to hospital doctors and general practitioners. The performance of HMISD is evaluated by testing ninety-three real patient cases and by carrying out two investigations with medical staff and patients. The evaluation results show that HMISD is of good quality and that most of its users are satisfied. Three approaches are used to evaluate the proposed methodology: analysis of the development of HMISD, comparison with existing methodologies using CMD, and expert evaluations. The evaluation conclusions indicate that this new integrated methodology can take advantage of the four existing methods and also remove some of the limitations of each individual method. It is applicable to the development of traditional information systems, knowledge-based systems, and large and complex hybrid information systems.
APA, Harvard, Vancouver, ISO, and other styles
19

Sulemani, Kashif Ali, and Muhammad Nadeem Nasir. "Communication Support to Scrum Methodology in Offshore Development." Thesis, Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-2815.

Full text
Abstract:
In today's world, software companies span continents: software development work crosses borders and distributed teams work together from different locations. The agile development methodology demands close collaboration with the client, rapid requirements change and iterative development in fixed-length cycles. When this agile approach is applied in a distributed project, it requires frequent communication and knowledge exchange among the dispersed team members and collaboration with the customer over distance. Besides the geographical, linguistic and time-zone barriers in a distributed project, computer-mediated tools are supposed to be useful media for connecting and coordinating dispersed colleagues in a project. Although these tools show varying effectiveness in communication exchange, their efficient use connects people at the two sites. The aim of this thesis study is to explore how communication channels support Scrum practices in a distributed project. The purpose of the research is to analyse collaboration and communication in distributed teams working together through computer-mediated technology. For this purpose, a company case of a distributed agile project is studied. The authors examine communication and information exchange through ICT in an agile project. Based on the case analysis, the authors suggest recommendations for implementing and establishing agile practices in a distributed project.
APA, Harvard, Vancouver, ISO, and other styles
20

Deshpande, Shweta. "A Study Of Software Engineering Practices for Micro Teams." The Ohio State University, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=osu1299620089.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Clegg, Ben. "A systems approach to reengineering business processes towards concurrent engineering principles." Thesis, De Montfort University, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.391643.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

McKay, Everett Norcross. "A methodology for software implemented transient error recovery in spacecraft computation." Thesis, Massachusetts Institute of Technology, 1985. http://hdl.handle.net/1721.1/81503.

Full text
Abstract:
Thesis (M.S.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1985.
MICROFICHE COPY AVAILABLE IN ARCHIVES AND ENGINEERING.
Bibliography: leaves 136-137.
by Everett Norcross McKay.
M.S.
APA, Harvard, Vancouver, ISO, and other styles
23

Crawford, Ivan D. "A methodology for incorporating HCI requirements into CASE." Thesis, University of Ulster, 1997. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.245499.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Harris, Chester A. "Distributed computing configuration: A combined user, software, and hardware model and analysis methodology." The Ohio State University, 1997. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487946103568637.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Hu, Xiaolin. "A simulation-based software development methodology for distributed real-time systems." Diss., The University of Arizona, 2004. http://hdl.handle.net/10150/280514.

Full text
Abstract:
Powered by the rapid advance of computer, network, and sensor/actuator technologies, distributed real-time systems that continually and autonomously control and react to the environment have been widely used. The combination of temporal requirements, concurrent environmental entities, and high reliability requirements, together with distributed processing make the software to control these systems extremely hard to design and difficult to verify. In this work, we developed a simulation-based software development methodology to manage the complexity of distributed real-time software. This methodology, based on discrete event system specification (DEVS), overcomes the "incoherence problem" between different design stages by emphasizing "model continuity" through the development process. Specifically, techniques have been developed so that the same control models that are designed can be tested and analyzed by simulation methods and then easily deployed to the distributed target system for execution. To improve the traditional software testing process where real-time embedded software needs to be hooked up with real sensor/actuators and placed in a physical environment for meaningful test and analysis, we developed a virtual test environment that allows software to be effectively tested and analyzed in a virtual environment, using virtual sensor/actuators. Within this environment, stepwise simulation methods have been developed so that different aspects, such as logic and temporal behaviors, of a real-time system can be tested and analyzed incrementally. Based on this methodology, a simulation and testing environment for distributed autonomous robotic systems is developed. This environment has successfully supported the development and investigation of several distributed autonomous robotic systems. One of them is a "dynamic team formation" system in which mobile robots search for each other, and then form a team dynamically through self-organization. Another system is a scalable robot convoy system in which robots convoy and maintain a line formation in a coordinated way.
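A minimal sketch (Python, hypothetical model and names; not the author's DEVS toolkit) of the model-continuity idea: one control model with DEVS-style transition and output functions that a simulator can drive with virtual sensor readings and a real-time executor could later drive with real ones.

```python
# A DEVS-flavored atomic model: the same object can be stepped by a simulator
# (virtual time, virtual sensors) or by a real-time executor (wall-clock time,
# real sensors), which is the essence of model continuity.
class ObstacleAvoider:
    def __init__(self):
        self.state = "cruise"
        self.sigma = 0.1  # DEVS time advance in seconds (unused in this trace-driven demo)

    def ext_transition(self, sensor_reading):
        # React to an external event: turn away when an obstacle is close.
        self.state = "turn" if sensor_reading < 0.5 else "cruise"

    def output(self):
        return {"cruise": "forward", "turn": "rotate_left"}[self.state]

def simulate(model, readings):
    for reading in readings:          # virtual sensor trace
        model.ext_transition(reading)
        print(model.output())

simulate(ObstacleAvoider(), [0.9, 0.4, 0.8])
```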
APA, Harvard, Vancouver, ISO, and other styles
26

Schilling, Walter William Jr. "A Cost Effective Methodology for Quantitative Evaluation of Software Reliability using Static Analysis." University of Toledo / OhioLINK, 2007. http://rave.ohiolink.edu/etdc/view?acc_num=toledo1189820658.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Acar, Hayri. "Software development methodology in a Green IT environment." Thesis, Lyon, 2017. http://www.theses.fr/2017LYSE1256/document.

Full text
Abstract:
The number of mobile devices (smartphones, tablets, laptops, etc.) and Internet users is continually increasing. Due to the accessibility provided by cloud computing, the Internet and the Internet of Things (IoT), users rely on more and more software applications, which causes an increasing effect on greenhouse gas emissions. ICT (Information and Communication Technologies) is thus responsible for around 2% of worldwide greenhouse gas emissions, which is equivalent to that emitted by the airline industry. According to recent reports of the Intergovernmental Panel on Climate Change (IPCC), CO2 emissions due to ICT are increasing rapidly. Nevertheless, by helping to solve complex problems in other sectors, ICT can greatly and easily contribute to reducing a significant portion of the remaining 98% of global CO2 emissions. The use of software implies hardware operations which are physically responsible for energy consumption; software is therefore indirectly involved in energy consumption. Thus, we need to reduce software energy consumption while maintaining the same functionality in order to build sustainable and green software. Firstly, in this thesis work, we define the terms sustainable and green in the area of software development. To build a software product, we need to follow a software engineering process. Hence, we define and describe sustainable and green criteria to be respected after each step of this process in order to establish a sustainable and green software engineering process. We then focus on the estimation of software energy consumption. Many research works have proposed tools to estimate the energy consumption due to software in order to reduce the carbon footprint. Unfortunately, these studies, in the majority of cases, consider only the CPU and neglect all other components. Existing power consumption methodologies need to be improved by taking into account more components susceptible to consume energy during the runtime of an application. Writing sustainable, power-efficient and green software requires understanding the power consumption behavior of a computer program; one of the benefits is that developers, by improving their source code implementations, will optimize software power consumption. Moreover, there is a lack of analysis tools to dynamically monitor the energy consumption of source code across several components. Thus, we propose GMTEEC (Generic Methodology of a Tool to Estimate Energy Consumption), which is composed of four layers assisting developers in building a tool that estimates software power consumption. Hence, in our work, respecting the layers of GMTEEC, we develop TEEC (Tool to Estimate Energy Consumption), which is based on a mathematical formula established for each component (CPU, memory, hard disk, network) susceptible to consume energy, so that the total software energy consumption is estimated as the sum of the per-component consumptions. Moreover, we add to TEEC the capacity to dynamically locate hotspots, the parts of the source code that consume the greatest amount of energy, in order to help and guide developers to optimize their source code and build efficient, sustainable and green software. We performed a variety of experiments to validate the accuracy and quality of the sustainable and green software engineering process and of TEEC. The results demonstrate the possibility of saving a significant quantity of energy and time at limited cost, with an important positive impact on the environment.
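A minimal sketch (Python, invented numbers) of the additive estimate described above, where the total software energy consumption is the sum of the per-component estimates:

```python
# Hypothetical per-component energy estimates (joules) for one run of a program,
# each assumed to come from a component-specific power model integrated over runtime.
component_energy = {
    "cpu": 12.4,
    "memory": 3.1,
    "disk": 1.7,
    "network": 0.9,
}

total = sum(component_energy.values())
print(f"Estimated total software energy: {total:.1f} J")
for name, joules in component_energy.items():
    print(f"  {name:<7} {joules / total:6.1%}")
```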
APA, Harvard, Vancouver, ISO, and other styles
28

Girot, Etienne. "Practical implementation of SCRUM and associated practices." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-100123.

Full text
Abstract:
Even though Scrum is nowadays widely known in the software industry and its theoretical framework is extensively described in the literature, its implementation is far from straightforward. In fact, the literature describes a meta-process that new practitioners must adapt to their project-specific constraints. This practical aspect is crucial, yet rarely tackled. To that end, this thesis work describes the difficulties we faced while putting Scrum into practice and how we resolved them through the study of the project's contextual factors, the insight gained from a Proof of Concept, and the support of a couple of agile practices.
APA, Harvard, Vancouver, ISO, and other styles
29

Sheinbein, Rachel Felice 1975. "Applying supply chain methodology to a centralized software licensing strategy." Thesis, Massachusetts Institute of Technology, 2004. http://hdl.handle.net/1721.1/34781.

Full text
Abstract:
Thesis (M.B.A.)--Massachusetts Institute of Technology, Sloan School of Management; and, (S.M.)--Massachusetts Institute of Technology, Dept. of Civil and Environmental Engineering; in conjunction with the Leaders for Manufacturing Program at MIT, 2004.
Includes bibliographical references (p. 76).
Eleven percent of companies spend between $150K and $200K per year per engineer on software development tools and nine percent spend more than $200K, according to a Silicon Integration Initiative/Gartner/EE Times study from 2002. For Agilent Technologies, these costs result in spending tens of millions of dollars each year on software, and for Motorola, the costs are more than $100M each year. From the current trends in software spending, one can infer that companies will pay even more for software in the future, because the cost of the software itself is rising and because of the complexity of the technology needed for innovation. In order to understand whether the total spending on software is appropriate and necessary, Agilent sponsored this project to create a model that analyzes the trade-offs between the cost of software and the cost of software unavailability. The model treats software licenses as supplies to the development of a product, and thus, supply chain methodologies such as inventory (cost of licenses), stock outs (cost of unavailability) and service level are applied. The goal of the model is to minimize software costs while maintaining a satisfactory level of service. The thesis explains the model and then shows the results from applying it to four software products that Agilent currently uses. The results show that in the absence of this type of analysis, Agilent spends more than necessary for software licenses. In fact, Agilent can reduce costs by at least 5%. This model can be used by Agilent and other companies to optimize software purchases.
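As a rough illustration of the trade-off the model captures, the sketch below balances the annual cost of holding licenses against an expected cost of engineers being blocked when no license is available; the Erlang-B demand model and all cost figures are hypothetical assumptions, not Agilent's data or the thesis's actual model.

# Illustrative sketch of the license-count trade-off: cost of licenses (inventory)
# vs. cost of unavailability (stock outs). All numbers and the Erlang-B demand
# model are assumptions made for this example.
from math import factorial

LICENSE_COST = 20_000          # assumed annual cost per license
BLOCKED_COST_PER_HOUR = 150    # assumed cost of an engineer waiting for a license
HOURS_PER_YEAR = 2_000
MEAN_CONCURRENT_USERS = 12     # assumed average simultaneous demand for the tool

def p_blocked(num_licenses, mean_demand):
    """Erlang-B blocking probability; service level is 1 minus this value."""
    numerator = mean_demand ** num_licenses / factorial(num_licenses)
    denominator = sum(mean_demand ** k / factorial(k) for k in range(num_licenses + 1))
    return numerator / denominator

def total_cost(num_licenses):
    blocking = p_blocked(num_licenses, MEAN_CONCURRENT_USERS)
    unavailability = blocking * MEAN_CONCURRENT_USERS * BLOCKED_COST_PER_HOUR * HOURS_PER_YEAR
    return num_licenses * LICENSE_COST + unavailability

best = min(range(1, 30), key=total_cost)
print("licenses:", best, "annual cost:", round(total_cost(best)))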
by Rachel Felice Sheinbein.
S.M.
M.B.A.
APA, Harvard, Vancouver, ISO, and other styles
30

Gröner, Markus K. "Capturing Requirements Meeting Customer Intent: A Methodological Approach." Diss., Virginia Tech, 2002. http://hdl.handle.net/10919/27857.

Full text
Abstract:
Product quality is directly related to how well that product meets the customer's needs and intents. It is paramount, therefore, to capture customer requirements correctly and succinctly. Unfortunately, most development models tend to avoid, or only vaguely define the process by which requirements are generated. Other models rely on formalistic characterizations that require specialized training to understand. To address such drawbacks we introduce the Requirements Generation Model (RGM) that (a) decomposes the conventional "requirements analysis" phase into sub-phases which focus and refine requirements generation activities, (b) constrains and structures those activities, and (c) incorporates a monitoring methodology to assist in detecting and resolving deviations from process activities defined by the RGM. We present an empirical study of the RGM in an industrial setting, and results derived from this study that substantiate the effectiveness of the RGM in producing a better set of requirements.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
31

Gray, David Philip Harry. "Software defect prediction using static code metrics : formulating a methodology." Thesis, University of Hertfordshire, 2013. http://hdl.handle.net/2299/11067.

Full text
Abstract:
Software defect prediction is motivated by the huge costs incurred as a result of software failures. In an effort to reduce these costs, researchers have been utilising software metrics to try and build predictive models capable of locating the most defect-prone parts of a system. These areas can then be subject to some form of further analysis, such as a manual code review. It is hoped that such defect predictors will enable software to be produced more cost effectively, and/or be of higher quality. In this dissertation I identify many data quality and methodological issues in previous defect prediction studies. The main data source is the NASA Metrics Data Program Repository. The issues discovered with these well-utilised data sets include many examples of seemingly impossible values, and much redundant data. The redundant, or repeated data points are shown to be the cause of potentially serious data mining problems. Other methodological issues discovered include the violation of basic data mining principles, and the misleading reporting of classifier predictive performance. The issues discovered lead to a new proposed methodology for software defect prediction. The methodology is focused around data analysis, as this appears to have been overlooked in many prior studies. The aim of the methodology is to be able to obtain a realistic estimate of potential real-world predictive performance, and also to have simple performance baselines with which to compare against the actual performance achieved. This is important as quantifying predictive performance appropriately is a difficult task. The findings of this dissertation raise questions about the current defect prediction body of knowledge. So many data-related and/or methodological errors have previously occurred that it may now be time to revisit the fundamental aspects of this research area, to determine what we really know, and how we should proceed.
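Two of the methodological points above, removing repeated data points before evaluation and reporting a trivial baseline next to the classifier, can be illustrated with a small sketch; the synthetic metrics, the scikit-learn classifiers and the MCC measure are assumptions chosen for illustration rather than the dissertation's actual pipeline.

# Sketch of two methodological points: drop repeated data points before
# evaluation, and report a simple baseline alongside the classifier.
# Uses scikit-learn and synthetic data purely for illustration.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.dummy import DummyClassifier
from sklearn.metrics import matthews_corrcoef

rng = np.random.default_rng(0)
data = pd.DataFrame({
    "loc": rng.integers(1, 500, 1000),           # static code metrics (made up)
    "cyclomatic": rng.integers(1, 30, 1000),
    "defective": rng.integers(0, 2, 1000),
})
data = pd.concat([data, data.sample(200, random_state=0)])  # inject repeated rows

deduplicated = data.drop_duplicates()            # repeated points would otherwise
                                                 # leak between train and test sets
X, y = deduplicated[["loc", "cyclomatic"]], deduplicated["defective"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
baseline = DummyClassifier(strategy="most_frequent").fit(X_tr, y_tr)

print("model MCC:   ", matthews_corrcoef(y_te, model.predict(X_te)))
print("baseline MCC:", matthews_corrcoef(y_te, baseline.predict(X_te)))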
APA, Harvard, Vancouver, ISO, and other styles
32

Junered, Marcus. "Enabling hardware technology for GNSS software radio research." Licentiate thesis, Luleå : Luleå University of Technology, 2007. http://epubl.ltu.se/1402-1757/2007/32/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Hilal, Daoud Kassem. "The circumstantial occurrence methodology : a proposed way forward in strategic knowledge engineering." Thesis, Cranfield University, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.282178.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Anderson, Patrick William 1976. "A modular framework for reusable research software." Thesis, Massachusetts Institute of Technology, 2000. http://hdl.handle.net/1721.1/81539.

Full text
Abstract:
Thesis (M.Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2000.
Includes bibliographical references (leaves 63-64).
by Patrick William Anderson.
M.Eng.
APA, Harvard, Vancouver, ISO, and other styles
35

Lin, Chia-En. "A Comparison of Agent-Oriented Software Engineering Frameworks and Methodologies." Thesis, University of North Texas, 2003. https://digital.library.unt.edu/ark:/67531/metadc4411/.

Full text
Abstract:
Agent-oriented software engineering (AOSE) covers issues in developing systems with software agents. There are many techniques, mostly agent-oriented and object-oriented, ready to be chosen as building blocks for creating agent-based systems. Several AOSE methodologies have been proposed to give engineers guidelines on how these elements are combined so that agents achieve the overall system goals. Although these solutions are promising, most of them are designed in an ad-hoc manner, do not fully follow a software development life-cycle, and lack examination of agent-oriented features. To address these issues, we investigated state-of-the-art techniques and AOSE methodologies. By examining them in different respects, we commented on their strengths and weaknesses. Toward a formal study, a comparison framework was set up covering four aspects: concepts and properties, notations and modeling techniques, process, and pragmatics. Under these criteria, we conducted the comparison at both an overview and a detailed level. The comparison supported an empirical and analytical study of how an ideal agent-based system should be formed.
APA, Harvard, Vancouver, ISO, and other styles
36

Miklaski, Michael H. Babbitt Joel D. "A methodology for developing timing constraints for the Ballistic Missile Defense System /." Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2003. http://library.nps.navy.mil/uhtbin/hyperion-image/03Dec%5FMiklaski.pdf.

Full text
Abstract:
Thesis [M.H. Miklaski]-(M.S. in Systems Technology) and (M.S. in Software Engineering)--Naval Postgraduate School, December 2003. Thesis [J.D. Babbitt]-(M.S. in Computer Science)--Naval Postgraduate School, December 2003.
Thesis advisor(s): Man-Tak Shing, James Bret Michael. Includes bibliographical references (p. 287-289). Also available online.
APA, Harvard, Vancouver, ISO, and other styles
37

Bradley, Roxanne. "Software design : communication between human factors engineers and software developers /." Thesis, This resource online, 1991. http://scholar.lib.vt.edu/theses/available/etd-08222009-040239/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Pek, Ekaterina [Verfasser]. "Corpus-based empirical research in software engineering / Ekaterina Pek." Koblenz : Universitätsbibliothek Koblenz, 2014. http://d-nb.info/1063286468/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Krein, Jonathan L. "Replication and Knowledge Production in Empirical Software Engineering Research." BYU ScholarsArchive, 2014. https://scholarsarchive.byu.edu/etd/4296.

Full text
Abstract:
Although replication is considered an indispensable part of the scientific method in software engineering, few replication studies are published each year. The rate of replication, however, is not surprising given that replication theory in software engineering is immature. Not only are replication taxonomies varied and difficult to reconcile, but opinions on the role of replication contradict. In general, we have no clear sense of how to build knowledge via replication, particularly given the practical realities of our research field. Consequently, most replications in software engineering yield little useful information. In particular, the vast majority of external replications (i.e., replications performed by researchers unaffiliated with the original study) not only fail to reproduce the original results, but defy explanation. The net effect is that, as a research field, we consistently fail to produce usable (i.e., transferable) knowledge, and thus, our research results have little if any impact on industry. In this dissertation, we dissect the problem of replication into four primary concerns: 1) rate and explicitness of replication; 2) theoretical foundations of replication; 3) tractability of methods for context analysis; and 4) effectiveness of inter-study communication. We address each of the four concerns via a two-part research strategy involving both a theoretical and a practical component. The theoretical component consists of a grounded theory study in which we integrate and then apply external replication theory to problems of replication in empirical software engineering. The theoretical component makes three key contributions to the literature: first, it clarifies the role of replication with respect to the overall process of science; second, it presents a flexible framework for reconciling disparate replication terminology; and third, it informs a broad range of practical replication concerns. The practical component involves a series of replication studies, through which we explore a variety of replication concepts and empirical methods, ultimately culminating in the development of a tractable method for context analysis (TCA). TCA enables the quantitative evaluation of context variables in greater detail, with greater statistical power, and via considerably smaller datasets than previously possible. As we show (via a complex, real-world example), the method ultimately enables the empirically and statistically-grounded reconciliation and generalization of otherwise contradictory results across dissimilar replications—which problem has previously remained unsolved in software engineering.
APA, Harvard, Vancouver, ISO, and other styles
40

Tumkur, Anand AswathaNarayana, and Avijit Dutta. "PROTOCOL PERFORMANCE MEASUREMENT METHODOLOGY - EXPERIMENTATION WITH SIGNALING SYSTEM NO 7." Thesis, Mälardalen University, School of Innovation, Design and Engineering, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-6226.

Full text
Abstract:

Performance is the driving force for effective network utilization in the current telecommunication world. This thesis aims to define suitable performance measurement methodologies for communication over the stack-based Signalling System No 7 (SS7). It also takes a quick look at open source SS7 and Ericsson proprietary SS7 protocols in order to devise a performance measurement approach that can be adopted to develop sophisticated tools. We adopt a scientific experimental approach for the numerical measurement of throughput and latency of the protocol stack. Our current work concludes with experimentation on an open source SS7 protocol (SCTP) using two identical Fedora-based servers. SCTP (Stream Control Transmission Protocol) is an important transport layer protocol for communication of SS7 messages over an IP network. Message communication using SCTP over an IP/Ethernet network between these two identical servers has been measured and analyzed using the IPerf tool. TCP (Transmission Control Protocol) being another important transport layer protocol of the TCP/IP stack, the performance of TCP is compared with SCTP. The results show that under normal circumstances TCP outperforms SCTP, and our analysis supports that with multi-homing, SCTP should outperform TCP when throughput is measured.
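A minimal sketch of this kind of throughput measurement is shown below; it times a bulk TCP transfer over loopback using only the Python standard library, as a simplified stand-in for the IPerf-based SCTP/TCP experiments (SCTP sockets are not part of the standard library, so the sketch is TCP-only and purely illustrative).

# Rough sketch of a throughput measurement between a sender and a receiver,
# analogous in spirit to the IPerf-based measurements described above.
# TCP-only and loopback-only: a simplified stand-in, not the thesis setup.
import socket, threading, time

PAYLOAD = b"x" * 65536
TOTAL_BYTES = 200 * 1024 * 1024   # 200 MB test transfer

def receiver(server_sock):
    conn, _ = server_sock.accept()
    received = 0
    while received < TOTAL_BYTES:
        chunk = conn.recv(65536)
        if not chunk:
            break
        received += len(chunk)
    conn.close()

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=receiver, args=(server,), daemon=True).start()

client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", port))
start = time.perf_counter()
sent = 0
while sent < TOTAL_BYTES:
    sent += client.send(PAYLOAD)
client.close()
elapsed = time.perf_counter() - start
print(f"throughput: {sent * 8 / elapsed / 1e6:.1f} Mbit/s over {elapsed:.2f} s")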

APA, Harvard, Vancouver, ISO, and other styles
41

Zheng, Junyu. "Quantification of Variability and Uncertainty in Emission Estimation: General Methodology and Software Implementation." NCSU, 2002. http://www.lib.ncsu.edu/theses/available/etd-05192002-201242/.

Full text
Abstract:
The use of probabilistic analysis methods for dealing with variability and uncertainty is being more widely recognized and recommended in the development of emission factors and emission inventories. Probabilistic analysis provides decision-makers with quantitative information about the confidence with which an emission factor may be used. Variability refers to the heterogeneity of a quantity with respect to time, space, or different members of a population. Uncertainty refers to the lack of knowledge regarding the true value of an empirical quantity. Ignoring the distinction between variability and uncertainty may lead to erroneous conclusions regarding emission factors and emission inventories. This dissertation extensively and systematically discusses methodologies for quantifying variability and uncertainty in the development of emission factors and emission inventories, including a method based upon mixture distributions and a method for accounting for the effect of measurement error on variability and uncertainty analysis. A general approach for developing a probabilistic emission inventory is presented. A few example case studies were conducted to demonstrate the methodologies, ranging from a utility power plant emission source to highway vehicle emission sources. A prototype software tool, AUVEE, was developed to demonstrate the general approach to developing a probabilistic emission inventory based upon an example utility power plant emission source. A general software tool, AuvTool, was developed to implement all methodologies and algorithms presented in this dissertation for variability and uncertainty analysis. The tool can be used in any quantitative analysis field where variability and uncertainty analysis of model inputs is needed.
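The separation of variability and uncertainty emphasized above is often operationalized with a nested (two-dimensional) Monte Carlo simulation; the sketch below illustrates that idea with bootstrap resampling over made-up emission-factor data and is not the AUVEE or AuvTool implementation.

# Sketch of a two-dimensional Monte Carlo separating variability (differences
# across emission sources) from uncertainty (limited sample size), using
# bootstrap resampling. Data and distributional choices are illustrative only.
import numpy as np

rng = np.random.default_rng(42)
measured_factors = rng.lognormal(mean=0.0, sigma=0.5, size=30)  # fake measurements

n_uncertainty = 2000   # outer loop: uncertainty about the population parameters
n_variability = 500    # inner dimension: variability across individual sources

mean_estimates = []
for _ in range(n_uncertainty):
    resample = rng.choice(measured_factors, size=len(measured_factors), replace=True)
    # fit a distribution of inter-source variability to the bootstrap resample
    mu, sigma = np.mean(np.log(resample)), np.std(np.log(resample))
    population = rng.lognormal(mu, sigma, size=n_variability)
    mean_estimates.append(population.mean())

low, high = np.percentile(mean_estimates, [2.5, 97.5])
print(f"mean emission factor: {np.mean(mean_estimates):.3f}")
print(f"95% uncertainty range on the mean: [{low:.3f}, {high:.3f}]")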
APA, Harvard, Vancouver, ISO, and other styles
42

Gupta, Jatin. "Application of Hazard and Operability (HAZOP) Methodology to Safety-Related Scientific Software." The Ohio State University, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=osu1398983873.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Woyak, Scott A. "An object-oriented methodology and supporting framework for creating engineering software by dynamic integration." Diss., Virginia Tech, 1995. http://hdl.handle.net/10919/40211.

Full text
Abstract:
Software design within the engineering community has generally been relegated to encoding algorithms for the purpose of executing them very rapidly. This is a very important purpose; however, substantially more is required to build an entire CAD application. Structure must be provided to the data maintained in the application. Various analyses must be integrated and coordinated in an orderly fashion. Interaction with the user must be managed. These topics have traditionally received secondary attention. The result has been engineering applications that are difficult to use, costly to create, and expensive to maintain or modify. The system created in this dissertation, the Dynamic Integration System, addresses these issues with respect to engineering-related software. Code constructed with Dynamic Integration System techniques anticipates future needs, such as integration, before those needs explicitly arise. This greatly reduces downstream costs and facilitates the development of engineering-related software. The Dynamic Integration System consists of two primary constructs: Dynamic Variables and dependency hierarchies. Dynamic Variables are used to model the key parameters in an application, while a dependency hierarchy is built from the relationships between Dynamic Variables. Using these constructs, issues such as integration and analysis coordination are automated by the underlying Dynamic Integration System facilities.
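The two constructs named above, Dynamic Variables and dependency hierarchies, resemble a dependency graph with lazy recomputation. The sketch below is a generic illustration of that idea under assumed names and semantics, not the Dynamic Integration System's actual design or API.

# Generic sketch of a "dynamic variable" that knows its dependencies and
# recomputes lazily when an upstream value changes. This illustrates the idea
# of a dependency hierarchy; it is not the Dynamic Integration System's API.
class DynamicVariable:
    def __init__(self, compute, *dependencies):
        self._compute = compute          # function of the dependency values
        self._deps = dependencies
        self._dependents = []
        self._value = None
        self._dirty = True
        for dep in dependencies:
            dep._dependents.append(self)

    def set(self, value):                # for leaf variables (no dependencies)
        self._compute = lambda: value
        self._invalidate()

    def get(self):
        if self._dirty:
            self._value = self._compute(*(d.get() for d in self._deps))
            self._dirty = False
        return self._value

    def _invalidate(self):
        self._dirty = True
        for dependent in self._dependents:
            dependent._invalidate()

# Example: wing area and lift coefficient feed a simple lift analysis.
area = DynamicVariable(lambda: 20.0)     # m^2
cl = DynamicVariable(lambda: 0.8)
lift = DynamicVariable(lambda a, c: 0.5 * 1.225 * 50.0**2 * a * c, area, cl)
print(lift.get())
area.set(25.0)                           # change propagates through the hierarchy
print(lift.get())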
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
44

Fan, Yao-Long. "Re-engineering the solicitation management system." CSUSB ScholarWorks, 2006. https://scholarworks.lib.csusb.edu/etd-project/3179.

Full text
Abstract:
The scope of this project includes a re-engineering of the internal architecture of the Solicitation Management System (SMS), a web-based application that facilitates the running of grant proposal solicitations for the Office of Technology Transfer and Commercialization at California State University San Bernardino (CSUSB). A goal of the project is to increase consistency and efficiency of the code base of the system, making it easier to understand, maintain, and extend. The previous version of SMS was written to rely on the Spring and Hibernate frameworks. The project includes a restructuring of the system to remove reliance on the Spring framework, but maintain reliance on Hibernate. The result is an updated version of the SMS. The system was written using current technologies such as Java, JSP, and CSS.
APA, Harvard, Vancouver, ISO, and other styles
45

Dario, Claudia Filomena Bratficher. "Uma metodologia unificada para o desenvolvimento de sistemas orientados a agentes." [s.n.], 2005. http://repositorio.unicamp.br/jspui/handle/REPOSIP/259813.

Full text
Abstract:
Advisor: Ricardo Ribeiro Gudwin
Master's dissertation - Universidade Estadual de Campinas, Faculdade de Engenharia Eletrica e de Computação
This work proposes a Unified Methodology for the development of agent-oriented systems. The methodology was elaborated based on a study of the agent's role within software engineering and an analysis of different agent-oriented software development methodologies found in the literature, focusing on three main ones: MaSE (Multiagent Systems Engineering Methodology), Prometheus and Tropos, in addition to the modeling language AUML (Agent Unified Modeling Language). The Unified Methodology aims at taking advantage of the best of each methodology, searching for common elements among them, in an effort similar to what happened with the unified process (RUP - Rational Unified Process) in object-oriented systems. To validate the Unified Methodology and analyze the other methodologies, a case study was developed. The Unified Methodology has shown to be efficient in the design, documentation and construction of multi-agent systems. We conclude it to be a detailed and more complete methodology, covering the requirements specification, analysis and design stages of agent-oriented software development.
Master's
Computer Engineering
Master in Electrical Engineering
APA, Harvard, Vancouver, ISO, and other styles
46

Behnke, Matthew J. "An implementation methodology and software tool for an entropy based engineering model for evolving systems." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2003. http://library.nps.navy.mil/uhtbin/hyperion-image/03Jun%5FBehnke.pdf.

Full text
Abstract:
Thesis (M.S. in Software Engineering)--Naval Postgraduate School, June 2003.
Thesis advisor(s): Mantak Shing, Christopher D. Miles. Includes bibliographical references (p. 69-70). Also available online.
APA, Harvard, Vancouver, ISO, and other styles
47

Ahmad, M. O. (Muhammad Ovais). "Exploring Kanban in software engineering." Doctoral thesis, Oulun yliopisto, 2016. http://urn.fi/urn:isbn:9789526214085.

Full text
Abstract:
To gain competitive advantage and thrive in the market, companies have introduced Kanban in software development. Kanban has been used in the manufacturing industry for over six decades. In the software engineering domain, Kanban was introduced in 2004 to increase flexibility in coping with dynamic requirements, bring visibility to workflow and related tasks, improve communication, and promote the pull system. However, the existing scientific literature lacks empirical evidence of the use of Kanban in software companies. This doctoral thesis aims to improve the understanding of the use of Kanban in software engineering. The research was performed in two phases: 1) analysis of scientific literature on Kanban in software engineering and industrial engineering and 2) investigation of Kanban implementation trends in software companies. The data was collected through systematic literature reviews, a survey and semi-structured interviews. The results were synthesized to draw conclusions and outline implications for research and practice. The results indicate growing interest in the use of Kanban in software companies. The findings suggest that Kanban is applicable to software development, software maintenance, and portfolio management in software companies. Kanban brings visibility to task and offering status; limiting work in progress at any given time gives people greater control over their work and limits task switching. Although Kanban offers several benefits, as reported in this dissertation, the findings show that software companies find it challenging to implement Kanban incrementally.
APA, Harvard, Vancouver, ISO, and other styles
48

Thomas, Mathew. "Semi-Automated Dental Cast Analysis Software." Wright State University / OhioLINK, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=wright1310404863.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Vajrapu, Rakesh Guptha, and Sravika Kothwar. "Software Requirements Prioritization Practices in Software Start-ups : A Qualitative research based on Start-ups in India." Thesis, Blekinge Tekniska Högskola, Institutionen för programvaruteknik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-15967.

Full text
Abstract:
Context: Requirements prioritization is used in software product management and is concerned with identifying the most valuable requirements from a given set. This is necessary to satisfy the needs of customers, to provide support for stakeholders and, more importantly, for release planning. Irrespective of the size of the organization (small, medium or large), requirements prioritization is important to minimize risk during development. However, few studies explore how requirements prioritization is practiced in start-ups. Software start-ups are becoming important suppliers of innovative and software-intensive products. Earlier studies suggest that requirements discovery and validation is the core activity in start-ups. However, due to limited resources, start-ups need to prioritize which requirements to focus on. If they do it wrong, it leads to wasted resources. While larger organizations may afford such waste, start-ups cannot. Moreover, researchers have identified that start-ups are not small versions of large companies and that existing software development practices cannot be transferred directly due to the low rigor of current studies. Thus, we planned to conduct an exploratory study on requirements prioritization practices in the context of software start-ups. Objectives: The main aim of our study is to explore the state of the art of requirements prioritization practices used in start-ups. We also identify the challenges associated with the corresponding practices and some possible solutions. Methods: In this qualitative research, we conduct a literature review using article sources such as IEEE Xplore, Scopus, and Google Scholar to identify prioritization practices and challenges in general. An interview study is conducted using semi-structured interviews to collect data from practitioners. Thematic analysis was used to analyze the interview data. Results: We identified 15 practices from 8 different start-up companies with corresponding challenges and possible solutions. Our results show mixed reviews in terms of the prioritization practices at start-ups. Of the 8 companies, 6 followed formal methods, while in the remaining 2 companies prioritization was informal and not clear. The results show that the value-based method is the dominant prioritization technique in start-ups. The results also show that customer input and return-on-investment aspects of prioritization play a key role compared to other aspects. Conclusions: The results of this study provide an understanding of the various requirements prioritization practices in start-ups and the challenges faced in implementing them. These results are validated against answers found in the literature. The solutions identified for the corresponding challenges allow practitioners to approach them in a better way. As this study focused only on Indian software start-up companies, it is recommended to extend it to Swedish software start-up companies as well to get a broader perspective. Scaling of the sample size is also recommended. This study may help future research on requirements engineering in start-ups. It may also help practitioners who intend to start a software start-up company to get an idea of what challenges they may face while prioritizing requirements and to use these solutions to mitigate them.
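Since the study reports the value-based method as the dominant prioritization technique, a tiny sketch of how such a ranking could be computed is given below; the requirements, scores and the value-over-cost formula are invented for illustration and are not data from the interviewed start-ups.

# Hypothetical sketch of value-based requirements prioritization: rank
# requirements by estimated customer value relative to implementation cost.
# The requirements and numbers are invented for illustration.
requirements = [
    {"id": "R1", "title": "Payment integration", "value": 9, "cost": 5},
    {"id": "R2", "title": "Dark mode",           "value": 3, "cost": 2},
    {"id": "R3", "title": "Usage analytics",     "value": 7, "cost": 3},
    {"id": "R4", "title": "Offline support",     "value": 6, "cost": 8},
]

ranked = sorted(requirements, key=lambda r: r["value"] / r["cost"], reverse=True)
for rank, req in enumerate(ranked, start=1):
    ratio = req["value"] / req["cost"]
    print(f"{rank}. {req['id']} {req['title']} (value/cost = {ratio:.2f})")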
APA, Harvard, Vancouver, ISO, and other styles
50

Koch, Stefan, and Georg Schneider. "Results from software engineering research into open source development projects using public data." Institut für Informationsverarbeitung und Informationswirtschaft, WU Vienna University of Economics and Business, 2000. http://epub.wu.ac.at/494/1/document.pdf.

Full text
Abstract:
This paper presents first results from research into open source projects from a software engineering perspective. The research methodology employed relies on public data retrieved from the CVS-repository of the GNOME project and relevant discussion groups. This methodology is described in detail and some of the results concerning the special characteristics of open source software development are given. (author's abstract)
Series: Diskussionspapiere zum Tätigkeitsfeld Informationsverarbeitung und Informationswirtschaft
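The kind of repository mining described above, deriving measures such as commits per author from public version-control data, can be sketched as follows; the sketch assumes a local git mirror of the project (the original study parsed CVS logs and discussion archives) and is illustrative only.

# Sketch of the kind of repository mining described above: count commits per
# author from version-control history. Assumes a local git mirror of the
# project; the original study parsed CVS logs directly. Illustrative only.
import subprocess
from collections import Counter

def commits_per_author(repo_path):
    log = subprocess.run(
        ["git", "-C", repo_path, "log", "--pretty=format:%an"],
        capture_output=True, text=True, check=True,
    ).stdout
    return Counter(line for line in log.splitlines() if line)

if __name__ == "__main__":
    counts = commits_per_author(".")
    for author, n in counts.most_common(10):
        print(f"{n:6d}  {author}")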
APA, Harvard, Vancouver, ISO, and other styles