Dissertations / Theses on the topic 'Computer software industry Quality control'
Consult the top 50 dissertations / theses for your research on the topic 'Computer software industry Quality control.'
Pryor, Alan N. "A Discrimination of Software Implementation Success Criteria." Thesis, University of North Texas, 1999. https://digital.library.unt.edu/ark:/67531/metadc2196/.
Underwood, B. Alan. "A framework for the certification of critical application systems." Thesis, Queensland University of Technology, 1994.
Garcia, Sotelo Gerardo Javier. "Get the right price every day." CSUSB ScholarWorks, 2005. https://scholarworks.lib.csusb.edu/etd-project/2729.
Albanez, Altamar Urbanetz de Araújo. "Associação entre CMMI-DEV 1.2 e ISO/TS 16949." Universidade Tecnológica Federal do Paraná, 2012. http://repositorio.utfpr.edu.br/jspui/handle/1/558.
The automotive sector is one of the most demanding in terms of quality, requiring certification to ISO/TS 16949. Although companies obtain this certification, some lose it in subsequent audits or achieve little improvement beyond their existing level. There is evidence that they lack the maturity to obtain or maintain such certification, as well as guidelines for continuous improvement. Previous work found that certified companies had at least maturity level 2, on a scale from 1 (minimum) to 5 (maximum), which corresponds to a defined and manageable process. What enables a company to improve its indices, however, is having a process that is controlled and integrated. A lack of maturity in the product development process (PDP) triggers scrap and rework, compromising the efficient use of resources, impacting development time and cost and, indirectly, the quality of the process and the final product. Certified companies, however, do not have guidelines for improving their processes; the ISO standard would need to be complemented by some associated resource providing guidance on the aspects to be improved. Since CMMI is an effective method for diagnosing maturity and considers the integration of the PDP, this work aims to identify the association between ISO/TS 16949 and the CMMI-DEV 1.2 method. It presents an overview of PDPs, quality certification and process maturity, and then associates the variables involved in ISO 9001 certification and those evaluated in ISO/TS 16949 with the variables involved in assessing maturity level 2 with CMMI-DEV 1.2. The work explains which items are considered by ISO/TS 16949, highlighting CMMI items that could complement the diagnosis for companies wishing to improve quality while, in parallel, adding efficiency and productivity to their production processes.
Wilburn, Cathy A. "Using the Design Metrics Analyzer to improve software quality." Virtual Press, 1994. http://liblink.bsu.edu/uhtbin/catkey/902489.
Department of Computer Science
Walsh, Martha Geiger. "A system of automated tools to support control of software development through software configuration management." Thesis, Kansas State University, 1985. http://hdl.handle.net/2097/9892.
Krishnamurthy, Janaki. "Quality Market: Design and Field Study of Prediction Market for Software Quality Control." NSUWorks, 2010. http://nsuworks.nova.edu/gscis_etd/352.
Hammons, Rebecca L. "Continuing professional education for software quality assurance." Muncie, Ind. : Ball State University, 2009. http://cardinalscholar.bsu.edu/759.
Dorgelo, Eric G. "Strategic analysis of a software business /." Burnaby B.C. : Simon Fraser University, 2006. http://ir.lib.sfu.ca/handle/1892/3698.
Kwan, Pak Leung. "Design metrics forensics : an analysis of the primitive metrics in the Zage design metrics." Virtual Press, 1994. http://liblink.bsu.edu/uhtbin/catkey/897490.
Department of Computer Science
Black, Angus Hugh. "Software quality assurance in a remote client/contractor context." Thesis, Rhodes University, 2006. http://hdl.handle.net/10962/d1006615.
Walia, Gursimran Singh. "Using error modeling to improve and control software quality an empirical investigation /." Diss., Mississippi State : Mississippi State University, 2009. http://library.msstate.edu/etd/show.asp?etd=etd-04032009-070637.
Stineburg, Jeffrey. "Software reliability prediction based on design metrics." Virtual Press, 1999. http://liblink.bsu.edu/uhtbin/catkey/1154775.
Department of Computer Science
Pipkin, Jeffrey A. "Applying design metrics to large-scale telecommunications software." Virtual Press, 1996. http://liblink.bsu.edu/uhtbin/catkey/1036178.
Department of Computer Science
Hall, Tracy. "The critical success factors of quality assurance and measurement practice in the software industry." Thesis, City University London, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.266297.
Bhattrai, Gopendra R. "An empirical study of software design balance dynamics." Virtual Press, 1995. http://liblink.bsu.edu/uhtbin/catkey/958786.
Department of Computer Science
Seo, Jongwon. "Graphical interface design for equipment control in unstructured environments /." Digital version accessible at:, 1998. http://wwwlib.umi.com/cr/utexas/main.
Lim, Edwin C. "Software metrics for monitoring software engineering projects." Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 1994. https://ro.ecu.edu.au/theses/1100.
Moschoglou, Georgios Moschos. "Software testing tools and productivity." Virtual Press, 1996. http://liblink.bsu.edu/uhtbin/catkey/1014862.
Department of Computer Science
Smedley, Peter John. "Development of computer based aids to hazard analysis critical control point (HACCP)." Thesis, London South Bank University, 1997. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.245126.
Low, Gregory Norman. "A software licensing framework." Thesis, Queensland University of Technology, 1998.
Herrero, Juan Jose. "Columbus : a solution using metadata for integrating document management, project hosting and document control in the construction industry." Thesis, University of Greenwich, 2003. http://gala.gre.ac.uk/8671/.
Perera, Dinesh Sirimal. "Design metrics analysis of the Harris ROCC project." Virtual Press, 1995. http://liblink.bsu.edu/uhtbin/catkey/935930.
Department of Computer Science
Glazunov, Vladimir. "Quality assessment of a large real world industry project." Thesis, Linnéuniversitetet, Institutionen för datavetenskap (DV), 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-31155.
Hultgren, Richard, and Peter Kullgard. "Improving configuration management, quality management and development methods in the computer game industry." Thesis, Blekinge Tekniska Högskola, Institutionen för programvaruteknik och datavetenskap, 2001. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-1677.
The computer games industry is an immature branch, which means that companies use non-commercial methods and processes for development. The master's thesis case study gives the reader an overview of game companies' maturity level in some parts of the development process. In addition, the thesis presents advice for remedying the shortcomings identified during the case study.
Questions about the master's thesis can be sent to Peter Kullgard (kullgard@hotmail.com) or Richard Hultgren (richie@pc.nu). Special thanks to Martin Walfisz and Massive Entertainment of Sweden.
Roems, Raphael. "The implications of deviating from software testing processes : a case study of a software development company in Cape Town, South Africa." Thesis, Cape Peninsula University of Technology, 2017. http://hdl.handle.net/20.500.11838/2686.
Ensuring that predetermined quality standards are met is a challenge that software development companies, and the software development industry at large, struggle to address. The software testing process is an important part of the larger software development process; it ensures that software functionality meets user requirements and that software defects are detected and fixed before users receive the developed software. Software testing processes have progressed to the point where formal processes, dedicated software testing resources and defect management software are in use at software development organisations. The research determined the implications that the case study software development organisation could face when deviating from software testing processes, with a focus on the functions performed by the software tester role. The analytical dimensions of the duality of structure framework, based on Structuration Theory, were used as a lens to understand and interpret the socio-technical processes associated with software development processes at the case study organisation. Results include the identification of software testing processes, resources and tools, together with the formal software development processes and methodologies in use. Critical e-commerce website functionality and software development resource costs were identified, as were the tangible and intangible costs that arise from software defects. Recommendations include prioritising critical functionality for test execution on the organisation's e-commerce website platform, and undertaking the necessary risk management in scenarios with time constraints on software testing, balancing risk against quality, features, budget and schedule. Numerous process improvements were recommended for the organisation, to assist in preventing deviations from prescribed testing processes.
A guideline was developed as a research contribution to illustrate the relationships of the specific research areas and the impact on software project delivery.
Mohamed, Essack. "A knowledge approach to software testing." Thesis, Stellenbosch : University of Stellenbosch, 2004. http://hdl.handle.net/10019.1/16391.
ENGLISH ABSTRACT: The effort to achieve quality is the largest component of software cost. Software testing is costly, ranging from 50% to 80% of the cost of producing a first working version. It is resource-intensive and an intensely time-consuming activity in the overall Systems Development Life Cycle (SDLC), and hence could arguably be the most important phase of the process. Software testing is pervasive: it starts at the initiation of a product with non-execution-type testing and continues past the post-implementation phase to the retirement of the product life cycle. Software testing is the currency of quality delivery. To understand testing and to improve testing practice, it is essential to see the software testing process in its broadest terms, as the means by which people, methodology, tools, measurement and leadership are integrated to test a software product. A knowledge approach recognises knowledge management (KM) enablers such as leadership, culture, technology and measurements, which act in a dynamic relationship with KM processes, namely creating, identifying, collecting, adapting, organizing, applying and sharing. Enabling a knowledge approach is a worthy goal to encourage the sharing and blending of experiences, discipline and expertise, so as to achieve improvements in quality and add value to the software testing process. This research was developed to establish whether specific knowledge, such as domain subject matter or business expertise, application or technical skills, and software testing competency, and the interaction of the testing team, influences the degree of quality in the delivery of the application under test, or whether one of these is the dominant critical knowledge area within software testing.
This research also set out to establish whether there are personal or situational factors that will predispose the test engineer to knowledge sharing, again, with the view of using these factors to increase the quality and success of the ‘testing phase’ of the SDLC. KM, although relatively youthful, is entering its fourth generation with evidence of two paradigms emerging - that of mainstream thinking and that of the complex adaptive system theory. This research uses pertinent and relevant extracts from both paradigms appropriate to gain quality/success in software testing.
West, James F. "An examination of the application of design metrics to the development of testing strategies in large-scale SDL models." Virtual Press, 2000. http://liblink.bsu.edu/uhtbin/catkey/1191725.
Department of Computer Science
Milicic, Drazen, and Pontus Svensson. "Sparks to a living quality organization." Thesis, Blekinge Tekniska Högskola, Institutionen för programvaruteknik och datavetenskap, 1998. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-2731.
Land, Lesley Pek Wee, Information Systems Technology & Management, Australian School of Business, UNSW. "Software group reviews and the impact of procedural roles on defect detection performance." Awarded by: University of New South Wales. School of Information Systems, Technology and Management, 2000. http://handle.unsw.edu.au/1959.4/21838.
Chen, Dejiu. "Systems Modeling and Modularity Assessment for Embedded Computer Control Applications." Doctoral thesis, KTH, Maskinkonstruktion, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-3792.
Bhargava, Manjari. "Analysis of multiple software releases of AFATDS using design metrics." Virtual Press, 1991. http://liblink.bsu.edu/uhtbin/catkey/834502.
Department of Computer Science
Feng, Xin, and 馮昕. "MIST: towards a minimum set of test cases." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2002. http://hub.hku.hk/bib/B24521012.
Zhu, Liming, Computer Science & Engineering, Faculty of Engineering, UNSW. "Software architecture evaluation for framework-based systems." Awarded by: University of New South Wales. Computer Science and Engineering, 2007. http://handle.unsw.edu.au/1959.4/28250.
Clause, James Alexander. "Enabling and supporting the debugging of software failures." Diss., Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/39514.
Kossoski, Clayton. "Proposta de um método de teste para processos de desenvolvimento de software usando o paradigma orientado a notificações." Universidade Tecnológica Federal do Paraná, 2015. http://repositorio.utfpr.edu.br/jspui/handle/1/1405.
The Notification Oriented Paradigm (NOP) is an alternative for the development of software applications and proposes to solve certain problems of the usual programming paradigms, namely the Declarative Paradigm (DP) and the Imperative Paradigm (IP). Indeed, NOP unifies the main advantages of DP and IP while solving (in terms of model) several of their deficiencies and inconveniences related to logical-causal calculation, in environments ranging from monoprocessor to multiprocessor. NOP has been materialized in terms of programming and modeling, but it did not yet have a formalized method to guide developers in software testing activities. This dissertation proposes a test method for software projects that use NOP in their development. The proposed method was developed to be applied in the unit testing and integration testing phases. Unit testing considers the smallest testable entities of NOP and requires specific techniques for generating test cases. Integration testing considers the operation of NOP entities working together to realize use cases, and can be accomplished in two steps: (1) tests on the features described in the requirements and use cases, and (2) tests that directly exercise the NOP entities that make up each use case (such as Premisses, Conditions and Rules). The test method was applied in a case study involving the modeling and development of a simple air combat application, and the results of this research show that the proposed method is of great value in testing NOP programs at both the unit and integration levels.
Kilic, Eda. "Quality Of Service Aware Dynamic Admission Control In Ieee 802.16j Non-transparent Relay Networks." Master's thesis, METU, 2010. http://etd.lib.metu.edu.tr/upload/12611631/index.pdf.
… call admission control in non-transparent relay networks that support coverage extension. In this thesis, a Quality of Service (QoS) aware dynamic admission control algorithm for IEEE 802.16j non-transparent relay networks is introduced. Our objectives are admitting more service flows, utilizing the bandwidth, giving individual control to each relay station (RS) over call acceptance and rejection, and finally not affecting ongoing service flow quality in an RS due to the dense population of service flows in other RSs. The simulation results show that the proposed algorithm outperforms the other existing call admission control algorithms. Moreover, this algorithm can be regarded as a pioneering call admission control algorithm for IEEE 802.16j non-transparent networks.
Lin, Burch. "Neural networks and their application to metrics research." Virtual Press, 1996. http://liblink.bsu.edu/uhtbin/catkey/1014859.
Department of Computer Science
Baah, George Kofi. "Statistical causal analysis for fault localization." Diss., Georgia Institute of Technology, 2012. http://hdl.handle.net/1853/45762.
Banús Paradell, Núria. "New solutions to control robotic environments: quality control in food packaging." Doctoral thesis, Universitat de Girona, 2021. http://hdl.handle.net/10803/673469.
Computer vision systems and artificial intelligence techniques are two active research areas in the context of Industry 4.0. Their combination allows human procedures to be reproduced while improving the performance of processes. However, to achieve the desired full automation, new applications are needed that can cover as many industrial scenarios and processes as possible. One of the areas requiring further research and development is the quality control of food packaging, and more specifically, control of the closing and sealing of thermoformed packages. The needs in this area were identified by TAVIL which, in collaboration with GILAB, proposed an Industrial Doctorate to investigate, develop and integrate into real scenarios new methods to improve the packaging stage of the food industry by means of computer vision systems and artificial intelligence techniques. In the context of this Industrial Doctorate, two lines of research were followed, which differ in the level at which they study the problem. The first line focuses on the quality control of food packaging, while the second focuses on the efficient control of computer vision systems in industrial scenarios.
Programa de Doctorat en Tecnologia
Fiala, John C. "Computer representations of machined parts for automatic inspection." Thesis, Virginia Polytechnic Institute and State University, 1985. http://hdl.handle.net/10919/76033.
Master of Science
Haskins, Bertram Peter. "A feasibility study on the use of agent-based image recognition on a desktop computer for the purpose of quality control in a production environment." Thesis, [Bloemfontein?] : Central University of Technology, Free State, 2006. http://hdl.handle.net/11462/66.
A multi-threaded, multi-agent image recognition software application called RecMaster has been developed specifically for the purpose of quality control in a production environment. This entails using the system as a monitor to identify invalid objects moving on a conveyor belt and to pass on the relevant information to an attached device, such as a robotic arm, which will remove the invalid object. The main purpose of developing this system was to prove that a desktop computer could run an image recognition system efficiently, without the need for high-end, high-cost, specialised computer hardware. The programme operates by assigning each agent a task in the recognition process and then waiting for resources to become available. Tasks related to edge detection, colour inversion, image binarisation and perimeter determination were assigned to individual agents. Each agent is loaded onto its own processing thread, with some of the agents delegating their subtasks to other processing threads. This enables the application to utilise the available system resources more efficiently. The application is very limited in its scope, as it requires a uniform image background as well as little to no variance in camera zoom levels and object-to-lens distance. This study focused solely on the development of the application software, and not on the setting up of the actual imaging hardware. The imaging device on which the system was tested was a web cam capable of a 640 x 480 resolution. As such, all image capture and processing was done on images with a horizontal resolution of 640 pixels and a vertical resolution of 480 pixels, so as not to distort image quality. The application locates objects on an image feed - which can be in the format of a still image, a video file or a camera feed - and compares these objects to a model of the object that was created previously. The coordinates of the object are calculated and translated into coordinates on the conveyor system.
These coordinates are then passed on to an external recipient, such as a robotic arm, via a serial link. The system has been applied to the model of a DVD, and tested against a variety of similar and dissimilar objects to determine its accuracy. The tests were run on both an AMD- and Intel-based desktop computer system, with the results indicating that both systems are capable of efficiently running the application. On average, the AMD-based system tended to be 81% faster at matching objects in still images, and 100% faster at matching objects in moving images. The system made matches within an average time frame of 250 ms, making the process fast enough to be used on an actual conveyor system. On still images, the results showed an 87% success rate for the AMD-based system, and 73% for Intel. For moving images, however, both systems showed a 100% success rate.
Van der Linde, P. L. "A comparative study of three ICT network programs using usability testing." Thesis, [Bloemfontein?] : Central University of Technology, Free State, 2013. http://hdl.handle.net/11462/186.
This study compared the usability of three Information and Communication Technology (ICT) network programs in a learning environment. The researcher wanted to establish which program was most adequate from a usability perspective among second-year Information Technology (IT) students at the Central University of Technology (CUT), Free State. The Software Usability Measurement Inventory (SUMI) testing technique can measure software quality from a user perspective. The technique is supported by an extensive reference database to measure a software product's quality in use, and is embedded in an effective analysis and reporting tool called SUMI scorer (SUMISCO). SUMI was applied in a controlled laboratory environment where second-year IT students of the CUT utilized SUMI as part of their networking subject, System Software 1 (SPG1), to evaluate each of the three ICT network programs. The results, strengths and weaknesses, as well as usability improvements, as identified by SUMISCO, are discussed to determine the best ICT network program from a usability perspective according to SPG1 students.
Pollard, Leonard Maurice, II. "Perceived service quality's impact on behavioral intentions in the timeshare industry." Doctoral diss., University of Central Florida, 2010. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/4627.
ID: 029049825; System requirements: World Wide Web browser and PDF reader; Mode of access: World Wide Web; Thesis (Ph.D.)--University of Central Florida, 2010; Includes bibliographical references (p. 126-133).
Ph.D.
Doctorate
Department of Industrial Engineering and Management Systems
Engineering and Computer Science
Nilsson, Daniel, and Henrik Norin. "Adaptive QoS Management in Dynamically Reconfigurable Real-Time Databases." Thesis, Linköping University, Department of Computer and Information Science, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-2800.
In recent years the need for real-time database services has increased, owing to the growing number of data-intensive applications that must enforce real-time constraints. The COMponent-based Embedded real-Time database (COMET) is a real-time database developed to meet these demands. COMET is developed using the AspeCtual COmponent-based Real-time system Development (ACCORD) design method, and consists of a number of components and aspects which can be composed into a number of different configurations depending on system demands; e.g., Quality of Service (QoS) management can be used in unpredictable environments.
In embedded systems with requirements on high up-time, it may not be possible to temporarily shut down the system for reconfiguration. Instead it is desirable to enable dynamic reconfiguration of the system, exchanging components during run-time. This in turn sets demands on the feedback control of the system to adjust to these new conditions, since a new time-variant system has been created.
This thesis project implements improvements in COMET to create a more stable database suitable for further development. A mechanism for dynamic reconfiguration of COMET is implemented, thus, enabling components and aspects to be swapped during run-time. Adaptive feedback control algorithms are also implemented in order to better adjust to workload variations and database reconfiguration.
Salman, Rosine Hanna. "Exploring Capability Maturity Models and Relevant Practices as Solutions Addressing IT Service Offshoring Project Issues." PDXScholar, 2014. https://pdxscholar.library.pdx.edu/open_access_etds/1843.
Niemann, Johan. "Development of a reconfigurable assembly system with enhanced control capabilities and virtual commissioning." Thesis, Bloemfontein : Central University of Technology, Free State, 2013. http://hdl.handle.net/11462/184.
The South African (SA) manufacturing industry needs to develop levels of sophistication and expertise in automation similar to those of its international rivals in order to compete for global markets. To achieve this, manufacturing plants need to be managed extremely efficiently to ensure the quality of manufactured products, and these plants must also have the relevant infrastructure. Furthermore, the industry must compensate for rapid product introduction, product changes and short product lifespans. To support this need, the industry must engage in the current trend in automation known as reconfigurable manufacturing. The aim of the study is to develop a reconfigurable assembly system with enhanced control capabilities by utilizing virtual commissioning. In addition, this system must be capable of assembling multiple different products of a product range; reconfiguring to accommodate the requirements of these products; autonomously rerouting the product flow and distributing workload among assembly cells; handling erroneous products; and implementing enhanced control methods. To achieve this, a literature study was done to confirm the type of components to be used, reveal design issues and establish what characteristics such a system must adhere to. Software named DELMIA was used to create a virtual simulation environment to verify the system and simultaneously scrutinize the methods of verification. On completion, simulations were conducted to verify software functions, device movements and operations, and the control software of the system. Based on the simulation results, the physical system was built and then verified with a multi-agent system as overhead control to validate the entire system. The final results showed that the project objectives are achievable, and it was also found that DELMIA is an excellent tool for system verification that will expedite the design of a system.
These results indicate that companies can design and verify their systems earlier through virtual commissioning. In addition, their systems will be more flexible, and new products or product changes can be introduced more frequently, with minimal cost and downtime. This will enable SA manufacturing companies to be more competitive, increase productivity and save time, giving them an advantage over their international competition.
Bissi, Wilson. "WS-TDD: uma abordagem ágil para o desenvolvimento de serviços WEB." Universidade Tecnológica Federal do Paraná, 2016. http://repositorio.utfpr.edu.br/jspui/handle/1/1829.
Full text
Test Driven Development (TDD) is an agile practice that gained popularity when it was defined as a fundamental part of eXtreme Programming (XP). This practice determines that tests should be written before the code that implements them. TDD and its effects have been widely studied and compared with Test Last Development (TLD) in several studies. However, few studies address the practice of TDD in the development of Web Services (WS), owing to the complexity of testing dependencies among distributed components and the specific characteristics of Service Oriented Architecture (SOA). This study aims to define and validate an approach to developing WS based on the practice of TDD, called WS-TDD. This approach guides developers in using TDD to develop WS, suggesting tools and techniques to deal with SOA particularities and dependencies, focusing on the creation of automated unit and integration tests in Java. To define and validate the proposed approach, four research methods were carried out: (i) a questionnaire; (ii) a practical experiment; (iii) a personal interview with each participant in the experiment; and (iv) triangulation of the results with the people who participated in the three previous methods. According to the results obtained, WS-TDD was more efficient than TLD, increasing internal software quality and developer productivity. However, external software quality decreased, owing to a greater number of defects compared with the TLD approach. Finally, it is important to highlight that the proposed approach is a simple and practical alternative for the adoption of TDD in the development of WS, benefiting internal quality and contributing to increased developer productivity, at the cost of the observed decrease in external quality.
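The test-first cycle the abstract describes can be sketched in Java. This is a minimal, hypothetical example (the `Greeter` class and its test are illustrative, not taken from the thesis): the test is written first, fails while no production code exists, and then the simplest implementation is written to make it pass.

```java
// Minimal TDD sketch (hypothetical example, not from the WS-TDD thesis).
// Step 1: write the test first; it cannot even compile until Greeter exists.
// Step 2: write the simplest production code that makes the test pass.
public class TddSketch {

    // The test, written before the implementation. Run with `java -ea TddSketch`
    // so that assertions are enabled.
    static void testGreetingIncludesName() {
        Greeter g = new Greeter();
        assert g.greet("Ana").equals("Hello, Ana") : "greeting should include the name";
    }

    // The production code, written afterwards, doing only what the test demands.
    static class Greeter {
        String greet(String name) {
            return "Hello, " + name;
        }
    }

    public static void main(String[] args) {
        testGreetingIncludesName();
        System.out.println("test passed");
    }
}
```

In real projects this loop is usually driven by a test framework such as JUnit rather than bare assertions; the structure of the cycle (failing test first, then implementation, then refactoring) is the same.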
Leal, Gislaine Camila Lapasini. "Know-cap: um método para capitalização de conhecimento no desenvolvimento de software." Universidade Tecnológica Federal do Paraná, 2015. http://repositorio.utfpr.edu.br/jspui/handle/1/1709.
Full text
The knowledge-intensive character of software production and its rising demand suggest the need to establish mechanisms to properly manage the knowledge involved, in order to meet deadline, cost and quality requirements. Knowledge capitalization is a process that spans from the identification to the evaluation of the knowledge produced and used. In software development specifically, capitalization enables easier access to knowledge, minimizes its loss, reduces the learning curve, and avoids repeated errors and rework. Thus, this thesis presents Know-Cap, a method developed to organize and guide the capitalization of knowledge in software development. Know-Cap facilitates the location, preservation, value addition and updating of knowledge, so that it can be used in the execution of new tasks. The method was proposed on the basis of a set of methodological procedures: a literature review, a systematic review and an analysis of related work. The feasibility and appropriateness of Know-Cap were analyzed through an application study, conducted in a real case, and an analytical study of software development companies. The results obtained indicate that Know-Cap supports the capitalization of knowledge in software development.
Araujo, Sandro de. "Proposição para adaptação de termos do CMMI-DEV 1.3 para aplicação em PDPS de empresas de manufatura." Universidade Tecnológica Federal do Paraná, 2013. http://repositorio.utfpr.edu.br/jspui/handle/1/810.
Full text
In an increasingly aggressive and competitive global market, many industries are seeking ways to remain competitive. The Product Development Process (PDP) plays an important role in the strategy of companies that seek a competitive advantage. However, for the PDP to become a competitive advantage, it must reach a minimum level of maturity, which represents the growth potential of its capabilities, the richness of the organization's process and the consistency with which it is applied in all its projects. There are several models for assessing the maturity of the PDP, but the Capability Maturity Model Integration (CMMI) provides an integrated solution that covers development and maintenance activities for products and services. However, it was originally created to assess information technology industries and does not cover the terms used in manufacturing companies. Thus, the aim of this work is to propose a strategy for adapting CMMI-DEV 1.3, enabling easier understanding of its goals and practices by manufacturing companies. To this end, a review is presented of CMMI-DEV 1.3, the PDP of manufacturing companies and the strategies used to adapt the terms of methods, models or tools across different specialty areas, including a detailed concept of their items in order to identify the part of the model to be adapted in this work. After this definition, the terms are correlated with similar terms found in the literature on manufacturing companies and validated through peer review. To verify the effectiveness of the term-adaptation strategy, the study conducted interviews with seven professionals from four manufacturing industries and one academic, all with three to fifteen years of experience in the PDP. Among its results, the study contributes a proposed adaptation of the CMMI-DEV 1.3, as used in IT industries, for the PDP of manufacturing companies.