
Dissertations / Theses on the topic 'Software generation'


Consult the top 50 dissertations / theses for your research on the topic 'Software generation.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Fischer, Scott Edward. "Standard form 254 generation software." Thesis, Georgia Institute of Technology, 1991. http://hdl.handle.net/1853/22386.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Liang, Dong. "Automatic generation of software applications." Doctoral thesis, Technische Universitaet Bergakademie Freiberg Universitaetsbibliothek "Georgius Agricola", 2014. http://nbn-resolving.de/urn:nbn:de:bsz:105-qucosa-149742.

Abstract:
The Model Driven Architecture (MDA) moves software development from the time-consuming and error-prone level of writing program code to the next higher level of modeling. To benefit from this technology, two requirements must be satisfied: first, the creation of compact, complete and correct platform-independent models (PIM); and second, the development of a flexible and extensible model transformation framework that accounts for frequent changes of the target platform. In this thesis a platform-based methodology is developed to create PIM by abstracting common modeling elements into a platform-independent modeling library called the Design Platform Model (DPM). The DPM contains OCL-based types for modeling primitive and collection types, a platform-independent GUI toolkit, and other common modeling elements, such as those for IO operations. Furthermore, a DPM profile containing diverse domain-specific and design-pattern-based stereotypes is also developed to create PIM with high-level semantics. The behavior in PIM is specified using an OCL-like action language called eXecutable OCL (XOCL), which is also developed in this thesis. For model transformation, the model compiler MOCCA is developed based on a flexible and extensible architecture. The model mapper components in the current version of MOCCA are able to map desktop applications onto the JSE platform, and both the business object layer and the persistence layer of three-layered enterprise applications onto the JEE and SAP ABAP platforms. The model transformation process concludes with complete code generation.
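The core idea of this abstract, mapping a platform-independent model onto platform-specific code, can be made concrete with a toy sketch. The model, type mapping, and function below are invented for illustration; this is not MOCCA or XOCL, just template expansion over a minimal PIM:

```python
# A toy platform-independent model: one class with typed fields.
pim = {"name": "Customer",
       "fields": [("id", "Integer"), ("name", "String")]}

# Mapping from model types to a hypothetical Java target platform.
JAVA_TYPES = {"Integer": "int", "String": "String"}

def to_java(model):
    """Map the PIM onto a target platform by template expansion,
    a drastically simplified stand-in for a model compiler."""
    fields = "\n".join(f"    private {JAVA_TYPES[t]} {n};"
                       for n, t in model["fields"])
    return f"public class {model['name']} {{\n{fields}\n}}\n"

print(to_java(pim))
```

Retargeting the generator then amounts to swapping the type mapping and templates, which is why a flexible transformation framework matters when the target platform changes often.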
3

Turnas, Daniel. "Next generation software process improvement." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2003. http://library.nps.navy.mil/uhtbin/hyperion-image/03Jun%5FTurnas.pdf.

Abstract:
Thesis (M.S. in Software Engineering)--Naval Postgraduate School, June 2003.
Thesis advisor(s): Mikhail Auguston, Christopher D. Miles. Includes bibliographical references (p. 59-61). Also available online.
4

Paraskevas, Zaharias. "Code generation for dataflow software pipelining." Thesis, McGill University, 1989. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=55627.

5

Holmes, Stephen Terry. "Heuristic generation of software test data." Thesis, University of South Wales, 1996. https://pure.southwales.ac.uk/en/studentthesis/heuristic-generation-of-software-test-data(aa20a88e-32a5-4958-9055-7abc11fbc541).html.

Abstract:
Incorrect system operation can, at worst, be life-threatening or financially devastating. Software testing is a destructive process that aims to reveal software faults. Selection of good test data can be extremely difficult. To ease test data selection, several test data generators have emerged that use a diverse range of approaches. Adaptive test data generators use existing test data to produce further effective test data. It has been observed that there is little empirical data on the adaptive approach. This thesis presents the Heuristically Aided Testing System (HATS), an adaptive test data generator that uses several heuristics. A heuristic embodies a test data generation technique. Four heuristics have been developed. The first heuristic, Direct Assignment, generates test data for conditions involving an input variable and a constant. The Alternating Variable heuristic determines a promising direction in which to modify input variables, then takes ever-increasing steps in that direction. The Linear Predictor heuristic performs linear extrapolations on input variables. The final heuristic, Boundary Follower, uses input domain boundaries as a guide to locate hard-to-find solutions. Several Ada procedures have been tested with HATS: a quadratic equation solver, a triangle classifier, a remainder calculator and a linear search. Collectively they present some common and some rare test data generation problems. The weakest testing criterion HATS has attempted to satisfy is all-branches. Stronger, mutation-based criteria have been used on two of the procedures. HATS has achieved complete branch coverage on each procedure, except where a higher level of control flow complexity is combined with non-linear input variables. Both branch and mutation testing criteria have enabled a better understanding of the test data generation problems and contributed to the evolution of existing heuristics and the development of new ones.
This thesis contributes the following to knowledge: Empirical data on the adaptive heuristic approach to test data generation. How input domain boundaries can be used as guidance for a heuristic. An effective heuristic termination technique based on the heuristic's progress. A comparison of HATS with random testing. Properties of the test software that indicate when HATS will take less effort than random testing are identified.
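The Alternating Variable heuristic described in this abstract (find a promising direction, then take ever-increasing steps in it) is closely related to Korel's alternating variable method. A minimal sketch, assuming integer inputs and a fitness function that reaches zero when the target branch condition is satisfied; the example condition is invented:

```python
def alternating_variable_search(fitness, x, max_iters=1000):
    """Minimise `fitness` over integer inputs by modifying one
    variable at a time with exploratory and accelerating moves."""
    x = list(x)
    best = fitness(x)
    for _ in range(max_iters):
        if best == 0:                      # target condition satisfied
            break
        improved = False
        for i in range(len(x)):
            # Exploratory moves: find a promising direction.
            for direction in (+1, -1):
                trial = list(x)
                trial[i] += direction
                if fitness(trial) < best:
                    # Pattern moves: ever-increasing steps that way.
                    step = direction
                    while fitness(trial) < best:
                        x, best = trial, fitness(trial)
                        step *= 2
                        trial = list(x)
                        trial[i] += step
                    improved = True
                    break
        if not improved:                   # local optimum reached
            break
    return x, best

# Hypothetical branch condition: reach x[0] == 17 and x[1] == -4.
f = lambda v: abs(v[0] - 17) + abs(v[1] + 4)
solution, cost = alternating_variable_search(f, [0, 0])
```

The doubling step sizes let the search cross large input ranges quickly, while the exploratory probes keep it from overshooting indefinitely.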
6

Enoiu, Eduard. "Automatic test generation for industrial control software." Doctoral thesis, Mälardalens högskola, Inbyggda system, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-33364.

Abstract:
Since the early days of software testing, automatic test generation has been suggested as a way of allowing tests to be created at a lower cost. However, industrially useful and applicable tools for automatic test generation are still scarce. As a consequence, the evidence regarding the applicability or feasibility of automatic test generation in industrial practice is limited. This is especially problematic if we consider the use of automatic test generation for industrial safety-critical control systems, such as are found in power plants, airplanes, or trains. In this thesis, we improve the current state of automatic test generation by developing a technique based on model-checking that works with IEC 61131-3 industrial control software. We show how automatic test generation for IEC 61131-3 programs, containing both functional and timing information, can be solved as a model checking problem for both code and mutation coverage criteria.  The developed technique has been implemented in the CompleteTest tool. To evaluate the potential application of our technique, we present several studies where the tool is applied to industrial control software. Results show that CompleteTest is viable for use in industrial practice; it is efficient in terms of the time required to generate tests that satisfy both code and mutation coverage and scales well for most of the industrial programs considered. However, our results also show that there are still challenges associated with the use of automatic test generation. In particular, we found that while automatically generated tests, based on code coverage, can exercise the logic of the software as well as tests written manually, and can do so in a fraction of the time, they do not show better fault detection compared to manually created tests. 
Specifically, it seems that manually created tests are able to detect more faults of certain types (i.e., logical replacement, negation insertion and timer replacement) than automatically generated tests. To tackle this issue, we propose an approach for improving fault detection by using mutation coverage as a test criterion. We implemented this approach in the CompleteTest tool and used it to evaluate automatic test generation based on mutation testing. While the resulting tests were more effective, in terms of fault detection, than automatic tests generated based on code coverage, they were still not better than manually created tests. In summary, our results highlight the need for improving the goals used by automatic test generation tools. Specifically, fault detection scores could be increased by considering some new mutation operators as well as higher-order mutations. Our thesis suggests that automatically generated test suites are significantly less costly in terms of testing time than manually created test suites. One conclusion, strongly supported by the results of this thesis, is that automatic test generation is efficient but currently not quite as effective as manual testing. This is significant progress that needs to be studied further; we need to consider the implications and the extent to which automatic test generation can be used in the development of reliable safety-critical systems.
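The mutation operators mentioned here (such as logical replacement and negation insertion) can be illustrated in miniature. The condition and mutants below are invented for illustration; this is not CompleteTest and does not handle IEC 61131-3 code:

```python
# Original condition from a hypothetical control program.
def original(a, b):
    return a and not b

# Mutants produced by two of the operators named above:
# logical replacement (and -> or) and negation insertion.
mutants = {
    "logical_replacement": lambda a, b: a or not b,
    "negation_insertion":  lambda a, b: not (a and not b),
}

def kills(test_suite, mutant):
    """A test suite kills a mutant if some input makes the mutant
    behave differently from the original program."""
    return any(original(a, b) != mutant(a, b) for a, b in test_suite)

# A suite satisfying mutation coverage kills every mutant.
suite = [(True, False), (True, True)]
score = sum(kills(suite, m) for m in mutants.values()) / len(mutants)
```

A test generator driven by mutation coverage searches for inputs that raise this mutation score, which is why it tends to detect more faults than one driven by code coverage alone.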
7

坂部, 俊樹, 正彦 酒井, 圭一朗 草刈, and 直樹 西田. "Program Generation Techniques for Developing Reliable Software." INTELLIGENT MEDIA INTEGRATION NAGOYA UNIVERSITY / COE, 2004. http://hdl.handle.net/2237/10387.

8

Jesus, Fábio Miguel Rodrigues de. "CodeGen: improving software development through code generation." Master's thesis, Instituto Politécnico de Setúbal. Escola Superior de Tecnologia de Setúbal, 2019. http://hdl.handle.net/10400.26/31332.

Abstract:
Project carried out in a corporate setting
A project submitted in fulfillment of the requirements for the degree of master’s in software engineering
Developing software applications requires time and experience that developers often lack. Additionally, development is more about the problem's domain than about the coding process itself, which makes automating the process quite challenging and engaging, unlike other successfully automated processes. To further reduce the developer's involvement in corporate development standards, such as following specific patterns or rules, CodeGen presents itself as a code-generating tool that, while limited as a prototype, is intended to build and test applications in a set of languages and patterns. To that end, exploratory research on the topics of code generation, architectural and design patterns, and programming languages is required in order to evaluate what can be done with the technology and knowledge currently available. Supported by this research, a prototype is developed as a proof of concept: a Visual Studio extension that generates web applications in .NET MVC (Model-View-Controller). Since Visual Studio cannot compile Java and the user is not restricted in the choice of development environment, the research also analyses the possibility of having more than one user interface.
9

Yang, Bo. "Software test generation based on flow models." Thesis, University of Ottawa (Canada), 1991. http://hdl.handle.net/10393/7925.

Abstract:
Software testing is one of the most widely used quality assurance methodologies. A large software system usually has a hierarchical structure: for example, system, subsystems (programs), subprograms, and procedures, where a subsystem is composed of a number of subprograms, each of which in turn is composed of a number of procedures. Testing of a system can be done at different levels with different emphases. In this thesis, we focus on testing at the two lowest levels, namely the procedure and subprogram levels. Until recently, many testing techniques used the control flow graph or its variants for selecting tests. It is known that such a flow model is only capable of capturing control flow. For data flow oriented testing, the def-use graph is used to represent both control flow and data flow. Based on this model, we propose a class of data flow oriented test selection criteria using input-output and input-predicate relations. These criteria are shown to have certain merits over existing test selection criteria. However, a def-use graph can only be used to capture control flow and data flow within a procedure. Since control flow and data flow also exist among interacting procedures in a subprogram, a more expressive model is necessary in order to perform testing at the subprogram level. Such a model is proposed by extending the def-use graph model. By using this model, the tests generated using a test selection criterion are shown to be more meaningful and, likely, more effective in detecting errors.
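The def-use graph mentioned in this abstract, and the def-use pairs a data-flow criterion is built on, can be sketched briefly. The graph below is invented for illustration and is not taken from the thesis; `du_pairs` enumerates (definition, use) pairs connected by a def-clear path:

```python
# A toy def-use graph: for each node, the variables it defines and
# uses; `edges` gives the control flow. (Invented for illustration.)
defs = {1: {"x"}, 2: {"y"}, 3: set(), 4: {"x"}, 5: set()}
uses = {1: set(), 2: {"x"}, 3: {"x", "y"}, 4: set(), 5: {"x"}}
edges = {1: [2], 2: [3, 4], 3: [5], 4: [5], 5: []}

def du_pairs(var):
    """All (def-node, use-node) pairs joined by a def-clear path:
    a DFS from each definition that stops at redefinitions of `var`."""
    pairs = set()
    for d in (n for n, vs in defs.items() if var in vs):
        stack, seen = list(edges[d]), set()
        while stack:
            n = stack.pop()
            if n in seen:
                continue
            seen.add(n)
            if var in uses[n]:
                pairs.add((d, n))
            if var not in defs[n]:       # the path stays def-clear
                stack.extend(edges[n])
    return pairs
```

A data-flow criterion such as all-uses would then require the generated tests to exercise each of these pairs, which is strictly more demanding than covering the control flow alone.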
10

Eriksson, Mattias. "Integrated Software Pipelining." Licentiate thesis, Linköping : Department of Computer and Information Science, Linköpings universitet, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-16170.

11

Cousins, Michael Anthony. "Automated structural test data generation." Thesis, University of Portsmouth, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.261234.

12

Cullum, James J. "Performance analysis of automated attack graph generation software." Thesis, Monterey, Calif. : Naval Postgraduate School, 2006. http://bosun.nps.edu/uhtbin/hyperion.exe/06Dec%5FCullum.pdf.

Abstract:
Thesis (M.S. in Computer Science)--Naval Postgraduate School, December 2006.
Thesis Advisor(s): Cynthia Irvine, Timothy Levin. "December 2006." Includes bibliographical references (p. 137-138). Also available in print.
13

Bunnin, Francis Oliver. "Automatic generation of software components for financial modelling." Thesis, Imperial College London, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.249245.

14

Medeiros de Campos, José Carlos. "Search-based unit test generation for evolving software." Thesis, University of Sheffield, 2017. http://etheses.whiterose.ac.uk/20271/.

Abstract:
Search-based software testing has been successfully applied to generate unit test cases for object-oriented software. Typically, in search-based test generation approaches, evolutionary search algorithms are guided by code coverage criteria such as branch coverage to generate tests for individual coverage objectives. Although it has been shown that this approach can be effective, there remain fundamental open questions. In particular, which criteria should test generation use in order to produce the best test suites? Which evolutionary algorithms are more effective at generating test cases with high coverage? How to scale up search-based unit test generation to software projects consisting of large numbers of components, evolving and changing frequently over time? As a result, the applicability of search-based test generation techniques in practice is still fundamentally limited. In order to answer these fundamental questions, we investigate the following improvements to search-based testing. First, we propose the simultaneous optimisation of several coverage criteria at the same time using an evolutionary algorithm, rather than optimising for individual criteria. We then perform an empirical evaluation of different evolutionary algorithms to understand the influence of each one on the test optimisation problem. We then extend a coverage-based test generation with a non-functional criterion to increase the likelihood of detecting faults as well as helping developers to identify the locations of the faults. Finally, we propose several strategies and tools to efficiently apply search-based test generation techniques in large and evolving software projects. 
Our results show that, overall, the optimisation of several coverage criteria is efficient; there is indeed an evolutionary algorithm that clearly works better for the test generation problem than others; the extended coverage-based test generation is effective at revealing and localising faults; and our proposed strategies, specifically designed to test entire software projects in a continuous way, improve efficiency and lead to higher code coverage. Consequently, the techniques and toolset presented in this thesis - which provides support for all the contributions described here - bring search-based software testing one step closer to practical usage, by equipping software engineers with the state of the art in automated test generation.
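The idea of optimising several coverage objectives simultaneously, at the level of a whole test suite, can be sketched with a tiny genetic algorithm. The coverage goals and fitness below are invented stand-ins; real tools evolve actual test cases against branch-level goals:

```python
import random

random.seed(1)

# Invented coverage goals: a test (an integer input) covers the
# goals whose predicate it satisfies.
goals = [lambda x: x < 0, lambda x: x == 0,
         lambda x: 0 < x < 100, lambda x: x >= 100]

def fitness(suite):
    """Goals covered by the whole suite: all objectives are
    optimised at once rather than one at a time."""
    return sum(any(g(t) for t in suite) for g in goals)

def evolve(pop_size=20, suite_len=4, gens=200):
    pop = [[random.randint(-10, 110) for _ in range(suite_len)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == len(goals):
            break
        # Keep the better half, mutate it to produce the other half.
        parents = pop[:pop_size // 2]
        children = []
        for p in parents:
            child = p[:]
            child[random.randrange(suite_len)] = random.randint(-10, 110)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

Scoring the suite as a whole avoids wasting search budget on objectives that are infeasible or already covered, which is one reason whole-suite optimisation outperforms targeting goals one by one.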
15

Forsyth, Charles Harkness. "More adaptable code generation." Thesis, University of York, 1990. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.280449.

16

Taylor, Brian J. "Regressive model approach to the generation of test trajectories." Morgantown, W. Va. : [West Virginia University Libraries], 1999. http://etd.wvu.edu/templates/showETD.cfm?recnum=1077.

Abstract:
Thesis (M.S.)--West Virginia University, 1999.
Title from document title page. Document formatted into pages; contains xi, 125 p. : ill. (some col.). Includes abstract. Includes bibliographical references (p. 108-111).
17

Tip, Frank. "Generation of program analysis tools." Amsterdam : Amsterdam : Institute for Logic, Language and Computation ; Universiteit van Amsterdam [Host], 1995. http://dare.uva.nl/document/33639.

18

To, Chi-cheung Solomon. "Marketing of fourth generation software products in Hong Kong /." [Hong Kong : University of Hong Kong], 1987. http://sunzi.lib.hku.hk/hkuto/record.jsp?B12335745.

19

Günterberg, Herbert. "Case study of rapid software prototyping and automated software generation: an Inertial Navigation System." Thesis, Monterey, California. Naval Postgraduate School, 1989. http://hdl.handle.net/10945/26909.

20

Imanian, James A. "Automated test case generation for reactive software systems based on environment models." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2005. http://library.nps.navy.mil/uhtbin/hyperion/05Jun%5FImanian.pdf.

Abstract:
Thesis (M.S. in Computer Science)--Naval Postgraduate School, June 2005.
Thesis Advisor(s): Mikhail Auguston, James B. Michael. Includes bibliographical references (p. 55-56). Also available online.
21

Wigent, Mark A., and Andrea M. Mazzario. "TENA Software Decommutation System." International Foundation for Telemetering, 2012. http://hdl.handle.net/10150/581846.

Abstract:
The Test and Training Enabling Architecture (TENA) is implemented within the TENA Software Decommutation System (TSDS) in order to bring TENA as close as possible to the sensor interface. Key attributes of TSDS include:
• TSDS is a software-based approach to telemetry stream decommutation implemented in Java. This offers technical advantages such as platform independence and portability.
• TSDS uses auto code generation technologies to further reduce the effort associated with updating decommutation systems to support new telemetry stream definitions. Users of TSDS within the range are not required to have detailed knowledge of proprietary protocols, nor are they required to understand how to implement decommutation in software. The use of code generation in software decommutation offers potential cost savings throughout the entire T&E community.
• TSDS offers a native TENA interface so that telemetry data can be published directly into TENA object models.
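The role of code generation in decommutation can be sketched as follows: a decommutation function is generated from a declarative stream definition rather than written by hand. The frame definition and field names below are invented; TSDS itself is Java-based and works from real telemetry stream definitions:

```python
import struct

# A hypothetical telemetry frame definition: (field name, format).
FRAME_DEF = [("altitude", ">H"), ("velocity", ">h"), ("status", ">B")]

def make_decom(frame_def):
    """Generate a decommutation function from the stream definition,
    loosely mirroring the code-generation idea described above."""
    fmt = ">" + "".join(f.lstrip(">") for _, f in frame_def)
    names = [n for n, _ in frame_def]
    def decom(frame: bytes) -> dict:
        return dict(zip(names, struct.unpack(fmt, frame)))
    return decom

decom = make_decom(FRAME_DEF)
sample = struct.pack(">HhB", 1200, -15, 3)
```

When the stream definition changes, only the declarative table is edited and the decommutation code is regenerated, which is where the claimed maintenance savings come from.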
22

Franz, Michael Steffen Oliver. "Code-generation on-the-fly: a key to portable software." Zürich, 1994. http://e-collection.ethbib.ethz.ch/show?type=diss&nr=10497.

23

Misra, Sudip. "A software test plan generation approach for pedagogical purposes." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape3/PQDD_0029/MQ65510.pdf.

24

To, Chi-cheung Solomon, and 杜志祥. "Marketing of fourth generation software products in Hong Kong." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1987. http://hub.hku.hk/bib/B31263902.

25

Meudec, Christophe. "Automatic generation of software test cases from formal specifications." Thesis, Queen's University Belfast, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.263505.

26

Moyer, Daniel Raymond. "Software development resource estimation in the 4th generation environment." Thesis, Kansas State University, 1986. http://hdl.handle.net/2097/9956.

27

Arcuri, Andrea. "Automatic software generation and improvement through search based techniques." Thesis, University of Birmingham, 2009. http://etheses.bham.ac.uk//id/eprint/400/.

Abstract:
Writing software is a difficult and expensive task, so its automation is very valuable. Search algorithms have been successfully used to tackle many software engineering problems. Unfortunately, for some problems the traditional techniques have been of only limited scope, and search algorithms have not yet been used. We hence propose a novel framework based on the co-evolution of programs and test cases to tackle these difficult problems. This framework can be used for software engineering tasks such as Automatic Refinement, Fault Correction and Improving Non-functional Criteria. These tasks are very difficult, and their automation in the literature has been limited. To better understand how search algorithms work, a theoretical foundation is needed; it would give better insight into search-based software engineering. We provide first theoretical analyses for search-based software testing, which is one of the main components of our co-evolutionary framework. This thesis makes the important contribution of presenting a novel framework and studying its application to three difficult software engineering problems; it also contributes a first theoretical foundation.
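The co-evolution of programs and test cases can be sketched in miniature: candidate programs are scored by the tests they pass, while tests are scored by the programs they fail, so each population pushes the other. Everything below (the specification, the one-parameter program space) is an invented toy, not the framework from the thesis:

```python
import random

random.seed(0)

# The specification: the program should double its input.
spec = lambda x: 2 * x

# A candidate "program" is just a coefficient k implementing x -> k*x.
def passes(k, x):
    return k * x == spec(x)

programs = [random.randint(-5, 5) for _ in range(10)]
tests = [random.randint(-10, 10) for _ in range(10)]

for _ in range(50):
    # Programs are ranked by how many current tests they pass...
    programs.sort(key=lambda k: sum(passes(k, x) for x in tests),
                  reverse=True)
    # ...tests by how many current programs they fail.
    tests.sort(key=lambda x: sum(not passes(k, x) for k in programs),
               reverse=True)
    # Keep the better half of each population and mutate it.
    programs = programs[:5] + [k + random.choice([-1, 1])
                               for k in programs[:5]]
    tests = tests[:5] + [x + random.choice([-1, 1]) for x in tests[:5]]

best = programs[0]
```

The arms race matters: as programs improve, easy tests lose their discriminating power and harder ones are favoured, which is the competitive pressure the co-evolutionary framework relies on.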
28

Tkachuk, Oksana. "Domain-specific environment generation for modular software model checking." Diss., Manhattan, Kan. : Kansas State University, 2008. http://hdl.handle.net/2097/1122.

29

Kot, Andriy. "Effective Large Scale Computing Software for Parallel Mesh Generation." W&M ScholarWorks, 2011. https://scholarworks.wm.edu/etd/1539623585.

Abstract:
Scientists commonly turn to supercomputers or clusters of workstations with hundreds (even thousands) of nodes to generate meshes for large-scale simulations. Parallel mesh generation software is then used to decompose the original mesh generation problem into smaller sub-problems that can be solved (meshed) in parallel. The size of the final mesh is limited by the amount of aggregate memory of the parallel machine. Also, requesting many compute nodes on a shared computing resource may result in a long wait, far surpassing the time it takes to solve the problem.

These two problems (i.e., insufficient memory when computing on a small number of nodes, and long waiting times when using many nodes from a shared computing resource) can be addressed by using out-of-core algorithms. These are algorithms that keep most of the dataset out of core (i.e., outside of memory, on disk) and load only a portion in core (i.e., into memory) at a time.

We explored two approaches to out-of-core computing. First, we presented a traditional approach, which is to modify the existing in-core algorithms to enable out-of-core computing. While we achieved good performance with this approach, the task is complex and labor intensive. As an alternative approach, we presented a runtime system designed to support out-of-core applications. It requires little modification of the existing in-core application code and still produces acceptable results. Evaluation of the runtime system showed little performance degradation while simplifying and shortening the development cycle of out-of-core applications. The overhead from using the runtime system for small problem sizes is between 12% and 41%, while the overlap of computation, communication and disk I/O is above 50% and as high as 61% for large problems.

The main contribution of our work is the ability to utilize computing resources more effectively.
The user has a choice of either solving larger problems, that otherwise would not be possible, or solving problems of the same size but using fewer computing nodes, thus reducing the waiting time on shared clusters and supercomputers. We demonstrated that the latter could potentially lead to substantially shorter wall-clock time.
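The out-of-core idea described above (keep the dataset on disk, hold only one chunk in memory at a time) can be sketched with a simple streaming computation. The file layout and chunk size are invented for illustration:

```python
import os
import struct
import tempfile

CHUNK = 1024  # values resident in core at a time (the memory budget)

def write_dataset(path, n):
    """Create a binary dataset of n doubles on disk."""
    with open(path, "wb") as f:
        for i in range(n):
            f.write(struct.pack("d", float(i)))

def out_of_core_sum(path):
    """Scan the dataset chunk by chunk; at most CHUNK values are
    ever held in memory, regardless of the dataset size."""
    total = 0.0
    with open(path, "rb") as f:
        while True:
            buf = f.read(CHUNK * 8)          # 8 bytes per double
            if not buf:
                break
            total += sum(v for (v,) in struct.iter_unpack("d", buf))
    return total

path = os.path.join(tempfile.mkdtemp(), "data.bin")
write_dataset(path, 5000)                     # ~40 KB, read 8 KB at a time
result = out_of_core_sum(path)
```

A runtime system like the one the thesis proposes additionally overlaps the disk reads with computation and communication, which is where the reported 50-61% overlap comes from.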
30

Lawrence, Peter James. "Mesh generation by domain bisection." Thesis, University of Greenwich, 1994. http://gala.gre.ac.uk/6220/.

Abstract:
The research reported in this dissertation was undertaken to investigate efficient computational methods of automatically generating three dimensional unstructured tetrahedral meshes. The work on two dimensional triangular unstructured grid generation by Lewis and Robinson [LeR76] is first examined, in which a recursive bisection technique of computational order n log(n) was implemented. This technique is then extended to incorporate new methods of geometry input and the automatic handling of multi-connected regions. The method of two dimensional recursive mesh bisection is then further modified to incorporate an improved strategy for the selection of bisections. This enables an automatic nodal placement technique to be implemented in conjunction with the grid generator. The dissertation then investigates methods of generating triangular grids over parametric surfaces. This includes a new definition of surface Delaunay triangulation with the extension of grid improvement techniques to surfaces. Based on the assumption that all surface grids of objects form polyhedral domains, a three dimensional mesh generation technique is derived. This technique is a hybrid of recursive domain bisection coupled with a min-max heuristic triangulation algorithm. This is done to achieve a computationally efficient and reliable algorithm with a fast nodal placement technique. The algorithm generates three dimensional unstructured tetrahedral grids over polyhedral domains with multi-connected regions in an average computational order of less than n log(n).
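The flavour of recursive domain bisection can be conveyed with a deliberately simplified 2D example: repeatedly bisecting a convex polygon's vertex list until only triangles remain. A real generator chooses bisections geometrically and handles non-convex, multi-connected domains; this sketch only shows the divide-and-conquer shape that yields the n log(n) behaviour:

```python
def bisect_triangulate(poly):
    """Recursively bisect a convex polygon, given as a list of
    vertex labels in boundary order, until only triangles remain."""
    if len(poly) == 3:
        return [tuple(poly)]
    mid = len(poly) // 2
    # Split along the chord from poly[0] to poly[mid]; both halves
    # keep the chord's endpoints.
    left = poly[:mid + 1]
    right = poly[mid:] + poly[:1]
    return bisect_triangulate(left) + bisect_triangulate(right)

# A convex polygon with n vertices yields n - 2 triangles.
hexagon = list(range(6))
triangles = bisect_triangulate(hexagon)
```

Because each bisection roughly halves the sub-domain, the recursion depth is logarithmic in the number of vertices, which is the source of the n log(n) cost noted in the abstract.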
31

Pawar, Sourabh A. "A Common Software Development Framework For Coordinating Usability Engineering and Software Engineering Activities." Thesis, Virginia Tech, 2004. http://hdl.handle.net/10919/33023.

Abstract:
Currently, the Usability Engineering (UE) and Software Engineering (SE) processes are practiced as being independent of each other. However, several dependencies and constraints exist between the interface specifications and the functional core, which make coordination between the UE and the SE teams crucial. Failure of coordination between the UE and SE teams leads to software that often lacks necessary functionality and impedes user performance. At the same time, the UE and SE processes cannot be integrated because of the differences in focus, techniques, and terminology. We therefore propose a development framework that incorporates SE and UE efforts to guide current software development.

The framework characterizes the information exchange that must exist between the UE and SE teams during software development to form the basis of the coordinated development framework. The UE Scenario-Based Design (SBD) process provides the basis for identifying UE activities. Similarly, the Requirements Generation Model (RGM), and Structured Analysis and Design are used to identify SE activities. We identify UE and SE activities that can influence each other, and identify the high-level exchange of information that must exist among these activities. We further examine these interactions to gain a more in-depth understanding as to the precise exchange of information that must exist among them.

The identification of interacting activities forms the basis of a coordinated development framework that incorporates and synchronizes the UE and SE processes. An examination of the Incremental and Spiral models as they relate to the SBD is provided, and outlines how our integration framework can be composed. Using the results of and insights gained from our research, we also suggest additional avenues for future work.

32

Noik, Emanuel Gerald. "Automating the generation of interactive applications." Thesis, University of British Columbia, 1990. http://hdl.handle.net/2429/29690.

Abstract:
As user interfaces become more powerful and easier to use they are often harder to design and implement. This has caused a great demand for interface tools. While existing tools ease interface creation, they typically do not provide mechanisms to simplify application development and are too low-level. Furthermore, existing tools do not provide effective mechanisms to port interactive applications across user interfaces. While some tools provide limited mechanisms to port applications across user interfaces which belong to the same class (e.g., the class of all standard graphical direct-manipulation user interfaces), very few can provide the ability to port applications across different interface classes (e.g., command-line, hypermedia, speech recognition and voice synthesis, virtual reality, etc.). With my approach, the programmer uses an abstract model to describe the structure of the application including the information that the application must exchange with the user, rather than describing a user interface which realizes these characteristics. By specifying application semantics at a very high level of abstraction it is possible to obtain a much greater separation between the application and the user interface. Consequently, the resulting applications can be ported not only across user interfaces which belong to a common interface class, but across interfaces which belong to distinct classes. This can be realized through simple recompilation - source code does not have to be modified. NAAG (Not Another Application Generator), a tool which embodies these ideas, enables programmers to create interactive applications with minimal effort. An application is modelled as a set of operations which manipulate objects belonging to user-defined object classes. The input to NAAG is a source program which describes classes, operations and their inputs and outputs, and the organization of operations within the application. 
Classes and operations are implemented as data structures and functions in a conventional programming language such as C. This model simplifies not only the specification and generation of the user interface, but also the design and implementation of the underlying application. NAAG utilizes existing technology such as macro-preprocessors, compilers, make programs, and low-level interface tools, to reduce the programming task. An application that is modified by adding, removing, or reorganizing artifacts (classes, operations, and menus) can be regenerated with a single command. Software maintenance has traditionally been a very difficult task as well; due to the use of a simple abstract model, NAAG applications are also easier to maintain. Furthermore, this approach encourages software reuse: applications consisting of arbitrary collections of original and pre-existing artifacts can be composed easily; functions which implement abstract operations are independent of both user interface aspects and the context in which they are employed. Application development is further simplified in the following ways: the programmer describes the semantics of the user interface - a conventional explicit specification is not required; output primitives are defined in an interface-independent manner; many programming tasks such as resource management, event processing, and communication are either handled directly by the tool or else simplified greatly for the programmer. NAAG is currently used by the members of the Laboratory for Computational Vision at the University of British Columbia to maintain a sophisticated image processing system.
Science, Faculty of
Computer Science, Department of
Graduate
APA, Harvard, Vancouver, ISO, and other styles
33

Aldanmaz, Senol Lokman. "Environment Generation Tool For Enabling Aspect Verification." Master's thesis, METU, 2010. http://etd.lib.metu.edu.tr/upload/12612078/index.pdf.

Full text
Abstract:
Aspects are the units of aspect-oriented programming, developed to influence software behavior. Before an aspect can be used confidently in any software, it should first be verified. Verifying an aspect requires preparing mock classes for the original software. These mock classes model the environment into which the aspect is woven. In this study, considering that there are not enough tools supporting aspect-oriented programming developers, we have developed a tool that enables aspect verification and unit testing. The tool enables verification by generating the general environment of the aspect, allowing users to focus on verifying aspects in isolation from the software into which they are woven.
APA, Harvard, Vancouver, ISO, and other styles
34

Su, Mehmet Onur. "Business Process Modelling Based Computer-aided Software Functional Requirements Generation." Master's thesis, METU, 2004. http://etd.lib.metu.edu.tr/upload/3/12604698/index.pdf.

Full text
Abstract:
Requirements problems identified in the earlier phases of a software development project can deeply affect the success of the project, so studies that aim to reduce these problems are crucial. Automation is foreseen as one possible solution for reducing or removing some of the problems originating from requirements. This study focuses on the development and implementation of an automated tool that generates requirements in natural language from business process models. The benefits of the tool are discussed, and the tool is compared with other software requirements-related tools with respect to functionality. The developed tool has been tested within a large military project, and the results of using the tool are presented.
APA, Harvard, Vancouver, ISO, and other styles
35

Bureau, Cédric. "Developing a harmonic power flow software in distributed generation systems." Thesis, KTH, Elektriska energisystem, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-103181.

Full text
Abstract:
The main topic of this thesis is harmonic power flow and its use in a simulation software that I have developed. The idea of the software is to combine distribution grid descriptions, non-linear load models and power flow methods. Nowadays, power electronics is more and more present in electric devices in distributed generation systems. Those power electronics systems can emit or absorb harmonics that can damage the devices in the grid. Thus, it is important to be able to estimate harmonic behaviour in the grid in order to prevent the possible problems that could occur. The main contribution of this internship is the precise expression of the needs and goals of the software, and an implementation of its structure. This thesis explains how the grid's components and non-linear devices are modelled in the software in order to represent the distribution system. It also studies the possible inputs of the software and creates a symbolic representation of the grid that is helpful for load flow calculation. Then, the different load flow and harmonic load flow algorithms presented in the literature are analysed and compared in order to determine the methods that should be implemented in the future software. Two of the implemented fundamental load flows are tested with a single-phase system, which also validates the input reading and the construction of the grid representation. The software developed here is a first implementation of a more global software that will require further studies. Indeed, the development stage will be done by external contractors or computer science specialists, who will focus on parallelization of algorithms and software optimization, in order to make the software as efficient and fast as possible.
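The fundamental (non-harmonic) load flow mentioned in the abstract can be illustrated with a textbook Gauss-Seidel iteration on the bus admittance matrix. This is a generic sketch under standard assumptions (bus 0 is the slack bus, all others are PQ buses), not the software developed in the thesis; the two-bus example data below are invented:

```python
import numpy as np

def gauss_seidel_pf(Y, S, slack_V, tol=1e-8, max_iter=500):
    """Fundamental-frequency load flow by Gauss-Seidel iteration.

    Y: complex bus admittance matrix (n x n)
    S: complex power injections per bus (loads are negative)
    slack_V: fixed complex voltage at the slack bus (index 0)
    Returns the complex bus voltage vector.
    """
    n = Y.shape[0]
    V = np.ones(n, dtype=complex)  # flat-start initial guess
    V[0] = slack_V
    for _ in range(max_iter):
        max_dv = 0.0
        for i in range(1, n):
            # Injected current from the scheduled power: S_i* / V_i*
            I = np.conj(S[i] / V[i])
            # Contribution of all other buses: sum_{k != i} Y_ik V_k
            sigma = Y[i] @ V - Y[i, i] * V[i]
            V_new = (I - sigma) / Y[i, i]
            max_dv = max(max_dv, abs(V_new - V[i]))
            V[i] = V_new
        if max_dv < tol:
            break
    return V

# Invented two-bus example: slack bus feeding a 0.5 + j0.2 pu load
# over a line of impedance 0.01 + j0.05 pu.
y = 1 / (0.01 + 0.05j)
Y = np.array([[y, -y], [-y, y]])
S = np.array([0, -(0.5 + 0.2j)])
V = gauss_seidel_pf(Y, S, 1.0 + 0j)
```

At convergence, the complex power computed from the solved voltages matches the scheduled injection at every PQ bus, which is the usual self-check for a load flow result.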
APA, Harvard, Vancouver, ISO, and other styles
36

Williams, Alan Webber. "Software component interaction testing: Coverage measurement and generation of configurations." Thesis, University of Ottawa (Canada), 2002. http://hdl.handle.net/10393/6387.

Full text
Abstract:
Systems constructed from components, including distributed systems, consist of a number of elements that interact with each other. When a system is integrated, there may be undesired interactions among those components that cause system failures. There are two complementary problems in testing a software system. The first problem is to create a test suite, given a description of the expected behaviour of a system configuration. The second problem is to deal with a large number of distinct test configurations. We investigate the second problem in this thesis: the situation when there are various system parameters, each of which can take on a value from a discrete set. The trade-off that the system tester faces is the thoroughness of test configuration coverage, versus availability of limited resources. We introduce a coverage measure that can provide a basis for determining a set of configurations with "sufficient" coverage, or for evaluation of a set of test configurations that already exists. This thesis addresses the problem of testing interactions among components of a software system: the "interaction test coverage" problem. We formally define this problem, and give it a set-theoretic framework. This is done through the introduction of an "interaction element," which becomes the unit of test coverage. The problem is compared to, and distinguished from, the minimum set cover problem and the {0,1} integer programming problem. As a result, the status of the NP-completeness of this problem remains open. Methods from statistical experimental design are introduced, and applied to the problem of generating a set of configurations that achieve coverage of all pair-wise combinations of parameter values. We present a fast, deterministic algorithm to generate such a set of test configurations. The method is compared with other methods, and shown to produce fewer configurations in most situations. 
The number of configurations generated is logarithmic in the number of parameters, and polynomial in the number of values per parameter. As a result, the number of configurations is usually feasible in practice, and is a significant reduction from the number of possible configurations.
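The pairwise coverage criterion described in this abstract can be illustrated with a small greedy covering-array construction. This is a hypothetical sketch of the criterion only, not the fast deterministic algorithm from the thesis, and the parameter names are invented for the example:

```python
from itertools import combinations, product

def pairwise_configs(params):
    """Greedily build a small set of configurations covering every
    pairwise combination of parameter values (minimization of count
    is heuristic, not guaranteed)."""
    names = sorted(params)
    # Every pair of (parameter, value) assignments that must co-occur
    # in at least one test configuration.
    uncovered = {((a, va), (b, vb))
                 for a, b in combinations(names, 2)
                 for va, vb in product(params[a], params[b])}
    configs = []
    while uncovered:
        # Exhaustively score every candidate configuration; fine for
        # tiny examples, but the candidate space grows exponentially
        # with the number of parameters.
        best, best_gain = None, -1
        for values in product(*(params[n] for n in names)):
            cand = dict(zip(names, values))
            gain = sum(all(cand[n] == v for n, v in pair)
                       for pair in uncovered)
            if gain > best_gain:
                best, best_gain = cand, gain
        configs.append(best)
        uncovered = {pair for pair in uncovered
                     if not all(best[n] == v for n, v in pair)}
    return configs

params = {"os": ["linux", "mac"], "db": ["pg", "my"], "net": ["tcp", "udp"]}
configs = pairwise_configs(params)
```

For three binary parameters, exhaustive testing needs 8 configurations while pairwise coverage needs only 4, which illustrates the logarithmic/polynomial growth the abstract describes.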
APA, Harvard, Vancouver, ISO, and other styles
37

Silva, Carlos Eduardo da. "Dynamic generation of adaptation plans for self-adaptive software systems." Thesis, University of Kent, 2011. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.544042.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Grammel, Birgit. "Automatic Generation of Trace Links in Model-driven Software Development." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2014. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-155839.

Full text
Abstract:
Traceability data provides the knowledge on dependencies and logical relations existing amongst artefacts that are created during software development. In reasoning over traceability data, conclusions can be drawn to increase the quality of software. The paradigm of Model-driven Software Engineering (MDSD) promotes the generation of software out of models. The latter are specified through different modelling languages. In subsequent model transformations, these models are used to generate programming code automatically. Traceability data of the artefacts involved in a MDSD process can be used to increase software quality by providing the necessary knowledge as described above. Existing traceability solutions in MDSD are based on the integral model mapping of transformation execution to generate traceability data. Yet, these solutions still entail a wide range of open challenges. One challenge is that the collected traceability data does not adhere to a unified formal definition, which leads to poorly integrated traceability data and aggravates reasoning over it. Furthermore, these traceability solutions all depend on the existence of a transformation engine. However, a transformation engine cannot be accessed in all MDSD settings, for instance with proprietary transformation engines or manually implemented transformations. In these cases it is not possible to instrument the transformation engine for the sake of generating traceability data, resulting in a lack of traceability data. In this work, we address these shortcomings. In doing so, we propose a generic traceability framework for augmenting arbitrary transformation approaches with a traceability mechanism. To integrate traceability data from different transformation approaches, our approach features a methodology for augmentation possibilities based on a design pattern.
The design pattern supplies the engineer with recommendations for designing the traceability mechanism and for modelling traceability data. Additionally, to provide a traceability mechanism for inaccessible transformation engines, we leverage parallel model matching to generate traceability data for arbitrary source and target models. This approach is based on a language-agnostic concept of three similarity measures for matching. To realise the similarity measures, we exploit metamodel matching techniques for graph-based model matching. Finally, we evaluate our approach according to a set of transformations from an SAP business application and the domain of MDSD.
APA, Harvard, Vancouver, ISO, and other styles
39

Sthamer, Harmen-Hinrich. "The automatic generation of software test data using genetic algorithms." Thesis, University of South Wales, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.320726.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Lepird, John R. "Multi-objective optimization of next-generation aircraft collision avoidance software." Thesis, Massachusetts Institute of Technology, 2015. http://hdl.handle.net/1721.1/98566.

Full text
Abstract:
Thesis: S.M., Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2015.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
"June 2015." Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 85-90).
Developed in the 1970s and 1980s, the Traffic Alert and Collision Avoidance System (TCAS) is the last safety net to prevent an aircraft mid-air collision. Although TCAS has been historically very effective, TCAS logic must adapt to meet the new challenges of our increasingly busy modern airspace. Numerous studies have shown that formulating collision avoidance as a partially-observable Markov decision process (POMDP) can dramatically increase system performance. However, the POMDP formulation relies on a number of design parameters, and modifying these parameters can dramatically alter system behavior. Prior work tunes these design parameters with respect to a single performance metric. This thesis extends existing work to handle more than one performance metric. We introduce an algorithm for preference elicitation that allows the designer to meaningfully define a utility function. We also discuss and implement a genetic algorithm that can perform multi-objective optimization directly. By appropriately applying these two methods, we show that we are able to tune the POMDP design parameters more effectively than existing work.
by John R. Lepird.
S.M.
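The multi-objective tuning described in the abstract rests on the notion of Pareto dominance, which multi-objective genetic algorithms use to rank candidate parameter settings. A minimal sketch of that selection step follows; it is a generic illustration, not the thesis implementation, and the example objective values are invented:

```python
def dominates(q, p):
    """True if q is at least as good as p on every objective
    (minimization) and strictly better on at least one."""
    return (all(qi <= pi for qi, pi in zip(q, p))
            and any(qi < pi for qi, pi in zip(q, p)))

def pareto_front(points):
    """Keep only non-dominated points: those for which no other point
    improves one objective without worsening another."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Invented example: (collision risk, alert rate) pairs for five
# candidate parameter settings. (3, 3) and (4, 4) are dominated
# by (2, 2) and drop out of the front.
candidates = [(1, 5), (2, 2), (5, 1), (3, 3), (4, 4)]
front = pareto_front(candidates)
```

A multi-objective optimizer returns such a front rather than a single best point, leaving the designer to pick a trade-off, which is where the preference-elicitation step mentioned in the abstract comes in.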
APA, Harvard, Vancouver, ISO, and other styles
41

Corre, Youenn. "Automated generation of heterogeneous multiprocessor architectures : software and hardware aspects." Lorient, 2013. https://hal.archives-ouvertes.fr/tel-01130482.

Full text
Abstract:
The evolution of embedded systems has led to the emergence of H-MPSoCs, which provide a way to respect the cost and performance constraints inherent to embedded systems. However, they also make designing and programming such systems a long and arduous process. It is thus necessary to develop tools that free designers from architectural and programming details, so that they can focus on the tasks where they bring added value. The objective is thus to automate the tedious tasks of H-MPSoC design, in particular on FPGA, by providing a higher level of abstraction, following a method that brings together HLS and hardware/software co-design beyond the existing solutions, which are either incomplete or unfit. The presented work introduces a design framework relying on the automation of tedious tasks and allowing designers to apply their expertise where they want to. For this, we rely on an architecture model defined with a high-level formalism independent from implementation details, providing a solution to the lack of multiprocessor architectures in FPGAs. This specification model also allows designers to provide design constraints at different levels of detail, in accordance with their knowledge of the system or their level of involvement. The design space exploration is implemented as a scalable algorithm relying on fast and accurate estimation techniques. A method for the exploration of hardware accelerators, using HLS to provide fast cost estimations, is introduced. The use of MDE methods enables portability and reuse by generating the final design implementation. The framework is validated through two case studies: an MJPEG video decoder and a face detection application.
APA, Harvard, Vancouver, ISO, and other styles
42

Hrischuk, Curtis E. "The automatic generation of software performance models from a prototype." Carleton University, Department of Systems and Computer Engineering, Ottawa, 1994.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
43

Lübke, Daniel. "An integrated approach for generation in SOA projects." Hamburg Kovač, 2007. http://d-nb.info/987919547/04.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Tseng, Kuo-Jung. "Chinese character generation : a stroke oriented method." Thesis, University of Kent, 1996. https://kar.kent.ac.uk/21368/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Arief, Leonardus Budiman. "A framework for supporting automatic simulation generation from design." Thesis, University of Newcastle Upon Tyne, 2001. http://hdl.handle.net/10443/1816.

Full text
Abstract:
Building a new software system requires careful planning and investigation in order to avoid any problems in the later stages of the development. By using a universally accepted design notation such as the Unified Modeling Language (UML), ambiguities in the system specification can be eliminated or minimised. The aspect that frequently needs to be investigated before the implementation stage can be commenced concerns the proposed system’s performance. It is necessary to predict whether a particular design will meet the performance requirement - i.e. is it worth implementing the system - or not. One way to obtain this performance prediction is by using simulation programs to mimic the execution of the system. Unfortunately, it is often difficult to transform the design into a simulation program without some sound knowledge of simulation techniques. In addition, new simulation programs need to be built each time for different systems - which can be tedious, time consuming and error prone. The currently available UML tools do not provide any facilities for generating simulation programs automatically from UML specifications. This shortcoming is the main motivation for this research. The work involved here includes an investigation of which UML design notations can be used; the available simulation languages or environments for running the simulation; and more importantly, a framework that can capture the simulation information from UML design notation. Using this framework, we have built tools that enable an automatic transformation of a UML design notation into a simulation program. Two tools (parsers) that can perform such a transformation have been constructed. We provide case studies to demonstrate the applicability of these tools and the usefulness of our simulation framework in general.
APA, Harvard, Vancouver, ISO, and other styles
46

Pell, Barney Darryl. "Strategy generation and evaluation for meta-game playing." Thesis, University of Cambridge, 1993. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.308363.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Singh, Neeta S. "An automatic code generation tool for partitioned software in distributed computing." [Tampa, Fla.] : University of South Florida, 2005. http://purl.fcla.edu/fcla/etd/SFE0001129.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Gil, Sepúlveda Victor Alejandro. "Algorithmic and Technical Improvements for Next Generation Drug Design Software Tools." Doctoral thesis, Universitat de Barcelona, 2016. http://hdl.handle.net/10803/396648.

Full text
Abstract:
The pharmaceutical industry is actively looking for new ways of boosting the efficiency and effectiveness of their R&D programmes. The extensive use of computational modeling tools in the drug discovery pipeline (DDP) is having a positive impact on research performance, since in silico experiments are usually faster and cheaper than their real counterparts. The lead identification step is a very sensitive point in the DDP. In this context, virtual high-throughput screening techniques (VHTS) work as a filtering mechanism that benefits the following stages by reducing the number of compounds to be tested experimentally. Unfortunately, the simplifications applied in VHTS docking software make it prone to generating false positives and negatives. These errors spread across the rest of the DDP stages and have a negative impact in terms of financial and time costs. In the Electronic and Atomic Protein Modelling group (Barcelona Supercomputing Center, Life Sciences department), we have developed the Protein Energy Landscape Exploration (PELE) software. PELE has been demonstrated to be a good alternative for exploring the conformational space of proteins and performing ligand-protein docking simulations. In this thesis we discuss how to turn PELE into a faster and more efficient tool by improving its technical and algorithmic features, so that it can eventually be used in VHTS protocols. In addition, we have addressed the difficulties of analyzing the extensive data associated with massive simulation production. First, we have rewritten the software using C++ and modern software engineering techniques. As a consequence, our code base is now well organized and tested. PELE has become a piece of software which is easier to modify, understand, and extend. It is also more robust and reliable. Rewriting the code has helped us to overcome some of its previous technical limitations, such as the restrictions on the size of the systems. It has also allowed us to extend PELE with new solvent models, force fields, and types of biomolecules. Moreover, the rewriting has made it possible to adapt the code to take advantage of new parallel architectures and accelerators, obtaining promising speedup results. Second, we have improved the way PELE handles protein flexibility by implementing an internal coordinate Normal Mode Analysis (icNMA) method. This method is able to produce more energetically favorable perturbations than the current Anisotropic Network Model (ANM) based strategy. This has allowed us to eliminate the unneeded relaxation phase of PELE. As a consequence, the overall computational performance of the sampling is significantly improved (~5-7x). The new internal coordinates-based methodology captures the flexibility of the backbone better than the old method and is in closer agreement with molecular dynamics than the ANM-based method.
APA, Harvard, Vancouver, ISO, and other styles
49

Tan, Kian Moh Terence. "Tactical plan generation software for maritime interdiction using conceptual blending theory." Thesis, Monterey, Calif. : Naval Postgraduate School, 2007. http://bosun.nps.edu/uhtbin/hyperion-image.exe/07Dec%5FTan%5FKian.pdf.

Full text
Abstract:
Thesis (M.S. in Modeling, Virtual Environments, and Simulation (MOVES))--Naval Postgraduate School, December 2007.
Thesis Advisor(s): Hiles, John. "December 2007." Description based on title screen as viewed on January 18, 2008. Includes bibliographical references (p. 87-91). Also available in print.
APA, Harvard, Vancouver, ISO, and other styles
50

Haritos, Georgios. "Automatic/semi-automatic generation of software to control flexible manufacturing cells." Thesis, Staffordshire University, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.308911.

Full text
APA, Harvard, Vancouver, ISO, and other styles
