Theses / dissertations on the topic "Qa76.575"

Follow this link to see other types of publications on the topic: Qa76.575.

Create an accurate reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

See the 17 best theses / dissertations for research on the topic "Qa76.575".

Next to every source in the list of references there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scientific publication in .pdf format and read its abstract online, whenever it is available in the metadata.

Browse theses / dissertations from a wide variety of scientific fields and compile a correct bibliography.

1

Triastuti, Sugiyarto Endang. "Analysing rounding data using radial basis function neural networks model". Thesis, University of Northampton, 2007. http://nectar.northampton.ac.uk/2809/.

Full text of the source
Abstract:
Unspecified counting practices used in data collection may create rounding to certain "base" numbers, which can have serious consequences for data quality. Statistical methods for analysing missing data are commonly used to deal with the issue, but they can actually aggravate the problem. Rounded data are not missing data; instead, some observations have simply been systematically lumped onto certain base numbers, reflecting the rounding process or counting behaviour. A new method for analysing rounded data would therefore be academically valuable. The neural network model developed in this study fills the gap and serves the purpose by complementing and enhancing the conventional statistical methods. The model detects, analyses, and quantifies the existence of periodic structures in a data set caused by rounding. The robustness of the model is examined using simulated data sets containing specific rounding numbers at different levels. The model is also subjected to theoretical and numerical tests to confirm its validity before being used in real applications. Overall, the model performs very well, making it suitable for many applications. The assessment results show the importance of using the right best fit in rounding detection. The detection power and cut-off point estimation also depend on the data distribution and the rounding base numbers. Detecting rounding to prime numbers is easier than to non-prime numbers because of the unique characteristics of the former: the bigger the number, the easier the detection. This is in complete contrast with non-prime numbers, where the bigger the number, the more "factor" numbers there are to distract rounding detection. Using a uniform best fit on uniform data produces the best result and the lowest cut-off point. However, using the wrong best fit on uniform data also has the worst consequences.
The model performs best on data containing 10-40% rounding levels, as lower or higher rounding levels produce an unclear rounding pattern or distort the rounding detection, respectively. The modulo-test method suffers from the same problem. Real-data applications on religious census data confirm the modulo-test finding that the data contain rounding to base 5, while applications on data for cigarettes smoked and alcohol consumed show good detection results. The cigarette data seem to contain rounding to base 5, while the alcohol consumption data indicate no rounding patterns, which may be attributed to the ways the two data sets were collected. The modelling applications can be extended to other areas in which rounding is common and can have significant consequences. The model can be refined to include a data-smoothing process and made user friendly as an online modelling tool, maximising its potential use.
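The modulo-test idea mentioned in this abstract can be illustrated with a minimal sketch (the function name, candidate bases, and toy data below are illustrative assumptions, not taken from the thesis): for each candidate base, compare the observed share of values divisible by that base with the share expected by chance.

```python
def modulo_test(data, bases=(2, 3, 5, 10, 25)):
    """For each candidate base, compare the observed share of values
    divisible by that base with the 1/base share expected by chance.
    A large positive excess suggests the data were rounded to that base."""
    n = len(data)
    excess = {}
    for b in bases:
        observed = sum(1 for x in data if x % b == 0) / n
        excess[b] = observed - 1.0 / b
    return excess

# Toy counts in which a few values were "heaped" onto multiples of 5:
sample = [7, 11, 13, 19, 23, 29, 31, 37, 41, 43, 45, 50, 55, 60]
excess = modulo_test(sample)  # excess[5] stands out; excess[2] is negative
```

In practice a significance threshold on the excess would be needed; the thesis's point is precisely that this threshold depends on the data distribution and the base being tested.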
2

Ampratwum, Cecilia S. "Identification of chemical species using artificial intelligence to interpret optical emission spectra". Thesis, University of Northampton, 1999. http://nectar.northampton.ac.uk/3004/.

Full text of the source
Abstract:
The nonlinear modelling capabilities of artificial neural networks (ANNs) are renowned in the field of artificial intelligence (AI) for capturing knowledge that can be very difficult to understand otherwise. Their ability to be trained on representative data within a particular problem domain and to generalise over a set of data makes them efficient predictive models. One problem domain containing complex data that would benefit from the predictive capabilities of ANNs is that of optical emission spectra (OES). OES is an important diagnostic for monitoring plasma species during plasma processing. Normally, OES spectral interpretation requires significant prior expertise from a spectroscopist. One way of alleviating this intensive demand, so that OES spectra can be interpreted quickly, is to interpret the data using an intelligent pattern-recognition technique such as ANNs. This thesis investigates and presents MLP ANN models that can successfully classify chemical species within OES spectral patterns. The primary contribution of the thesis is the creation of deployable ANN species models that can predict OES spectral line sizes directly from six controllable input process parameters, and the implementation of a novel rule-extraction procedure to relate the real multi-output values of the spectral line sizes to individual input process parameters. Not only are the trained species models excellent in their predictive capability, but they also provide the foundation for extracting comprehensible rules. A secondary contribution of this thesis is an adapted fuzzy rule-extraction system that attaches a quantitative measure of confidence to individual rules. The most significant contribution to the field of AI generated by the work presented in this thesis is that the rule-extraction procedure utilises predictive ANN species models that employ real, continuously valued multi-output data.
This is an improvement on rule extraction from trained networks, which normally focuses on discrete binary outputs.
3

Johnson, Mark. "The Dyslexic User's Interface Support Tool (DUIST) : a framework for performance enhancing interface adaptation strategies for dyslexic computer users". Thesis, University of Northampton, 2007. http://nectar.northampton.ac.uk/2683/.

Full text of the source
Abstract:
Due to the nature of the symptoms experienced by dyslexic individuals (e.g. defective visual processing, short-term memory deficit and motor control problems), an investigation into support strategies to aid persons suffering from the condition seems strongly justifiable. Accordingly, an extensive review of existing support techniques for dyslexic computer users is carried out, leading to the formulation of four central research models: dyslexia symptoms, symptom-alleviating interface strategies, adjustable interface components, and a dynamically adaptable interface preference elicitation mechanism. These models provide the foundation for the design of the Dyslexic User's Interface Support Tool (DUIST) framework. Using a user-centred design approach, the support framework is developed, tested and subsequently evaluated, with positive results. Performance gains for dyslexic subjects in reading speed and reading accuracy exemplify the apparent benefits of framework utilisation (e.g. dyslexic mean reading speed increased by 4.98 wpm vs. control gains of 0.18 wpm; dyslexic mean reading errors reduced by 0.64 per 100 words vs. control reductions of 0.06 errors per 100 words). Subsequent research into the long-term impact of framework utilisation, the perceived benefits of applying the research-formulated models to interfaces designed for dyslexics, and alternative strategies for portability all now seems justified. That said, the findings presented thus far warrant investigation by any reader actively interested in dyslexia; strategies for dyslexia symptom relief; support environments for dyslexic computer users; applications of adaptive interfaces; and by any potential system designer who may be considering developing any type of graphical interface for a dyslexic user group.
4

Wang, Yijun. "Development of an artificial neural network model to predict expert judgement of leather handle from instrumentally measured parameters". Thesis, University of Northampton, 2009. http://nectar.northampton.ac.uk/3581/.

Full text of the source
Abstract:
Leather is a widely used material whose handle is still assessed manually by experienced people in the leather industry. The aim of this study was to provide a new approach to such characterisation by developing Artificial Neural Network models to investigate the relationship between the subjective assessment of leather handle and its measurable physical characteristics. Two collections of commercial leather samples, provided by TFL and PITTARDS, were studied in this project. While the handle of the TFL collection covered a varied range, the PITTARDS collection consisted of relatively soft leathers with less variation within the collection. Descriptive Sensory Analysis was used to identify and quantify the subjective assessment of leather handle. A panel of leather experts was organised and trained to: 1) define attributes describing leather handle; 2) assess specific leather handle by responding to questionnaires seeking information about the above attributes. According to the analysis of the raw data and the assessment observations, the attributes that should be used for training the artificial network models were "stiff", "empty", "smooth", "firm", "high density" and "elastic". Various physical measurements relating to leather handle were carried out, as follows: standard leather thickness, apparent density, thickness under 1 gram and 2 gram loads, resistance to compression, resistance to stretching, surface friction, modified vertical loop deformation, drooping angle and BLC softness. The parameters from each measurement were all scaled to the range 0 to 1 before being fed into the network models. Artificial neural networks were developed by learning from the TFL examples and then tested on the PITTARDS collection. In the training stage, parameters from the physical measurements and attribute gradings provided by Descriptive Sensory Analysis were fed into the networks as input and desired output respectively.
In the testing stage, physical measurement parameters were input to the trained network, and the output of the network, which was the prediction of the leather handle, was compared with the gradings given by the panel. The testing results showed that the neural network models developed were able to judge the handle of a newly presented leather as well as an expert. Statistical methods were explored in the development of the artificial neural network models. Principal Component Analysis was used to classify the attributes of leather handle and demonstrated that the predominant and most representative of the six attributes were "stiff", "empty" and "smooth". A network model called physical2panel, predicting the above three attributes from three physical parameters, was built by adopting a novel pruning method termed "Double-Threshold", used to decide the irrelevance of an input to a model. This pruning method was based on Bayesian methodology and implemented by comparing the overall connection weight of each input to each output against two threshold limits. The pruning results revealed that, among the sixteen physical parameters, only three - the reading from the BLC softness gauge, the compression secant modulus and the leather thickness measured under a 1 gram load - were important to the model. Another network model, termed panel2panel, which predicts the other three attributes "firm", "high density" and "elastic" from the predictions of the physical2panel model, was developed and also proved to work as well as a leather expert panel. The concept of a 3D handle space was explored and shown to be a powerful means of demonstrating the findings.
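The "overall connection weight" scoring that drives pruning of this kind can be sketched generically (the thesis's exact Bayesian double-threshold procedure is not reproduced here; the scoring rule, threshold values, and toy weights below are illustrative assumptions):

```python
def overall_connection_weight(W_in, W_out):
    """For a one-hidden-layer network, sum |input->hidden * hidden->output|
    products to score each input's influence on each output."""
    n_in, n_hid = len(W_in), len(W_out)
    n_out = len(W_out[0])
    return [[sum(abs(W_in[i][j] * W_out[j][k]) for j in range(n_hid))
             for k in range(n_out)]
            for i in range(n_in)]

def prune(scores, low, high):
    """Hypothetical double-threshold rule: keep an input if any of its
    scores exceed `high`, drop it if all fall below `low`, and flag it
    'undecided' otherwise."""
    decisions = []
    for row in scores:
        if all(s < low for s in row):
            decisions.append("drop")
        elif any(s > high for s in row):
            decisions.append("keep")
        else:
            decisions.append("undecided")
    return decisions

W_in = [[0.9, -0.8], [0.05, 0.02]]  # two inputs, two hidden units
W_out = [[1.0], [0.5]]              # one output
scores = overall_connection_weight(W_in, W_out)
# input 0 carries far more weight than input 1
decisions = prune(scores, low=0.1, high=1.0)
```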
5

"A tunable version control system for virtual machines in an open-source cloud". 2013. http://repository.lib.cuhk.edu.hk/en/item/cuhk-1291493.

Full text of the source
Abstract:
Open-source cloud platforms provide a feasible alternative for deploying cloud computing on low-cost commodity hardware and operating systems. To enhance the reliability of an open-source cloud, we design and implement CloudVS, a practical add-on system that enables version control for virtual machines (VMs). CloudVS targets a commodity cloud platform that has limited available resources. It exploits content similarities across different VM versions using redundancy elimination (RE), such that only the non-redundant data chunks of a VM version are transmitted over the network and kept in persistent storage. Using RE as a building block, we propose a suite of performance adaptation mechanisms that make CloudVS amenable to different commodity settings. Specifically, we propose a tunable mechanism to balance the storage and disk seek overheads, as well as various I/O optimization techniques to minimize the interference with other co-resident processes. We further exploit a higher degree of content similarity by applying RE to multiple VM images simultaneously, and we support the copy-on-write image format. Using real-world VM snapshots, we experiment with CloudVS in an open-source cloud testbed built on Eucalyptus. We demonstrate how CloudVS can be parameterized to balance the performance trade-offs between version control and normal VM operations.
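The redundancy-elimination step described above can be sketched as content-addressed chunk deduplication (fixed-size chunks, SHA-256 chunk identifiers, and the in-memory store below are simplifying assumptions; CloudVS itself may chunk and index differently):

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunking for simplicity; RE systems may
                   # use content-defined chunk boundaries instead

def chunks(blob, size=CHUNK_SIZE):
    return [blob[i:i + size] for i in range(0, len(blob), size)]

def store_version(blob, store):
    """Store a new VM version, keeping only chunks not already present.
    Returns the version 'recipe' (list of chunk hashes) and the number
    of new chunks that had to be transmitted and stored."""
    recipe, new = [], 0
    for c in chunks(blob):
        h = hashlib.sha256(c).hexdigest()
        if h not in store:
            store[h] = c
            new += 1
        recipe.append(h)
    return recipe, new

store = {}
v1 = b"a" * 4096 + b"b" * 4096 + b"c" * 4096
v2 = b"a" * 4096 + b"b" * 4096 + b"d" * 4096  # only the last chunk changed
_, new1 = store_version(v1, store)  # all 3 chunks are new
_, new2 = store_version(v2, store)  # only the changed chunk is new
```

A version is then reconstructed by looking each recipe hash up in the store, which is what makes it safe to keep only one copy of each chunk.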
Tang, Chung Pan.
Thesis (M.Phil.), Chinese University of Hong Kong, 2013.
Includes bibliographical references (leaves 57-65).
Abstracts also in Chinese.
Title from PDF title page (viewed on 7 October 2016).
Detailed summary in vernacular field only.
6

Trentelman, Kerry. "Aspects of Java program verification". PhD thesis, 2006. http://hdl.handle.net/1885/151803.

Full text of the source
7

Nagappan, Rajehndra Yegappan. "Mining multidimensional data through compositional visualisation". PhD thesis, 2001. http://hdl.handle.net/1885/146042.

Full text of the source
8

Fatwanto, Agung. "A concern-aware requirements engineering framework". PhD thesis, 2011. http://hdl.handle.net/1885/150279.

Full text of the source
Abstract:
Poorly understood and poorly articulated requirements have been widely acknowledged as the main contributor to software development problems. A number of studies suggest that a holistic understanding of the concerns (goals and issues) surrounding software development, and stakeholders' active participation, are two critical factors for the success of requirements engineering. The research documented in this thesis thus aims to solve the problem by developing and demonstrating a new approach for eliciting, analyzing, and specifying various stakeholders' concerns. The aim has been achieved with the development and demonstration of the Concern-Aware Requirements Engineering (CARE) method. The CARE method was developed by combining goal-oriented, scenario-based, and actor-oriented approaches, together with consideration of the object-oriented approach. This combination allows the CARE method to provide a novel approach to requirements engineering. It is novel in the sense that: (i) it combines goal-oriented, scenario-based, and actor-oriented approaches; (ii) it considers object-oriented specification as the reference for the final format into which the acquired (elicited, analyzed, and specified) information can potentially be transformed; and (iii) it introduces multidimensional information specification by providing coverage for describing multi-feature, multi-description, and multi-domain information. A validation (proof of concept) of the CARE method's capability has been conducted by means of a demonstration using the Voter Tracking System (VTS) as an example. The demonstration provides a proof of concept, provides an incentive to study the method further, and illustrates the potential value of combining goal-oriented, scenario-based, and actor-oriented approaches, together with an object-oriented approach, in developing a new requirements engineering method for socio-technical systems.
A verification of the CARE method's suitability for engineering the requirements of socio-technical systems has also been conducted by means of an assessment against a requirements engineering analysis framework. The validation and verification show that the CARE method is capable of comprehensively and systematically acquiring (eliciting, analyzing, and specifying) the various concerns (goals and issues) surrounding software development. However, verification of the CARE method against the principles for designing effective visual notations shows that the CARE method does not employ an effective visual notation. A tool has also been developed as an enabling technology for the CARE method. A web-based platform was selected and an artefact-versioning feature is provided, thus allowing asynchronous collaborative work by geographically distributed team members located in different time zones.
9

Awang, Abu Bakar Normi Sham. "The effects of software design complexity on defects : a study in open-source systems". PhD thesis, 2011. http://hdl.handle.net/1885/150085.

Full text of the source
Abstract:
The aim of this thesis is to investigate whether there is a general correlation between post-delivery defects and system design complexity, by studying measures relating to Data, Structural and Procedural Complexity in object-oriented systems and determining their effect on post-delivery defects. A further aim is to determine whether, during the detailed design phase, measured Data Complexity can estimate measured Procedural Complexity and Class Size for the implemented system. This research is based on prior work by Card and Glass, who introduced a System Complexity Model as a combination of Structural and Data Complexity. They applied their model to eight similar FORTRAN (RATFOR) systems. This research both investigates and extends the Card and Glass model for application to the object-oriented environment. Several adjustments are made to accommodate important characteristics of object-oriented design and languages, such as "inheritance" and "encapsulation". Based on these adjustments, a new System Complexity Model is proposed, which is then applied to 104 open-source systems to investigate its effectiveness in estimating post-delivery defects. The necessary data are extracted from the source code of systems maintained within SourceForge, a popular open-source repository. Included in the data are Version Downloads and the Number of Developers, considered as independent variables for predicting user-reported defects. Spearman's rank correlation coefficient and a Generalized Linear Model (GLM) with Poisson distribution are used to analyze the collected data. The results show that the newly proposed System Complexity (Structural + Data) is not significant for estimating the volume of post-delivery defects (Post-DDs). When Structural and Data Complexity are analyzed separately, the results show that Structural Complexity is highly significant in estimating the number of Post-DDs.
Other important findings include: 1) Data Complexity can effectively estimate Procedural Complexity and Class Size; 2) the ratio of System Complexity to Procedural Complexity is useful for estimating the probability of Defect Density and Class Size. This ratio represents the mapping of metrics obtained during the detailed design phase onto Procedural Complexity, which is measurable during implementation (writing of the source code).
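Spearman's rank correlation, used in the analysis above, is simply Pearson correlation computed over tie-averaged ranks; a from-scratch sketch follows (the complexity and defect figures are toy values, not data from the thesis):

```python
def ranks(xs):
    """Return 1-based ranks, averaging the ranks of tied values."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank of the tied run i..j
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(xs, ys):
    """Pearson correlation of the rank vectors."""
    rx, ry = ranks(xs), ranks(ys)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

complexity = [10, 25, 3, 40, 18]  # hypothetical structural-complexity scores
defects = [2, 6, 1, 9, 4]         # hypothetical post-delivery defect counts
rho = spearman(complexity, defects)  # identical rankings, so rho is 1.0
```

Rank-based correlation is a natural choice here because defect counts are skewed and the relationship need only be monotonic, not linear.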
10

Hutchins, Matthew Alexander. "Modelling visualisation using formal algebra". PhD thesis, 1999. http://hdl.handle.net/1885/147627.

Full text of the source
11

Tridgell, Andrew. "Efficient algorithms for sorting and synchronization". PhD thesis, 1999. http://hdl.handle.net/1885/144682.

Full text of the source
12

Cohen, Jonathan Asher. "Coherence for rewriting 2-theories : general theorems with applications to presentations of Higman-Thompson groups and iterated monoidal categories". PhD thesis, 2008. http://hdl.handle.net/1885/151114.

Full text of the source
13

Armstrong, Warren Haley. "Swift : a flexible framework for runtime performance tuning". PhD thesis, 2011. http://hdl.handle.net/1885/151487.

Full text of the source
Abstract:
Many computational kernels require extensive tuning to achieve optimal performance. The tuning process for some kernels must take into account the architecture on which they are executing, the input data that they are processing, and the changing levels of contention for limited system resources. Maintaining performance in the face of such fluctuating influences requires kernels to continuously adapt. Swift is a software tool that performs this adaptation. It can be applied to many different target applications; such an approach is more efficient than developing application-specific code for continuous tuning. Swift performs controlled experiments to gauge the performance of the target application. Results from these experiments are used to guide the execution of the target application. Swift performs periodic re-evaluations of the application and updates it if environmental conditions or the internal state of the application have caused performance to degrade. The frequency of evaluation is scaled with its likely necessity: Swift performs few evaluations until it detects a potential performance degradation, at which point more detailed assessments are conducted. Swift is constructed using the DynInst library to modify and tune the executing kernel. The effectiveness of Swift depends on the computational expense of utilising this library. A suite of micro-benchmarks was developed to measure this expense. These benchmarks are not specific to Swift, and could guide the design of future DynInst-enabled applications. Swift was applied to tune sparse matrix-vector multiplication kernels. Tuning such kernels requires selecting a matrix storage format and the associated multiplication algorithm. The choice of format depends on the characteristics of the matrix being multiplied, as well as on prevailing system conditions and the number of multiplications being conducted. Swift was evaluated using both simulated environments and physical hardware.
The simulated evaluation demonstrated that Swift could correctly select the best matrix format and could react to changing conditions. Evaluations on physical hardware demonstrated that automatic tuning was viable under certain conditions.
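The kind of empirical kernel selection described above can be sketched in miniature: multiply with both a dense and a CSR representation, time each on the live input, and keep the winner (the representations, trial count, and toy matrix are illustrative assumptions, not Swift's actual implementation):

```python
import time

def dense_mv(A, x):
    """Row-by-row dense matrix-vector product."""
    return [sum(a * b for a, b in zip(row, x)) for row in A]

def to_csr(A):
    """Build a (values, column indices, row pointers) CSR triple."""
    vals, cols, ptr = [], [], [0]
    for row in A:
        for j, v in enumerate(row):
            if v != 0:
                vals.append(v)
                cols.append(j)
        ptr.append(len(vals))
    return vals, cols, ptr

def csr_mv(csr, x):
    """CSR matrix-vector product: touch only the nonzero entries."""
    vals, cols, ptr = csr
    return [sum(vals[k] * x[cols[k]] for k in range(ptr[i], ptr[i + 1]))
            for i in range(len(ptr) - 1)]

def pick_kernel(A, x, trials=3):
    """Swift-style controlled experiment: time each candidate kernel a
    few times on the live input and keep the faster one."""
    csr = to_csr(A)
    candidates = {"dense": lambda: dense_mv(A, x),
                  "csr": lambda: csr_mv(csr, x)}
    best, best_t = None, float("inf")
    for name, run in candidates.items():
        t0 = time.perf_counter()
        for _ in range(trials):
            run()
        t = time.perf_counter() - t0
        if t < best_t:
            best, best_t = name, t
    return best

A = [[0, 0, 3], [1, 0, 0], [0, 2, 0]]
x = [1, 2, 3]
chosen = pick_kernel(A, x)  # either name; both kernels agree on the result
```

Which kernel wins depends on the matrix density and the machine, which is exactly why a runtime experiment beats a static choice.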
14

Teh, Chin Hao Alvin. "Normative manipulation as a way of improving the performance of software engineering groups : three experiments". PhD thesis, 2012. http://hdl.handle.net/1885/149832.

Full text of the source
Abstract:
As the size of software development projects increases, so too does the number of people working on them. This growth of software groups has brought about a new focus on the sociological issues associated with software development. There is a growing body of work that seeks to understand how software engineers work together effectively as a group, and to identify factors that consistently enable increased productivity. Social psychology studies the interactions between individuals in groups (group dynamics) and may provide an applicable means of addressing this increased need for enhanced group effectiveness in software engineering. The thesis of this research is that it is possible to apply social psychology research (in particular, normative manipulation) to software engineering groups; that normative manipulation can be used effectively to increase the performance of software engineering groups in some types of tasks; and, finally, that this technique is adoptable by practising software engineering groups because it is non-intrusive. Normative manipulation is a technique in which particular behaviours are made to be favoured by group members. Such behaviour is then actively practised by all group members. These particular behaviours may in turn increase the effectiveness of groups on particular tasks. For instance, a group favouring the behaviour of objectivity would be more inclined to assess provided information on its logical merits and might then be more likely to uncover other related, but less obvious, information. Since the success of elicitation and specification of software requirements is related to how complete the produced specification is, it follows that such a group could have increased performance in software elicitation and specification tasks. We demonstrate the validity of the thesis claims by performing three studies.
The first study attempts to replicate the results of a social psychology experiment on a sample of participants drawn from a software engineering population. The second study attempts to show that it is possible to affect the effectiveness of requirements elicitation by software engineering groups by instilling different norms. The third study applies normative manipulation to a practising software group to identify whether the technique can be applied transparently as part of a normal requirements elicitation task. -- Provided by candidate.
15

Tang, Thanh Tin. "Quality-oriented information retrieval in a health domain". PhD thesis, 2007. http://hdl.handle.net/1885/150696.

Full text of the source
16

Shames, Iman. "Formation control and coordination of autonomous agents". PhD thesis, 2010. http://hdl.handle.net/1885/150541.

Full text of the source
Abstract:
The primary purpose of this thesis is to present new results in the field of localization, coordination and control of multi-agent systems. In the first part of the thesis, the problem of localization in the presence of noisy inter-agent measurements is first formalized, and it is established that approximate localizability, i.e. the ability to calculate the approximate positions of the agents, is a generic property: as long as the magnitude of the noise is smaller than an upper bound, one can solve the approximate localization problem. Moreover, it is shown that the accuracy of the approximate localization solution using distance measurements of a formation depends on the choice of the nodes with known positions, the anchors, in the formation. Additionally, a method is introduced for selecting these anchors in the network which minimizes a performance index associated with the error in the approximate solution for the positions of the agents in the formation. In the next chapter, some methods based on polynomial optimization are proposed that can be employed to solve two important problems: cooperative target localization and reference frame determination using different types of measurements. The first part of the thesis concludes by addressing another localization problem, which arose in an experiment conducted by the Australian Defence Science and Technology Organisation (DSTO). The problem of interest is to localize a formation of unmanned aerial vehicles (UAVs) capable of measuring the inter-agent distances or angles, and the angles subtended at each of them by two landmarks at known positions. We tackle this problem using tools from graph theory and linkage mechanism design. In the second part of this thesis, we shift our focus to the motion control of autonomous agents.
First, we address the problem of simultaneous localization and circumnavigation of an initially stationary target at an unknown position by a single agent that is aware of its own trajectory and capable of measuring its distance to the target. We propose a control law and an estimator that achieve this objective exponentially fast. We then extend our analysis to the case where the target is moving, and calculate an upper bound for the estimation error in terms of the target's speed. The last problem we consider is that of forcing a set of agents, initially at arbitrary positions and subject to a constant speed constraint, to rotate around a target at a known position while forming a prescribed formation shape and guaranteeing that no collision occurs. We show that our proposed algorithm achieves this objective in finite time under some mild and realistic assumptions.
17

Wang, Yuanzhi. "Organic aggregation : a human-centred and model-driven approach to engineering service-oriented systems". PhD thesis, 2010. http://hdl.handle.net/1885/151460.

Full text of the source
Abstract:
Owing to the widespread trend of globalisation and service economies, there are exponentially increasing demands for Software-Intensive Systems (SIS) in general, and Service-Oriented Systems (SOS) in particular. However, developing and managing these systems presents great challenges. Although current research and practice provide various means of attacking these challenges, there are many difficult impediments to overcome. This research is motivated by such demands, challenges, and opportunities. The ultimate objective is to understand and address the critical challenges of services engineering. To do so, we develop a multi-phased and iterative research methodology, adapted from typical applied-science research methodologies, in order to suit the exploratory nature of this research. Following the research methodology, we investigate and analyse the special characteristics of services engineering, such as its high degree of complexity, uncertainty, and volatility. Moreover, existing approaches and related work are studied and analysed critically. We conclude that the great difficulties of services engineering are fundamentally caused by a lack of disciplined engineering approaches that take into account the rapidly co-evolving socio-technical environments, where both human intellectual capacities and engineering competence need to be well understood and exploited. To realise our vision, we derive a generic engineering framework from a generalisation of other engineering disciplines, on which a services engineering framework called the Organic Aggregation Services Engineering Framework (OASEF) is based. OASEF contains a theoretical foundation that consists of complementary theories and knowledge from multiple disciplines. Some important concepts are also defined, such as services engineering, models and modelling, and Socio-Technical Environments (STE).
Moreover, OASEF contains guiding principles that provide important guidance for the design and realisation of SOS and services engineering. Based on these conceptual resources, a profound concept called organic aggregation is developed, which takes an organic and synthetic approach to growing and managing systems of any kind. Furthermore, OASEF also incorporates: 1) a generic conceptual process model called the Organic Aggregation Process (OAP) in support of the organic aggregation of human intellectual and technical capacities; 2) a fully integrated model-driven method to realise OASEF/OAP activities in a systematic and automatic way; 3) a range of domain-specific and general-purpose modelling languages for OASEF activities; 4) a mechanism to capture and reuse engineering capacities and to realise automatic system generation; and 5) an integrated tool environment in support of OASEF. Two controlled proof-of-concept case studies are conducted in real-world settings, which aim to evaluate and improve OASEF concepts, methods, and mechanisms. The results show that OASEF helps to manage system complexity, agility, and productivity when engineering SOS. Some limitations and insufficiencies are also observed, which require future research. Although this research focuses mainly on SOS and services engineering, its engineering framework, or more specifically its theoretical foundation, guiding principles, and generic process model, can be applied within the wider scope of software engineering and systems engineering.

Go to the bibliography