Selection of scholarly literature on the topic "Qa76.575"


Consult the lists of current articles, books, dissertations, reports, and other scholarly sources on the topic "Qa76.575".

Next to every work in the list of references there is an "Add to bibliography" option. Use it, and the bibliographic reference for the chosen work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).

You can also download the full text of the scholarly publication as a PDF and read its online annotation, provided the relevant parameters are available in the metadata.

Dissertations on the topic "Qa76.575"

1

Triastuti, Sugiyarto Endang. „Analysing rounding data using radial basis function neural networks model“. Thesis, University of Northampton, 2007. http://nectar.northampton.ac.uk/2809/.

Full text of the source
Annotation:
Unspecified counting practices used in data collection may create rounding to certain 'base' numbers, which can have serious consequences for data quality. Statistical methods for analysing missing data are commonly used to deal with the issue, but they can actually aggravate the problem. Rounded data are not missing data; rather, some observations are simply lumped systematically at certain base numbers, reflecting the rounding process or counting behaviour. A new method for analysing rounded data is therefore academically valuable. The neural network model developed in this study fills the gap and serves that purpose by complementing and enhancing conventional statistical methods. The model detects, analyses, and quantifies the existence of periodic structures in a data set caused by rounding. The robustness of the model is examined using simulated data sets containing specific rounding numbers at different levels. The model is also subjected to theoretical and numerical tests to confirm its validity before being used in real applications. Overall, the model performs very well, making it suitable for many applications. The assessment results show the importance of using the right best fit in rounding detection. The detection power and cut-off point estimation also depend on the data distribution and the rounding base numbers. Detecting rounding to prime numbers is easier than to non-prime numbers because of the unique characteristics of the former: the bigger the number, the easier the detection. This is in complete contrast with non-prime numbers, where the bigger the number, the more "factor" numbers there are to distract rounding detection. Using a uniform best fit on uniform data produces the best result and the lowest cut-off point; however, the consequence of using the wrong best fit on uniform data is also the worst. The model performs best on data containing 10-40% rounding, as lower or higher rounding levels produce an unclear rounding pattern or distort the rounding detection, respectively. The modulo-test method suffers from the same problem. Application to real religious census data confirms the modulo-test finding that the data contain rounding to base 5, while applications to data on cigarettes smoked and alcohol consumed show good detection results. The cigarette data appear to contain rounding to base 5, while the alcohol consumption data indicate no rounding pattern, which may be attributed to the ways the two data sets were collected. The modelling applications can be extended to other areas in which rounding is common and can have significant consequences. The model can be refined to include a data-smoothing process and to make it user friendly as an online modelling tool, which would maximize the model's potential use.
APA, Harvard, Vancouver, ISO, and other citation styles
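
The modulo-test mentioned in this abstract can be illustrated with a small, hypothetical Python sketch (not code from the thesis; the function name, simulated data, and rounding share are invented): it compares the observed share of values divisible by a candidate base with the share expected when no rounding is present.

    import numpy as np

    def modulo_test(values, base=5):
        """Crude heaping check: compare the observed share of values divisible by
        `base` with the roughly 1/base share expected when there is no rounding."""
        values = np.asarray(values, dtype=int)
        observed = float(np.mean(values % base == 0))
        expected = 1.0 / base
        se = np.sqrt(expected * (1 - expected) / len(values))  # binomial standard error
        return observed, expected, (observed - expected) / se  # share, baseline, z-score

    # Simulated example: 30% of the reported counts are rounded to the nearest 5.
    rng = np.random.default_rng(0)
    true_counts = rng.poisson(lam=23, size=1000)
    rounded = np.where(rng.random(1000) < 0.3, np.round(true_counts / 5) * 5, true_counts)
    print(modulo_test(rounded, base=5))  # observed share well above 0.2, large z-score
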
2

Ampratwum, Cecilia S. „Identification of chemical species using artificial intelligence to interpret optical emission spectra“. Thesis, University of Northampton, 1999. http://nectar.northampton.ac.uk/3004/.

Full text of the source
Annotation:
The nonlinear modelling capabilities of artificial neural networks (ANNs) are renowned in the field of artificial intelligence (AI) for capturing knowledge that can be very difficult to understand otherwise. Their ability to be trained on representative data within a particular problem domain and to generalise over a set of data makes them efficient predictive models. One problem domain that contains complex data which would benefit from the predictive capabilities of ANNs is that of optical emission spectra (OES). OES is an important diagnostic for monitoring plasma species within plasma processing. Normally, OES spectral interpretation requires significant prior expertise from a spectroscopist. One way of alleviating this intensive demand, so that OES spectra can be interpreted quickly, is to interpret the data using an intelligent pattern recognition technique such as ANNs. This thesis investigates and presents MLP ANN models that can successfully classify chemical species within OES spectral patterns. The primary contribution of the thesis is the creation of deployable ANN species models that can predict OES spectral line sizes directly from six controllable input process parameters, and the implementation of a novel rule extraction procedure to relate the real multi-output values of the spectral line sizes to individual input process parameters. Not only are the trained species models excellent in their predictive capability, but they also provide the foundation for extracting comprehensible rules. A secondary contribution of the thesis is an adapted fuzzy rule extraction system that attaches a quantitative measure of confidence to individual rules. The most significant contribution to the field of AI generated by the work presented in this thesis is that the rule extraction procedure utilises predictive ANN species models that employ real, continuously valued multi-output data. This is an improvement on rule extraction from trained networks, which normally focuses on discrete binary outputs.
APA, Harvard, Vancouver, ISO, and other citation styles
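
As a rough illustration of the kind of MLP classifier described here (the arrays, shapes, and labelling rule below are invented, not taken from the thesis), a network can be trained to flag whether a species' emission lines are present in a spectrum:

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    # Synthetic stand-in data: each row is an OES spectrum (intensity per wavelength
    # bin); each label marks whether a given chemical species' lines are present.
    rng = np.random.default_rng(1)
    X = rng.random((500, 128))                   # 500 spectra, 128 bins (invented shape)
    y = (X[:, 40] + X[:, 41] > 1.0).astype(int)  # toy "species present" rule

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=1)
    clf.fit(X_train, y_train)
    print("test accuracy:", clf.score(X_test, y_test))
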
3

Johnson, Mark. „The Dyslexic User's Interface Support Tool (DUIST) : a framework for performance enhancing interface adaptation strategies for dyslexic computer users“. Thesis, University of Northampton, 2007. http://nectar.northampton.ac.uk/2683/.

Full text of the source
Annotation:
Given the nature of the symptoms experienced by dyslexic individuals (e.g. defective visual processing, short-term memory deficit, and motor control problems), an investigation into support strategies to aid persons suffering from the condition seems strongly justifiable. An extensive review of existing support techniques for dyslexic computer users is therefore undertaken, leading to the formulation of four central research models: dyslexia symptoms, symptom-alleviating interface strategies, adjustable interface components, and a dynamically adaptable interface preference elicitation mechanism. These models provide the foundation for the design of the Dyslexic User's Interface Support Tool (DUIST) framework. Using a user-centred design approach, the support framework is developed, tested, and subsequently evaluated with positive results. Performance gains for dyslexic subjects in reading speed and reading accuracy exemplify the apparent benefits of framework utilisation (e.g. dyslexic mean reading speed increased by 4.98 wpm vs. control gains of 0.18 wpm; dyslexic mean reading errors reduced by 0.64 per 100 words vs. control reductions of 0.06 errors per 100 words). Subsequent research into the long-term impact of framework utilisation, the perceived benefits of applying the research models to interfaces designed for dyslexics, and alternative strategies for portability now seems justified. That said, the findings presented thus far warrant investigation by any reader actively interested in dyslexia; strategies for dyslexia symptom relief; support environments for dyslexic computer users; or applications of adaptive interfaces; and by all potential system designers who may be considering developing any type of graphical interface for a dyslexic user group.
APA, Harvard, Vancouver, ISO, and other citation styles
4

Wang, Yijun. „Development of an artificial neural network model to predict expert judgement of leather handle from instrumentally measured parameters“. Thesis, University of Northampton, 2009. http://nectar.northampton.ac.uk/3581/.

Full text of the source
Annotation:
Leather is a widely used material whose handling character is still assessed manually by experienced people in the leather industry. The aim of this study was to provide a new approach to such characterisation by developing artificial neural network models to investigate the relationship between the subjective assessment of leather handle and its measurable physical characteristics. Two collections of commercial leather samples, provided by TFL and PITTARDS, were studied in this project. While the handle of the TFL collection covered a varied range, the PITTARDS collection consisted of relatively soft leathers with less variation within the collection. Descriptive Sensory Analysis was used to identify and quantify the subjective assessment of leather handle. A panel of leather experts was organised and trained to: 1) define attributes describing leather handle; 2) assess specific leather handle by responding to questionnaires seeking information about the above attributes. Based on the analysis of the raw data and the assessment observations, the attributes used for training the artificial network models were "stiff", "empty", "smooth", "firm", "high density" and "elastic". Various physical measurements relating to leather handle were carried out: standard leather thickness, apparent density, thickness under 1 gram and 2 gram loads, resistance to compression, resistance to stretching, surface friction, modified vertical loop deformation, drooping angle, and BLC softness. The parameters from each measurement were scaled to the range 0 to 1 before being fed into the network models. Artificial neural networks were developed by learning from the TFL examples and then tested on the PITTARDS collection. In the training stage, parameters from the physical measurements and attribute gradings provided by Descriptive Sensory Analysis were fed into the networks as input and desired output, respectively. In the testing stage, physical measurement parameters were input to the trained network, and the output of the network, which was the prediction of the leather handle, was compared with the gradings given by the panel. The testing results showed that the neural network models developed were able to judge the handle of a newly presented leather as well as an expert. Statistical methods were explored in the development of the artificial neural network models. Principal Component Analysis was used to classify the attributes of leather handle and demonstrated that the predominant and most representative of the six attributes were "stiff", "empty" and "smooth". A network model called physical2panel, predicting these three attributes from three physical parameters, was built by adopting a novel pruning method termed "Double-Threshold", which was used to decide the irrelevance of an input to the model. This pruning method was based on Bayesian methodology and implemented by comparing the overall connection weight of each input to each output against two thresholds. The pruning results revealed that, among the sixteen physical parameters, only three - the reading from the BLC softness gauge, the compression secant modulus, and the leather thickness measured under a 1 gram load - were important to the model. Another network model, termed panel2panel, which predicts the other three attributes ("firm", "high density" and "elastic") from the predictions of the physical2panel model, was developed and also proved to work as well as a leather expert panel. The concept of a 3D handle space was explored and shown to be a powerful means of demonstrating the findings.
APA, Harvard, Vancouver, ISO, and other citation styles
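
A minimal sketch of the general approach described here, assuming made-up measurements, ranges, and target relations (this is not the thesis' model or data): the physical parameters are scaled to 0-1 and a small multilayer network learns to map them to panel gradings for three attributes.

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import MinMaxScaler

    # Synthetic stand-in data: columns are physical measurements (softness reading,
    # compression modulus, thickness); targets are panel gradings for three attributes.
    rng = np.random.default_rng(2)
    X_raw = rng.random((200, 3)) * [100.0, 50.0, 2.0]  # invented measurement ranges
    y = np.column_stack([
        1.0 - X_raw[:, 0] / 100.0,                     # toy relation for "stiff"
        X_raw[:, 1] / 50.0,                            # toy relation for "empty"
        0.4 + 0.3 * X_raw[:, 2] / 2.0,                 # toy relation for "smooth"
    ])

    X = MinMaxScaler().fit_transform(X_raw)            # scale each parameter to 0..1
    model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=2)
    model.fit(X, y)                                    # multi-output regression
    print(model.predict(X[:3]))                        # predicted gradings, first 3 samples
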
5

„A tunable version control system for virtual machines in an open-source cloud“. 2013. http://repository.lib.cuhk.edu.hk/en/item/cuhk-1291493.

Full text of the source
Annotation:
Open-source cloud platforms provide a feasible alternative for deploying cloud computing on low-cost commodity hardware and operating systems. To enhance the reliability of an open-source cloud, we design and implement CloudVS, a practical add-on system that enables version control for virtual machines (VMs). CloudVS targets a commodity cloud platform that has limited available resources. It exploits content similarities across different VM versions using redundancy elimination (RE), such that only the non-redundant data chunks of a VM version are transmitted over the network and kept in persistent storage. Using RE as a building block, we propose a suite of performance adaptation mechanisms that make CloudVS amenable to different commodity settings. Specifically, we propose a tunable mechanism to balance the storage and disk seek overheads, as well as various I/O optimization techniques to minimize the interference with other co-resident processes. We further exploit a higher degree of content similarity by applying RE to multiple VM images simultaneously, and we support the copy-on-write image format. Using real-world VM snapshots, we experiment with CloudVS in an open-source cloud testbed built on Eucalyptus. We demonstrate how CloudVS can be parameterized to balance the performance trade-offs between version control and normal VM operations.
Tang, Chung Pan.
Thesis (M.Phil.), Chinese University of Hong Kong, 2013.
Includes bibliographical references (leaves 57-65).
Abstracts also in Chinese.
Title from PDF title page (viewed on 7 October 2016).
Detailed summary in vernacular field only.
APA, Harvard, Vancouver, ISO, and other citation styles
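
The redundancy-elimination idea can be sketched as follows (a simplified illustration, not CloudVS code; fixed-size chunking and the function names are assumptions here, and real systems typically use content-defined chunking): only the chunks of the new VM version whose hashes are not already stored for the previous version need to be transmitted and kept.

    import hashlib

    CHUNK_SIZE = 4096  # fixed-size chunks, chosen for simplicity

    def chunks(data, size=CHUNK_SIZE):
        return [data[i:i + size] for i in range(0, len(data), size)]

    def new_chunks(old_image, new_image):
        """Return only the chunks of new_image whose hashes are not already present
        in old_image -- the chunks that actually need to be stored for this version."""
        known = {hashlib.sha256(c).hexdigest() for c in chunks(old_image)}
        return [c for c in chunks(new_image) if hashlib.sha256(c).hexdigest() not in known]

    # Toy example: version 2 of an "image" differs from version 1 in one small region.
    v1 = bytes(1_000_000)
    v2 = bytearray(v1)
    v2[500_000:500_010] = b"NEWCONTENT"
    print(len(new_chunks(v1, bytes(v2))), "of", len(chunks(bytes(v2))), "chunks to store")
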
6

Trentelman, Kerry. „Aspects of Java program verification“. PhD thesis, 2006. http://hdl.handle.net/1885/151803.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
7

Nagappan, Rajehndra Yegappan. „Mining multidimensional data through compositional visualisation“. PhD thesis, 2001. http://hdl.handle.net/1885/146042.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
8

Fatwanto, Agung. „A concern-aware requirements engineering framework“. PhD thesis, 2011. http://hdl.handle.net/1885/150279.

Full text of the source
Annotation:
Poorly understood and poorly articulated requirements have been widely acknowledged as the main contributor to software development problems. A number of studies suggest that a holistic understanding of the concerns (goals and issues) surrounding software development and stakeholders' active participation are two critical factors for the success of requirements engineering. The research documented in this thesis therefore aims to address the problem by developing and demonstrating a new approach for eliciting, analyzing, and specifying the concerns of various stakeholders. This aim has been achieved with the development and demonstration of the Concern-Aware Requirements Engineering (CARE) method. The CARE method was developed by combining goal-oriented, scenario-based, and actor-oriented approaches, together with consideration of the object-oriented approach. This combination allows the CARE method to provide a novel approach to requirements engineering. It is novel in the sense that: (i) it combines goal-oriented, scenario-based, and actor-oriented approaches; (ii) it considers object-oriented specification as the reference for the final format into which the acquired (elicited, analyzed, and specified) information can potentially be transformed; and (iii) it introduces multidimensional information specification by providing coverage for multi-feature, multi-description, and multi-domain information. A validation (proof of concept) of the CARE method's capability was conducted by means of a demonstration using the Voter Tracking System (VTS) as an example. The demonstration provides a proof of concept, provides an incentive to study the method further, and illustrates the potential value of combining goal-oriented, scenario-based, and actor-oriented approaches, together with an object-oriented approach, in developing a new requirements engineering method for socio-technical systems. A verification of the CARE method's suitability for engineering the requirements of socio-technical systems was also conducted by means of an assessment against a requirements engineering analysis framework. The validation and verification show that the CARE method is capable of comprehensively and systematically acquiring (eliciting, analyzing, and specifying) the various concerns (goals and issues) surrounding software development. However, verification of the CARE method against the principles for designing effective visual notations shows that the CARE method does not employ an effective visual notation. A tool has also been developed as an enabling technology for the CARE method. A web-based platform was selected and an artefact versioning feature is provided, allowing asynchronous collaborative work by geographically distributed team members located in different time zones.
APA, Harvard, Vancouver, ISO, and other citation styles
9

Awang, Abu Bakar Normi Sham. „The effects of software design complexity on defects : a study in open-source systems“. PhD thesis, 2011. http://hdl.handle.net/1885/150085.

Full text of the source
Annotation:
The aim of this thesis is to investigate whether there is a general correlation between post-delivery defects and system design complexity, by studying measures relating to Data, Structural, and Procedural Complexity in object-oriented systems and determining their effect on post-delivery defects. A further aim is to determine whether, during the detailed design phase, measured Data Complexity can estimate the measured Procedural Complexity and Class Size of the implemented system. This research is based on the prior work of Card and Glass, who introduced a System Complexity Model as a combination of Structural and Data Complexity. They applied their model to eight similar FORTRAN (RATFOR) systems. This research both investigates and extends the Card and Glass model for application to the object-oriented environment. Several adjustments are made to accommodate important characteristics of object-oriented design and languages, such as "inheritance" and "encapsulation". Based on these adjustments, a new System Complexity Model is proposed, which is then applied to 104 open-source systems to investigate its effectiveness in estimating post-delivery defects. The necessary data are extracted from the source code of systems maintained within SourceForge, a popular open-source repository. Included in the data are Version Downloads and the Number of Developers, which are considered independent variables for predicting user-reported defects. Spearman's rank correlation coefficient and a Generalized Linear Model (GLM) with a Poisson distribution are used to analyze the collected data. The results show that the newly proposed System Complexity (Structural + Data) is not significant for estimating the volume of post-delivery defects. When Structural and Data Complexity are analyzed separately, the results show that Structural Complexity is highly significant in estimating the number of post-delivery defects. Other important findings include: 1) Data Complexity can effectively estimate Procedural Complexity and Class Size; 2) the ratio of System Complexity to Procedural Complexity is useful for estimating the probability of Defect Density and Class Size. This ratio represents the mapping of metrics obtained during the detailed design phase to Procedural Complexity, which is measurable during implementation (the writing of the source code).
APA, Harvard, Vancouver, ISO, and other citation styles
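
A rough sketch of the kind of analysis described (a Spearman rank correlation plus a Poisson GLM), run on invented data rather than the thesis' SourceForge measurements; the variable names and generating process below are assumptions for illustration only:

    import numpy as np
    import statsmodels.api as sm
    from scipy.stats import spearmanr

    # Invented system-level data: complexity measures and downloads as predictors
    # of post-delivery defect counts (a Poisson-distributed response).
    rng = np.random.default_rng(3)
    n = 104
    structural = rng.gamma(shape=2.0, scale=5.0, size=n)
    data_cplx = rng.gamma(shape=2.0, scale=3.0, size=n)
    downloads = rng.lognormal(mean=8.0, sigma=1.0, size=n)
    defects = rng.poisson(lam=np.exp(0.5 + 0.08 * structural))  # toy generating process

    rho, pval = spearmanr(structural, defects)
    print("Spearman rho (structural vs. defects):", rho)

    # Poisson regression of defect counts on the predictors.
    X = sm.add_constant(np.column_stack([structural, data_cplx, np.log(downloads)]))
    poisson_fit = sm.GLM(defects, X, family=sm.families.Poisson()).fit()
    print(poisson_fit.summary())
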
10

Hutchins, Matthew Alexander. „Modelling visualisation using formal algebra“. PhD thesis, 1999. http://hdl.handle.net/1885/147627.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles

Books on the topic "Qa76.575"

1

Koegel Buford, John F., ed. Multimedia systems. New York: ACM Press, 1994.

Find the full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
2

Steele, Guy L., 1954-, ed. C, a reference manual. 5th ed. Upper Saddle River, N.J.: Prentice-Hall, 2002.

Find the full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
3

Pohl, Ira, ed. A book on C: Programming in C. 4th ed. Reading, Mass.: Addison-Wesley, 1998.

Find the full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
4

Pohl, Ira, ed. A book on C: Programming in C. 3rd ed. Redwood City, Calif.: Benjamin Cummings Pub. Co., 1995.

Find the full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
5

Schaeffer, Mark, ed. Macromedia Director MX for Windows and Macintosh. Berkeley, CA: Peachpit, 2003.

Find the full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
6

Jazayeri, Mehdi, and Dino Mandrioli, eds. Fundamentals of software engineering. Englewood Cliffs, NJ: Prentice Hall, 1991.

Find the full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
7

Ranum, David L., ed. Python programming in context. Sudbury, Mass.: Jones and Bartlett Publishers, 2009.

Find the full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
8

Channelle, Andy, and Jaime Sicam, eds. Beginning Ubuntu Linux. 4th ed. New York, NY: Apress, 2009.

Find the full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
9

Kostanic, Ivica, ed. Principles of neurocomputing for science and engineering. New York, NY: McGraw Hill, 2001.

Find the full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
10

Ericson, Barbara, ed. Introduction to computing & programming in Python: A multimedia approach. 3rd ed. Upper Saddle River: Prentice Hall, 2013.

Find the full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles