Dissertations on the topic "Évaluation des logiciels"
Cite a source in APA, MLA, Chicago, Harvard, and other citation styles
Consult the top 50 dissertations for research on the topic "Évaluation des logiciels".
Next to every work in the list of references there is an "Add to bibliography" option. Use it, and a bibliographic reference to the chosen work will be generated automatically in the citation style you need (APA, MLA, Harvard, Chicago, Vancouver, etc.).
You can also download the full text of the scientific publication in PDF format and read its online abstract whenever the relevant parameters are available in the metadata.
Browse the dissertations across the various disciplines and compile your bibliography correctly.
Pelet, Jacques. „Évaluation de l'intégrité des logiciels à caractère sécuritaire“. Saint-Etienne, 1998. http://www.theses.fr/1998STET4026.
Hu, Olivier. „Contribution à l'évaluation des logiciels multimédias pédagogiques“. Compiègne, 2001. http://www.theses.fr/2001COMP1350.
Maurice, François. „Un modèle d'évaluation et d'amélioration d'entités logicielles basé sur l'utilisation de métriques“. Toulouse 3, 1996. http://www.theses.fr/1996TOU30192.
Kanoun, Karama. „Croissance de la sûreté de fonctionnement des logiciels : caractérisation, modélisation, évaluation“. Toulouse, INPT, 1989. http://www.theses.fr/1989INPT091H.
He, Peng. „Conception et évaluation des systèmes logiciels de classifications de paquets haute-performance“. Thesis, Université Grenoble Alpes (ComUE), 2015. http://www.theses.fr/2015GREAA007/document.
Packet classification consists of matching packet headers against a set of pre-defined rules, and performing the action(s) associated with the matched rule(s). As a key technology in the data plane of network devices, packet classification has been widely deployed in many network applications and services, such as firewalling, load balancing, VPNs, etc. Packet classification has been extensively studied in the past two decades. Traditional packet classification methods are usually based on specific hardware. With the development of data center networking, software-defined networking, and application-aware networking technology, packet classification methods based on multi-/many-core processor platforms are becoming a new research interest. In this dissertation, packet classification has been studied mainly from three aspects: the algorithm design framework, rule-set feature analysis, and algorithm implementation and optimization. In the dissertation, we review multiple proposed algorithms and present a decision-tree-based algorithm design framework. The framework decomposes various existing packet classification algorithms into a combination of different types of “meta-methods”, revealing the connection between different algorithms. Based on this framework, we combine different “meta-methods” from different algorithms, and propose two new algorithms, HyperSplit-op and HiCuts-op. The experimental results show that HiCuts-op requires 2~20x less memory and 10% fewer memory accesses than HiCuts, while HyperSplit-op requires 2~200x less memory and 10%~30% fewer memory accesses than HyperSplit. We also explore the connections between the rule-set features and the performance of various algorithms. We find that the “coverage uniformity” of the rule-set has a significant impact on the classification speed, and the size of the “orthogonal structure” rules usually determines the memory footprint of the algorithms. Based on these two observations, we propose a memory consumption model and a quantified method for coverage uniformity. Using the two tools, we propose a new multi-decision-tree algorithm, SmartSplit, and an algorithm policy framework, AutoPC. Compared to the EffiCuts algorithm, SmartSplit achieves around a 2.9x speedup and up to a 10x memory reduction. For a given rule-set, AutoPC can automatically recommend a “right” algorithm. Compared to using a single algorithm on all the rule-sets, AutoPC is on average 3.8 times faster. We also analyze the connection between prefix length and the update overhead for IP lookup algorithms. We observe that long prefixes will always result in more memory accesses using the Tree Bitmap algorithm, while short prefixes will always result in a large update overhead in DIR-24-8. By combining the two algorithms, a hybrid algorithm, SplitLookup, is proposed to reduce the update overhead. Experimental results show that the hybrid algorithm requires two orders of magnitude fewer memory accesses when updating short prefixes, while its lookup speed remains close to that of DIR-24-8. In the dissertation, we implement and optimize multiple algorithms on multi-/many-core platforms. For IP lookup, we implement two typical algorithms, DIR-24-8 and Tree Bitmap, and present several optimization tricks for these two algorithms. For multi-dimensional packet classification, we have implemented HyperCuts/HiCuts and the variants of these two algorithms, such as Adaptive Binary Cuttings, EffiCuts, HiCuts-op and HyperSplit-op.
The SplitLookup algorithm achieves up to 40 Gbps throughput on the TILEPro64 many-core processor. HiCuts-op and HyperSplit-op achieve 10 to 20 Gbps throughput on a single core of Intel processors. In general, our study reveals the connections between algorithmic tricks and rule-set features. The results in this dissertation provide insights for new algorithm design and guidelines for efficient algorithm implementation.
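As context for the entry above, a minimal Python sketch of the rule-matching problem the dissertation addresses; the rules, field ranges and actions are hypothetical, and real classifiers such as HiCuts or HyperSplit replace this linear scan with a decision tree over the rule set:

```python
# Minimal sketch (not the dissertation's code): linear-search 5-tuple packet classification.
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Rule:
    src: Tuple[int, int]      # source address range (lo, hi)
    dst: Tuple[int, int]      # destination address range (lo, hi)
    sport: Tuple[int, int]    # source port range
    dport: Tuple[int, int]    # destination port range
    proto: Tuple[int, int]    # protocol range
    action: str               # e.g. "accept", "drop"

def classify(pkt: Tuple[int, int, int, int, int], rules: List[Rule]) -> Optional[str]:
    """Return the action of the first (highest-priority) matching rule."""
    for r in rules:
        fields = (r.src, r.dst, r.sport, r.dport, r.proto)
        if all(lo <= v <= hi for v, (lo, hi) in zip(pkt, fields)):
            return r.action
    return None

rules = [
    Rule((0, 2**32 - 1), (0, 2**32 - 1), (0, 65535), (80, 80), (6, 6), "accept"),
    Rule((0, 2**32 - 1), (0, 2**32 - 1), (0, 65535), (0, 65535), (0, 255), "drop"),
]
print(classify((1, 2, 1234, 80, 6), rules))   # -> accept
```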
Farenc, Christelle. „Ergoval : une méthode de structuration des règles ergonomiques permettant l'évaluation automatique d'interfaces graphiques“. Toulouse 1, 1997. http://www.theses.fr/1997TOU10013.
The thesis introduces a new method for structuring ergonomic rules in order to evaluate graphical user interfaces. This method, developed in collaboration with the SRTP (post office technical research unit), is intended to be used by computer experts and to be integrated into an automatic user interface evaluation tool: ERGOVAL. In order to provide information to developers in a form they can use to modify the interface, the ergonomic rules were reformulated so as to apply directly to the graphical objects of the user interface. The knowledge involved in the evaluation was structured as follows: a representation of the UI in terms of the interaction objects of the CUA standard was built (the decomposition of graphical objects), and all graphical objects concerned by the same set of ergonomic rules were grouped together into classes of objects (the typology of graphical objects). The resulting typology consists of several levels of abstraction, the graphical objects being its leaves. The links of this typology have hierarchical properties, i.e. each type inherits the attributes and associated rules of its parent type. A mock-up of the ERGOVAL tool was made to validate the knowledge structuring and to define the specifications of the final tool. In order to determine the scope of application, the automatic and qualitative dimensions were studied, in particular the automatic retrieval of the interface description and the number and level of ergonomic rules integrated in the mock-up. Consequently, the quality of an automatic evaluation and of an evaluation of high-level ergonomic rules was determined.
Waeselynck, Hélène. „Vérification de logiciels critiques par le test statistique“. Toulouse, INPT, 1993. http://www.theses.fr/1993INPT010H.
Babau, Jean-Philippe. „Etude du comportement temporel des applications temps réel à contraintes strictes basée sur une analyse d'ordonnançabilité“. Poitiers, 1996. http://www.theses.fr/1996POIT2305.
Charlet, Célina. „Raffiner pour vérifier des systèmes paramétrés“. Besançon, 2003. http://www.theses.fr/2003BESA2054.
Guégain, Edouard. „Optimisation de logiciels par leur configuration“. Electronic Thesis or Diss., Université de Lille (2022-....), 2023. http://www.theses.fr/2023ULILB020.
The field of software engineering evolves rapidly, exposing practitioners to an ever-increasing collection of tools, languages, frameworks, and paradigms. Each of these components can have its own internal configuration. Thus, designing a new software system consists of selecting components from this collection, which is akin to creating a configuration. The criterion used to configure such systems is too often the ease of development, which leads to oversized, power-hungry bloatware. This paradigm is not aligned with frugal or environmental concerns. Thus, this dissertation looks into the ability to leverage the configuration of a system to optimize its performance. A specific focus is placed on the energy consumption and the size of software systems. A prerequisite to optimizing a system is to understand its current performance. To gain insight into this subject, the configurable software JHipster was empirically analyzed. Exhaustively assessing the performance of JHipster configurations, with respect to several indicators, showed that different configurations do indeed have different performances. Thus, relying on performance insight, it is possible to create high-performance configurations of JHipster. Furthermore, some performance indicators proved correlated across configurations. Therefore, the optimization goal can be simplified by ignoring redundant performance indicators. The process of creating optimized configurations of JHipster was performed manually, which is only possible in smaller configuration spaces. To tackle larger configuration spaces, an algorithm was created, defining how to assess the performance of each option, and then how to improve a given configuration using such performance data. However, optimizing a configuration by selecting high-performance options brought out limitations, as options can interact with each other: in some situations, pairing high-performance options may result in subpar performances. Similarly, low-performance options can prove unexpectedly efficient when paired together. Thus, the optimization algorithm has been designed to leverage such specific behaviors. Applying this algorithm to a large set of configurations showed that most of them can reach near-optimal performances with only a limited set of modifications. However, performance constraints are not limited to a single performance indicator. Depending on the context, the energy consumption of a system may not be the single most impactful indicator to optimize. Thus, the optimization algorithm must be generalized to support several performance indicators. This generalized algorithm has been validated on a pair of performance indicators: the execution time and the size of the software. The main highlight of this validation is that half of all configurations can reach a local optimum by changing a single setting. Furthermore, by exhaustively applying the algorithm to the configuration space of a system, it was possible to follow how it navigates the configuration space to find optimal configurations. Analyzing this navigation highlighted current limitations in the algorithm, which can thus be further improved as future work. In its current state, the algorithm was published as an open-source tool under the name ICO.
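A hedged sketch of the option-scoring idea described in this abstract, assuming hypothetical option names and measurements (this is not the ICO implementation): each option is scored by the mean performance of the measured configurations that contain it, and one greedy pass replaces an option only when an alternative scores better.

```python
from statistics import mean
from typing import Dict, FrozenSet, List

# Hypothetical measurements: configuration -> energy consumption (lower is better).
measurements: Dict[FrozenSet[str], float] = {
    frozenset({"sql", "monolith"}): 90.0,
    frozenset({"nosql", "monolith"}): 70.0,
    frozenset({"sql", "microservices"}): 120.0,
    frozenset({"nosql", "microservices"}): 95.0,
}
# Each feature group offers mutually exclusive options.
feature_groups: List[List[str]] = [["sql", "nosql"], ["monolith", "microservices"]]

def option_score(option: str) -> float:
    """Mean performance of all measured configurations containing the option."""
    return mean(v for cfg, v in measurements.items() if option in cfg)

def improve(config: FrozenSet[str]) -> FrozenSet[str]:
    """One greedy pass: within each group, keep the current option unless another scores better."""
    new_config = set()
    for group in feature_groups:
        current = next(o for o in group if o in config)
        best = min(group, key=option_score)
        new_config.add(best if option_score(best) < option_score(current) else current)
    return frozenset(new_config)

print(improve(frozenset({"sql", "microservices"})))  # -> frozenset({'nosql', 'monolith'})
```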
Abdeen, Hani. „Visualizing, Assessing and Re-Modularizing Object-Oriented Architectural Elements“. Phd thesis, Université des Sciences et Technologie de Lille - Lille I, 2009. http://tel.archives-ouvertes.fr/tel-00498389.
Nicolas, Christophe. „Une mesure de la cohésion fonctionnelle pour l'évaluation du code source des logiciels fortran“. Versailles-St Quentin en Yvelines, 1995. http://www.theses.fr/1995VERS0003.
Bruel, Jean-Michel. „Fuze : un environnement intégré pour l'analyse formelle de logiciels distribués temps réel“. Toulouse 3, 1996. http://www.theses.fr/1996TOU30257.
Abdeen, Hani. „Visualizing, assessing and re-modularizing object-oriented architectural elements“. Electronic Thesis or Diss., Lille 1, 2009. http://www.theses.fr/2009LIL10069.
To cope with the complexity of large object-oriented software systems, developers organize classes into subsystems using the concepts of module or package. Such a modular structure helps software systems to evolve when facing new requirements. The organization of classes into packages and/or subsystems represents the software modularization, which usually follows the interrelationships between classes. Ideally, packages should be loosely coupled and cohesive to a certain extent. However, studies show that as software evolves to meet requirements and environment changes, the software modularization gradually drifts and loses quality. As a consequence, the software modularization must be maintained. It is thus important to understand, to assess and to optimize the organization of packages and their relationships. Our claim is that the maintenance of large and complex software modularizations needs approaches that help in: (1) understanding package shapes and relationships; (2) assessing the quality of a modularization, as well as the quality of a single package within a given modularization; (3) optimizing the quality of an existing modularization. In this thesis, we concentrate on three research fields: software visualizations, metrics and algorithms. First, we define two visualizations that help maintainers: (1) to understand package structure, usage and relationships; (2) to spot patterns; and (3) to identify misplaced classes and structural anomalies. In addition to visualizations, we define a suite of metrics that help in assessing the package design quality (i.e., package cohesion and coupling). We also define metrics that assess the quality of a collection of inter-dependent packages from different viewpoints, such as the degree of package coupling and cycles. Finally, we define a search-based algorithm that automatically reduces package coupling and cycles only by moving classes over existing packages. Our optimization approach explicitly takes into account the original class organization and package structure. It also allows maintainers to control the optimization process by specifying: (1) the maximal number of classes that may change their packages; (2) the classes that are candidates for moving and the classes that should not be; (3) the packages that are candidates for restructuring and the packages that should not be; and (4) the maximal number of classes that a given package can contain. The approaches presented in this thesis have been applied to real, large object-oriented software systems. The results we obtained demonstrate the usefulness of our visualizations and metrics, and the effectiveness of our optimization algorithm.
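A minimal illustration of the kind of search-based re-modularization described above, with toy class and package names (this is not the dissertation's algorithm, which also bounds the number of moved classes and treats cycles explicitly): each class is tentatively moved to the package that holds most of its dependencies, and the move is kept only if overall coupling drops.

```python
from collections import Counter
from typing import Dict, List

# Toy system: class -> package, and class -> classes it depends on (names are hypothetical).
package_of: Dict[str, str] = {"A": "p1", "B": "p1", "C": "p2"}
deps: Dict[str, List[str]] = {"A": ["C", "C", "B"], "B": ["A"], "C": ["A"]}

def coupling() -> int:
    """Number of dependencies that cross a package boundary."""
    return sum(package_of[c] != package_of[d] for c, ds in deps.items() for d in ds)

def one_pass() -> None:
    """Try to move each class to the package holding most of its dependencies."""
    for cls, ds in deps.items():
        if not ds:
            continue
        target = Counter(package_of[d] for d in ds).most_common(1)[0][0]
        if target != package_of[cls]:
            before = coupling()
            old, package_of[cls] = package_of[cls], target
            if coupling() >= before:   # keep the move only if it reduces coupling
                package_of[cls] = old

print("coupling before:", coupling())
one_pass()
print("coupling after:", coupling(), package_of)
```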
Chevalier, Marcel. „Etude de la complexité des logiciels de type flots de données en vue de la fiabilité : application à l'atelier logiciel SAGA“. Grenoble 1, 1989. http://tel.archives-ouvertes.fr/tel-00334028.
Conquet, Eric. „Une méthode d'intégration de l'évaluation de performance dans le développement des systèmes informatiques“. Toulouse 3, 1993. http://www.theses.fr/1993TOU30128.
Koliaï, Souad. „Approche statique et dynamique pour l'évaluation de performances de codes scientifiques“. Versailles-St Quentin en Yvelines, 2011. http://www.theses.fr/2011VERS0010.
Current hardware tends to increase the pressure on programmers to optimize their codes. The complexity of modern architectures makes it more difficult to understand the behavior of the programs running on them. Moreover, compilers apply aggressive optimizations, which makes the compiled code more difficult to understand. This increasing complexity shows that there is still a need for performance analysis to help programmers. Different tools and techniques exist, but no single tool is a panacea; instead, different tools have different strengths. This thesis proposes two different and complementary tools for performance analysis of binary code. The first tool, Maqao's static analysis, performs a static evaluation of the performance of the code and gives an estimate of the quality of the code, such as the vectorization ratios. The second tool, Decan, is a new approach to performance analysis that targets the memory instructions to pinpoint the set of instructions responsible for the poor performance. Both tools are combined to propose a semi-automated methodology for performance evaluation.
Moro, Pierre. „Techniques de vérification basées sur des représentations symboliques par automates et l'abstraction guidée par les contre-exemples“. Paris 7, 2008. http://www.theses.fr/2008PA077013.
This thesis studies automatic verification techniques where programs are represented symbolically by automata. The sets of configurations of such programs are represented by automata, whereas instructions are represented by transducers. Computing the set of reachable states of such programs is an important ingredient for the verification of safety properties. This problem is undecidable, since the naive iterative computation does not terminate in general. Therefore, one has to use techniques to accelerate the computation. One of the main techniques studied consists in over-approximating the set of reachable states to enforce the convergence of the computation. These over-approximation techniques can introduce behaviours that do not exist in the program and for which the property is false. In that case, we check whether the counterexample is present in the real program or is due to the over-approximation. In the latter case, we use refinement techniques to eliminate the spurious counterexample from our abstraction and restart the computation. Using this abstract-check-refine loop, we propose techniques to verify sequential, non-recursive programs manipulating linked lists. We develop methods for representing memory configurations as automata and instructions as transducers, and we then propose abstraction and refinement techniques specific to such representations. Then, we show that this kind of program can also be represented by counter automata: only a few special cells of the heap (whose number is finite) are relevant, and counters indicating the number of elements between these points allow the heap of the program to be represented. We then develop methods for counter automata verification using the abstract-check-refine loop for this special kind of automata. We have tested our methods with a tool that supports the representations described previously. In another part of the thesis, we study the size of counterexamples for Büchi automata that represent the product between a linear temporal logic formula and a finite automaton. These counterexamples allow the program to be corrected, and it is important to have counterexamples that are as small as possible in order to reduce the time needed for the correction. Using SPIN's memory representation for a state, such algorithms have to optimize memory usage while keeping time complexity as small as possible.
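A toy, self-contained illustration of the abstract-check-refine loop mentioned above (the dissertation works with automata and transducers; here the "program" is just a counter incremented by 2 and the abstraction keeps values modulo k, which is an assumption made purely for the example):

```python
# Toy CEGAR loop: the concrete program starts at 0 and adds 2 forever;
# the error value 7 is unreachable, and the abstraction tracks values mod k.
def cegar(max_iter: int = 10) -> str:
    k = 1                                        # initial, very coarse abstraction
    for _ in range(max_iter):
        # Over-approximate the reachable values of the counter modulo k.
        reachable, frontier = {0 % k}, [0 % k]
        while frontier:
            nxt = (frontier.pop() + 2) % k
            if nxt not in reachable:
                reachable.add(nxt)
                frontier.append(nxt)
        if 7 % k not in reachable:               # error value 7 is abstractly unreachable
            return f"safe (abstraction: values mod {k})"
        # Check the abstract counterexample against the concrete program (bounded simulation).
        if any(v == 7 for v in range(0, 100, 2)):
            return "unsafe: concrete counterexample found"
        k *= 2                                   # spurious -> refine the abstraction
    return "unknown"

print(cegar())   # -> safe (abstraction: values mod 2)
```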
Atig, Mohamed Faouzi. „Vérification de Programmes Concurrents : Décidabilité et Complexité“. Paris 7, 2010. http://www.theses.fr/2010PA077066.
This thesis addresses verification problems for concurrent and recursive systems, as well as for concurrent systems with store buffers. We establish the required theoretical basis for automated analyses: decidability and complexity results for reachability problems. First, we are interested in verifying concurrent programs where each process corresponds to a sequential program with (recursive) procedure calls. The difficulty in analyzing such programs comes from the interaction between recursion and concurrency, which makes the reachability problems undecidable in general. However, in practice programs obey additional constraints that can be exploited to make the reachability problem decidable; their study is the subject of this thesis. These conditions may be seen as constraints imposed on the order between the actions of the analyzed programs. Moreover, these decidability results can be used to perform an under-approximation analysis that effectively detects bad behaviors of the analyzed programs. Second, we study concurrent programs running under weak memory models. In such programs, the order between actions of the same process is relaxed (for performance reasons) by allowing the permutation of certain types of memory operations. This makes reasoning about the behaviors of concurrent programs much more difficult, and it is not clear how to apply standard reasoning techniques. Our work shows that, depending on the type of relaxation, the reachability problem becomes decidable (but with a high complexity), while in other cases it even turns out to be undecidable.
Balmas, Françoise. „Contribution à la conceptualisation de programmes : modèle, implémentation, utilisation et évaluation“. Paris 8, 1995. http://www.theses.fr/1995PA081071.
Perelman, Gary. „Conception, développement et évaluation de techniques d'interactions fluides pour des environnements multidimensionnels : application aux logiciels du service public“. Thesis, Toulouse 3, 2018. http://www.theses.fr/2018TOU30255/document.
The work of this thesis is part of a collaboration with the company Berger-Levrault, a major actor in the development of administrative management software for public services. This work is based on two observations. On the one hand, the policy of digitization of public services induces the need for software adapted to the professions of all public institutions. This software is complex and particularly rich compared to classically used office software (Office, mailbox, etc.). On the other hand, we observe that the devices used to interact with this software have not evolved: for several decades, the mouse and the keyboard have remained the norm in a fixed environment. However, these devices offer only a few input degrees of freedom. The manipulation of multidimensional data with these devices induces a greater number of steps to perform a task, thus lengthening the interaction path. In this context, the objective of this thesis work is to improve the flow of interaction with the multidimensional data contained in public-service software by increasing the number of input degrees of freedom offered by the devices. Indeed, a larger number of input degrees of freedom would reduce the number of steps necessary to accomplish a given task, thus improving the interaction flow. We propose three major contributions: a device with multiple degrees of freedom, the Roly-Poly Mouse; a design space, DECO; and a set of interaction techniques with mobile devices based on the principle of stacking. The first contribution of our work is the design of a new device with multiple degrees of freedom: the Roly-Poly Mouse (RPM). This device, whose base is rounded, aims to replace the traditional mouse. It has 6 degrees of freedom (3 translations, of which 2 are exploited, and 3 rotations). We evaluated its performance and compared it to other devices for a task requiring 6 degrees of freedom (3D object manipulation). The second contribution of our work is the definition of a design space focusing on the physical aspect of the composition of devices: DECO. DECO relies on two axes: physical arrangement and physical manipulation. From this design space, we designed a compound device, the Roly-Poly Mouse 2, consisting of the combination of a Roly-Poly Mouse and a traditional mouse. We evaluated its performance and compared it to other devices through an RST task (Rotate-Scale-Translate, a 5D task). [...]
Soares, Sebastião Roberto. „Conception et évaluation d'un système a base de connaissances pour l’élimination de déchet“. Lyon, INSA, 1994. http://www.theses.fr/1994ISAL0064.
The present work deals with the development and use of elementary industrial waste treatment models, in order to provide a decision support system. The approach consists of three main steps. First, the major information concerning waste treatment strategies (thermal, biological, physical, chemical and landfilling) and their interactions was identified, interpreted and linked together; this first step consists of collecting the knowledge required to model the global approach. Second, the knowledge was formalised into « production rules » and these were automated. From this procedure originated DECHAIDE (WAST'AID), a Knowledge-Based System (KBS) which allows the user to orientate a waste, by referring to a limited number of parameters, towards the most technically appropriate strategies. Finally, DECHAIDE was evaluated on a technical, pragmatic and subjective basis. This assessment drew our attention to the consistency of the system and some of its weaknesses, as well as to its influence on the user's performance. DECHAIDE's responses to real waste treatment scenarios were analysed. The educational and technical utility of KBSs in a multidisciplinary field such as waste management was demonstrated and validated.
Marif, Anouar. „Référentiel pour le développement d'un système de pilotage de la performance cohérent et réactif“. Doctoral thesis, Université Laval, 2021. http://hdl.handle.net/20.500.11794/70269.
This thesis aims at elaborating a framework for the development of a coherent and responsive performance management system, an interface between several distinct and complementary methods. The proposed framework integrates key performance management principles and offers an improved and promising alternative to traditional performance measurement systems. A performance management system is an essential tool for an organization. However, the design of such a system remains a complex process, given the several aspects and elements that the organization must integrate into its process of evaluating and improving performance, in particular disruptive events, which are often neglected in the literature and very difficult to take into account in the development of a performance management mechanism. Decision-making to cover the effects of a disruptive event is not immediate, as it has an inertia which can lead to a loss of overall performance. Also, decision-makers rarely have the necessary tools to verify that the key performance management components used (objectives, decision variables, performance indicators) are coherent and help to move the organization towards the achievement of its expected objectives. Moreover, the organization operates in an uncertain environment and must be adaptable to ensure its viability. This research is motivated by a strong need highlighted by the literature review: the need to develop a performance management system that responds to current management challenges, namely consistency between the key performance management components and responsiveness to deal with disruptive events. As a result, the main objective of this thesis is to propose a framework for the development of a coherent and responsive performance management system. The contributions of this thesis are presented in four phases. In the first phase, based on the findings of the literature review, we underlined the need to propose a structural approach, SIPCo (Consistent Performance Indicator System), to identify the key components of performance management and to ensure their consistency through two methods: a logical method based on (1) a decision-making system modeling approach in order to identify the decision-making centers and (2) an informational system modeling approach in order to establish a representation of the interactive part of the key performance management components of each of the decision-making centers. The SIPCo method is also based on a participatory method to support the logical approach, in order to define the various key components of performance management with future users. In the second phase, we proposed a procedural approach, SYPCo-R (Coherent and Responsive Performance Management System), integrating the "potential event", an essential element that other performance evaluation systems have so far not integrated. Most performance management systems are based on the triplet "objective - decision variable - performance indicator", whereas the proposed SYPCo-R is based on the quadruplet "objective - potential event - decision variable - performance indicator". The objective of SYPCo-R is to provide overall consistency in the use of the key performance management components, and responsiveness by integrating the notion of potential event into decision-making through a methodology for classifying decision variables that makes it possible to thwart potential events that may hinder the achievement of objectives.
In the third phase, we proposed a conceptual model, MCR (Conceptual Model of Reactivity), which uses the fundamental properties of the notion of reactivity in the form of an algorithm composed of a set of operating rules to identify performance failures in terms of reactivity, and their origins, in order to adjust and consolidate SYPCo-R. In the fourth phase, we proposed a predictive approach based on simulation to evaluate and assess the impact of the values fixed on the alternatives associated with each decision variable chosen from the ranking resulting from the SYPCo-R approach. Furthermore, this approach aims to anticipate performance failures in order to adjust the parameters of MCR and SYPCo-R. This provides decision-makers with an additional tool to ensure consistency and responsiveness based on anticipation and prediction. This thesis provides innovative solutions to the process of developing a performance management system by proposing an interface framework between several distinct and complementary methods that effectively responds to the concerns of decision-makers. The proposed framework allows the complexity of a system to be identified and made intelligible to decision-makers. Moreover, this framework accepts future extensions based on an optimized exploitation of real-time data.
Drăgoi, Cezara. „Automated verification of heap-manipulating programs with infinite data“. Paris 7, 2011. http://www.theses.fr/2011PA077189.
In this thesis, we focus on the verification of safety properties for sequential programs manipulating dynamic data structures carrying unbounded data. We develop a logic-based framework where program specifications are given by formulas. First, we address the issue of automating pre/post-condition reasoning. We define a logic, called CSL, for the specification of linked structures or arrays, as well as compositions of these structures. The formulas in CSL can describe reachability relations between cells in the heap following some pointer fields, the size of the heap, and the scalar data. We prove that the satisfiability problem of CSL is decidable and that CSL is closed under the computation of the strongest post-condition. Second, we address the issue of automatic synthesis of assertions for programs with singly-linked lists. We define an abstract-interpretation-based framework which combines a specific finite-range abstraction on the shape of the heap with an abstract domain on sequences of data. Different abstractions on sequences are considered, allowing reasoning about their sizes, the multisets of their elements, or relations on their data at different positions. We define an interprocedural analysis that computes the effect of each procedure in a local manner, by considering only the part of the heap reachable from its actual parameters. We have implemented our techniques in a tool, which shows that our approach is powerful enough for the automatic generation of non-trivial procedure summaries and for pre/post-condition reasoning.
Merlin, Bruno. „Méthodologie et instrumentalisation pour la conception et l'évaluation des claviers logiciels“. Toulouse 3, 2011. http://thesesups.ups-tlse.fr/1323/.
The expansion of mobile devices turns text input performance into a major challenge for Human-Machine Interaction. We observed that, even if traditional QWERTY soft keyboards or telephone-based soft keyboards were evaluated as poorly efficient, and even if several alternatives evaluated as more efficient were proposed in the research field, these new alternatives are rarely used. Based on this observation, we argue that soft keyboard evaluation focuses on long-term performance but does not take into account the prospect, for a user, of using the keyboard in everyday life. Consequently, we propose a complementary evaluation strategy based on a heuristic evaluation methodology. In order to ease the evaluation and design of new soft keyboards, we proposed a new version (E-Assist II) of the E-Assiste platform. This platform aims, first, to facilitate the design and procedure of experiments and, more generally, to guide the theoretical, experimental and heuristic evaluations. A compact version (TinyEAssist) enables experiments to be performed in mobile environments such as mobile phones. Second, based on a study of soft keyboard structure, we proposed a keyboard specification language enabling complex keyboards to be generated (including soft keyboards interacting with prediction systems). The generated soft keyboards may be used in the experimentation platform or in interaction with the exploration system. Finally, based on the criteria highlighted by the heuristic evaluation, we proposed four new soft keyboard paradigms. Among them, two paradigms showed interesting perspectives: the first, the multilayer keyboard, consists in accompanying the user from a standard QWERTY layout to an optimized layout during a transition period; the second consists in accelerating access to characters such as accents, upper case, punctuation, etc., which are frequently ignored in keyboard optimizations.
Capobianco, Antonio. „Stratégies d'aide en ligne contextuelles : acquisition d'expertises, modélisation et évaluation expérimentale“. Nancy 1, 2002. http://docnum.univ-lorraine.fr/public/SCD_T_2002_0286_CAPOBIANCO.pdf.
Souchard, Laurent. „Les logiciels tuteurs fermés: institutions d'apprentissage et d'enseignement ? : le cas du début du secondaire“. Paris 7, 2009. http://www.theses.fr/2009PA070030.
Closed Tutoring Software, or CTS, is used in mathematics classrooms in many schools. To analyze its potential role in school, we built a model based on the notion of institution, central to the Anthropological Theory of Didactics, which we define on the basis of criteria of social reality, legitimacy, stability and specificity. To understand whether a CTS can be used as an institution of teaching and learning, each of the four software packages in our study was fully inspected by an expert, whose video captures allow some comparisons with the work of the students in our experiment. The decryption of all the data was conducted using behavior-analysis software, the Observer from Noldus. More specifically, with regard to mathematics learning, we chose to analyze how the four CTS handle numeracy, whether arithmetic, numerical or algebraic. For this, the theoretical framework developed by Houdement and Kuzniak for learning geometry was extended, including the notions of paradigm and workspace. The tests we conducted showed that the CTS in our study can hardly be used as an autonomous institution of the regular classroom, which is the primary institution, but their use can be valuable for student learning by creating adapted parallel institutions.
Meyer, Svetlana. „Conception et évaluation d'Evasion, un logiciel éducatif d'entraînement des capacités d'attention visuelle impliquées en lecture“. Thesis, Université Grenoble Alpes (ComUE), 2019. http://www.theses.fr/2019GREAS002/document.
Learning to read is a complex activity that relies on different cognitive abilities, including visual attention. The role of visual attention in learning to read is widely documented in the scientific literature but absent from school curricula. In this thesis work, we designed an original educational software package, called Evasion, for visual attention training in the classroom, and we evaluated its impact on beginning readers' performance. A literature review was first conducted to identify the dimensions of visual attention that are involved in reading and how best to train them. We then propose a conceptual framework that allows us to interpret these results. We conclude that the dimensions to be targeted are the total amount of visual attention resources and the spatial dispersion of attention. These two facets of attention seem to be particularly well trained by action video games, whose effect on visual attention has also been characterized in our model. To train visual attention resources and dispersion as well as possible within Evasion, we mixed the tasks known to improve these attentional dimensions with the properties of action video games. Our software includes four training mini-games and an adaptive difficulty algorithm developed by our team to adjust the game properties online to the child's needs. The training program was provided in the classroom over a period of ten weeks, at a rate of three 20-minute sessions a week. It was proposed to a large sample of 730 beginning readers for the prevention of reading difficulties. The impact of Evasion was assessed before and after training in comparison with a control group that used an intervention program designed to improve oral comprehension in English. The results of this ecological experiment show that visual attention and reading did not improve more following the Evasion training than the control training. Additional analyses revealed that the training time was poorly respected, while this factor relates to the magnitude of improvement in the attentional dimensions we targeted. The analyses further suggest a problem in the level of difficulty of the mini-games. Overall, our work opens up new perspectives on the improvements to be made to our software and, above all, on the conditions for a successful implementation of ecological experiments.
Badr, Georges. „Modèle théorique et outil de simulation pour une meilleure évaluation des claviers logiciels augmentés d'un système de prédiction de mots“. Toulouse 3, 2011. http://thesesups.ups-tlse.fr/1549/.
Predictive model and simulation tool for a better evaluation of soft keyboards augmented with a word prediction list. Software keyboards are used to enable text input in mobility and on devices without physical keyboards, such as the new generation of mobile phones. However, these keyboards have several drawbacks, such as slow text entry and the fatigue generated for motor-impaired users. The solution was to combine the software keyboard with lists containing the words likely to continue the word being entered by the user. While these lists, so-called prediction lists, reduce the number of clicks and the number of operations, the user's input speed has decreased. An experiment with an eye-tracking system identified the "strategies" of the user while using and searching a list of words. These results were helpful in refining the prediction models, in order to reduce the gap between the predicted performance and the performance actually recorded. Based on the observations made during the first experiment, we propose two variants of the use of the word prediction list. The first proposes a new way to interact with the list of words and allows maximum use of it. The second evaluates a repositioning of the list of words in order to reduce the number of eye movements towards the list. These two propositions were theoretically and experimentally evaluated with users, and they can improve input performance compared with a classic word prediction list.
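A minimal sketch of the kind of word prediction list discussed in this abstract, assuming a tiny hypothetical lexicon with word frequencies (real systems use much larger language models):

```python
from typing import Dict, List

# Hypothetical lexicon: word -> frequency count.
lexicon: Dict[str, int] = {"the": 500, "there": 120, "their": 150, "theory": 40, "them": 90}

def predict(prefix: str, n: int = 3) -> List[str]:
    """Return the n most frequent words starting with the typed prefix."""
    candidates = [w for w in lexicon if w.startswith(prefix)]
    return sorted(candidates, key=lambda w: -lexicon[w])[:n]

print(predict("the"))   # -> ['the', 'their', 'there']
```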
Huet, Fabrice. „Objets mobiles : conception d'un middleware et évaluation de la communication“. Phd thesis, Université de Nice Sophia-Antipolis, 2002. http://tel.archives-ouvertes.fr/tel-00505420.
Navet, Nicolas. „Évaluation de performances temporelles et optimisation de l'ordonnancement de tâches et messages“. Vandoeuvre-les-Nancy, INPL, 1999. http://docnum.univ-lorraine.fr/public/INPL_T_1999_NAVET_N.pdf.
Boussadi, Abdelali. „L'aide à la validation pharmaceutique : conception et évaluation d’un système d’alerte à base de règles pour la validation pharmaceutique des prescriptions médicamenteuses“. Paris 6, 2013. http://www.theses.fr/2013PA066246.
Using an 'agile', business-oriented and development-platform-independent software design process (BRDF, Business Rule Development Framework) meets one of the strategic objectives of the U.S. Roadmap for national action on clinical decision support by taking into consideration three important criteria posing a particular challenge to software designers: 1) business processes, 2) knowledge modeling of the context of application, and 3) the agility of the various design steps. Using BRDF at the Georges Pompidou University Hospital (HEGP) in the business context of pharmaceutical validation allowed the end users (the pharmacists) to be included in 5 of the 8 design steps of BRDF, and we were able to derive 427 clinical decision rules. 140 clinical decision rules were implemented as clinical alerts to control and adapt nephrotoxic medication orders; these rules checked 71,413 medication orders and fired 5,824 (8.16%) alerts. Using a clinical data-warehouse-based process for refining medication order alerts facilitates alert optimization toward the goal of maximizing patient safety and minimizing overridden alerts. Using this process on the alerts implemented with BRDF to control and adapt nephrotoxic medication orders prescribed at the HEGP showed that, after several iterations of this process, 45 (16.07%) decision rules were removed, 105 (37.5%) were changed and 136 new rules were introduced. Prospective validation of the alert system at the HEGP hospital during a 7-month study period showed the superiority of the alert system in comparison with the pharmacists' daily practice of the medication order validation activity.
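For illustration only, a sketch of what a single rule of this kind might look like; the drug identifiers and threshold below are hypothetical, are not clinical guidance, and are not the rules actually deployed at HEGP:

```python
# Hypothetical decision rule: flag a nephrotoxic drug ordered for a patient
# with low renal clearance (values and drug names are illustrative only).
from dataclasses import dataclass
from typing import Optional

NEPHROTOXIC = {"drug_x", "drug_y"}          # hypothetical drug identifiers

@dataclass
class Order:
    drug: str
    dose_mg: float

@dataclass
class Patient:
    creatinine_clearance: float             # mL/min

def check_order(patient: Patient, order: Order) -> Optional[str]:
    """Return an alert message if the rule fires, None otherwise."""
    if order.drug in NEPHROTOXIC and patient.creatinine_clearance < 30:
        return (f"Alert: {order.drug} ordered with clearance "
                f"{patient.creatinine_clearance} mL/min; consider dose adaptation.")
    return None

print(check_order(Patient(creatinine_clearance=25), Order("drug_x", 500)))
```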
Sapolin, Bertrand. „Construction d'une méthodologie d'évaluation statistique des logiciels de dispersion atmosphérique utilisés en évaluation de risque NRBC et développement d'un modèle d'estimation de l'incertitude des résultats“. Paris 7, 2011. http://www.theses.fr/2011PA077217.
Atmospheric dispersion of contaminated clouds following deliberate or accidental releases of CBRN (chemical, biological, radiological, nuclear) toxic substances may have serious health impacts. In order to estimate them, CBRN risk assessment activities rely, among other things, on atmospheric dispersion models. These models compute the concentration field of a pollutant in order to quantify potential adverse effects on the human population. They need to be evaluated, which means their outputs have to be compared to experimental data within an appropriate methodology. However, existing evaluation methodologies have two flaws: firstly, they are not suited to risk assessment, and secondly, their results may be somewhat arbitrary because they are based on direct comparisons between observations and model results. Turbulence in the atmospheric boundary layer introduces a large random component into the observations, and thus an inevitable gap between observations and model results, be the latter "perfect". In this thesis, two tools have been built to fix these issues. The first one is an evaluation methodology suitable for the risk assessment context. The second one is an empirical statistical model meant to estimate the uncertainty in the simulation results. It can be associated with an atmospheric dispersion model with probabilistic capabilities in order to produce an envelope of the answer rather than a unique "average" result, the latter being of little use despite its omnipresence in current risk assessment studies. When used jointly, the two tools developed in this thesis enable model/experiment comparisons to be more objective and less subject to experimental randomness.
Alba, Winckler Marco Antonio. „StateWebCharts : une notation formelle pour la modélisation de la navigation des applications Web“. Toulouse 1, 2004. http://www.theses.fr/2004TOU10026.
In spite of the apparent ease of building Web pages afforded by current visual environments, development for the World Wide Web is complex due to many factors, such as the evolving nature of the applications, the multidisciplinary nature of the development team, the competing points of view on the application, the complexity of user requirements, and the unrealistic and narrow development schedules. To deal with such complexity, modelling support is essential. Current modelling methods provide little support for describing the navigation of Web applications in a complete and unambiguous manner, even though navigation is considered a critical element. The present work proposes a formal description technique, namely StateWebCharts (SWC), which extends StateCharts models to model the navigation of Web applications, and a design environment supporting the edition and simulation of models described in this formalism.
Gamatié, Abdoulaye. „Modélisation polychrone et évaluation de systèmes temps réel“. Phd thesis, Université Rennes 1, 2004. http://tel.archives-ouvertes.fr/tel-00879359.
Alain, Sylvie. „Évaluation d'outils d'analyse du cycle de vie pour étudier la performance environnementale de bâtiments en bois innovants“. Master's thesis, Université Laval, 2015. http://hdl.handle.net/20.500.11794/26005.
To reduce the environmental impact of a building, the integration of life cycle assessment (LCA) during the design phase can follow two approaches: the use of a simplified tool, such as Athena, by construction professionals, or the use of a more complex tool, such as SimaPro, in collaboration with an LCA analyst. The objective of the project is to evaluate the strengths and limitations of these tools when analysing innovative timber buildings in Canada. The results are based on a case study: a six-storey office building with a glulam structure that was outside the prescriptive standard at the time of construction. For Athena, possible improvements include more flexibility, including better control over the maintenance cycles of materials, as well as more information regarding the uncertainty of the results. SimaPro offers more flexibility and transparency; however, more processes representing building materials in the Canadian context would be necessary.
Yasini, Seyed Mobin. „Conception et évaluation de méthodes et outils logiciels pour améliorer la qualité de la prescription et de la réalisation des examens de Biologie“. Paris 6, 2013. http://www.theses.fr/2013PA066651.
Laboratory tests are not always prescribed appropriately. Guidelines have been developed to rationalize test-ordering behavior and maximize the appropriateness of laboratory medicine; however, these guidelines are not frequently consulted by physicians. We decided to develop a system facilitating the consultation of these guidelines. The Unified Modeling Language was used to represent the categories of information elements contained in these documents and their relationships to each other. We used the generated model to implement a computerized interface, which was found to be rapid and easy to use. In the next step, we went further and implemented the test-ordering rules in a HIS (hospital information system) to automatically display relevant recommendations according to the patient context. We therefore analyzed the aspects related to the integration of test-ordering recommendations in HISs. Firstly, we evaluated the implementability of test-ordering rules according to factors intrinsic and extrinsic to the recommendations. We then transformed the content of the guidelines into an executable format using conceptual modeling, and subsequently implemented thirty-two test-ordering rules in our HIS, named ORBIS. Our guideline modeling led to an application facilitating access to these documents. It also led to the development of a manual for writing harmonized laboratory guidelines. The modeling of test-ordering rules and the evaluation of their implementability clarified the barriers to implementation. We finally implemented a selection of test-ordering rules in the hospital information system to integrate them into daily practice.
Abouelala, Mourad. „Évaluation des outils de modélisation et de simulation dans le domaine de l’enseignement de la fabrication mécanique : cas des logiciels de la FAO“. Thesis, Aix-Marseille, 2015. http://www.theses.fr/2015AIXM3056/document.
Simulation tools, as a means of facilitating the setting up of production, have become very common in industry and, therefore, in education. Among several significant problems, such as pedagogical issues, the cost of equipment acquisition and the adaptability of students to the multitude of Computer-Aided Manufacturing systems, education raises the problem of selecting software in order to ensure the maximum effectiveness of the teaching process and of students' learning. This research study was designed to investigate a methodology for selecting CAM software that could be effective as a support for CAM learning at university, taking into account the different features of CAM learning. We determine the factors of student effectiveness in learning with CAM software and, further, determine the relationships between the main factors. The research was conducted using a questionnaire submitted to 50 students attending the second academic year of Mechanical Design and Production. The study provides results from an empirical test of these relationships and provides criteria for the evaluation of simulation software in education.
El, Samad Mahmoud. „Découverte et monitoring de ressources pour le traitement de requêtes dans une grille de données“. Toulouse 3, 2009. http://thesesups.ups-tlse.fr/661/.
Distributed data management in grid systems raises new problems and presents real challenges: resource discovery, resource allocation, replication, monitoring services for query optimization, etc. Grid systems differ from parallel and distributed systems mainly by two characteristics: the large scale and the instability of the system (i.e. the dynamicity of nodes). In this thesis, we are interested in the resource discovery phase for efficient query evaluation in data grid environments. First, we present a state of the art of the main research works on resource discovery, focusing on the criteria (e.g. scaling, reliable discovery, maintenance cost) that are important for data source discovery, which is specific to data grid environments. In this perspective, we propose a new method of data source discovery based on Distributed Hash Tables (DHTs), allowing permanent access, even in the presence of the dynamicity of nodes, from any node of a Virtual Organization VOlocal towards all other VOi (i ≠ local) in the system, with a minimum maintenance cost between the DHTs. After the resource discovery phase, it is very important to monitor the current state of resources, especially since these resources are shared in very large-scale environments. Resource monitoring can be done during the initial allocation or during the execution phase, in order to take decisions on the choice of the execution node of a join (or of a part of a join), for example. In this context, we propose a method that considers the variation of host and network parameter values, at runtime, in the calculation of the response time of a relational operation. The proposed method integrates monitoring information into an execution model based on mobile agents developed in the Pyramid team. Finally, we validate our proposals through a performance evaluation.
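A simplified sketch of DHT-style placement of data-source keys on nodes, in the spirit of the discovery method described above; the hash ring, node names and data-source identifier are illustrative and not the dissertation's protocol:

```python
import hashlib
from bisect import bisect
from typing import List

def h(key: str, space: int = 2**16) -> int:
    """Hash a key into a small circular identifier space."""
    return int(hashlib.sha1(key.encode()).hexdigest(), 16) % space

class Ring:
    def __init__(self, nodes: List[str]):
        self.ring = sorted((h(n), n) for n in nodes)   # (node id, node name) on the ring

    def lookup(self, key: str) -> str:
        """Return the node responsible for `key` (first node at or after the key's hash, wrapping around)."""
        ids = [nid for nid, _ in self.ring]
        return self.ring[bisect(ids, h(key)) % len(self.ring)][1]

ring = Ring(["node-a", "node-b", "node-c"])
print(ring.lookup("catalog/sales_2009"))   # hypothetical data-source identifier
```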
Meynard, Jean-Baptiste. „Réalisation et évaluation d'un système de surveillance en temps réel pour les forces armées en opérations“. Aix-Marseille 2, 2007. http://www.theses.fr/2007AIX20690.
Noureddine, Adel. „Towards a better understanding of the energy consumption of software systems“. Thesis, Lille 1, 2014. http://www.theses.fr/2014LIL10009/document.
With the rise in the usage of computers and mobile devices, and the higher price of electricity, the energy management of software has become a necessity for sustainable software, devices and IT services. Energy consumption in IT is rising with the growth of web and distributed services, cloud computing, and mobile devices. However, existing approaches do not use proper energy information for their adaptations, rendering them limited and not energy-aware. They do not provide an energy feedback on software, and limited information is available on how and where energy is spent in software code. To address these shortcomings, we present, in this thesis, energy models, approaches and tools in order to accurately estimate the energy consumption of software at the application level and at the code level, and to infer energy evolution models based on a method's own input parameters. We also propose Jalen and Jalen Unit, energy frameworks for estimating how much energy each portion of code consumes, and for inferring energy evolution models based on empirical benchmarking of software methods. By using software estimations and energy models, we are able to provide accurate energy information without the need for power meters or a hardware energy investment. The energy information we provide also gives energy management approaches direct and accurate energy measurements for their adaptations and optimizations. The provided energy information also draws a model of the evolution of the energy consumption of software based on the values of its input parameters. This gives developers knowledge about energy efficiency in software, leading them to choose some code over another based on its energy performance.
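A hedged sketch of a CPU energy estimation in the spirit of this abstract; the linear utilisation model and the power constants are assumptions made for the example, not Jalen's actual model:

```python
# Hypothetical power characteristics of the machine (watts).
P_IDLE = 10.0    # assumed idle power of the CPU
P_MAX = 35.0     # assumed full-load power of the CPU

def process_energy(cpu_utilisation: float, process_share: float, seconds: float) -> float:
    """Energy (joules) attributed to a process over an interval, under a linear power model."""
    cpu_power = P_IDLE + cpu_utilisation * (P_MAX - P_IDLE)
    return cpu_power * process_share * seconds

# 60 s interval, CPU 40% busy, the monitored process causing half of that load.
print(process_energy(cpu_utilisation=0.4, process_share=0.5, seconds=60.0))
```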
Haddad, Axel. „Shape-Preserving Transformations of Higher-Order Recursion Schemes“. Paris 7, 2013. http://www.theses.fr/2013PA077264.
Higher-order recursion schemes model functional programs in the sense that they describe the recursive definitions of the user-defined functions of a program, without interpreting the built-in functions. Therefore, the semantics of a higher-order recursion scheme is the (possibly infinite) tree of executions of a program. This thesis focuses on verification-related problems. The MSO model-checking problem, i.e. the problem of knowing whether the tree generated by a scheme satisfies a monadic second-order logic (MSO) formula, was solved by Ong in 2006. In 2010, Broadbent, Carayol, Ong and Serre extended this result by showing that one can transform a scheme such that the nodes in the tree satisfying a given MSO formula are marked; this problem is called the reflection problem. Finally, in 2012, Carayol and Serre solved the selection problem: if the tree of a given scheme satisfies a formula of the form "there exists a set of nodes such that ...", one can transform the scheme such that a set witnessing the property is marked. In this thesis, we use a semantics-based approach to study scheme-transformation problems. Our goal is to give shape-preserving solutions to such problems, i.e. solutions where the output scheme has the same structure as the input one. With this idea, we establish a simulation algorithm that takes a scheme G and an evaluation policy r ∈ {OI, IO} and outputs a scheme G' whose value tree under the other evaluation policy is equal to the value tree of G under r. Then we give new proofs of the reflection and selection results that do not involve collapsible pushdown automata and are again shape-preserving.
Charguéraud, Arthur. „Vérification de programmes à l'aide de formules caractéristiques“. Paris 7, 2010. http://www.theses.fr/2010PA077214.
This dissertation describes a new approach to program verification based on characteristic formulae. The characteristic formula of a program is a higher-order logic formula that describes the behavior of that program, in the sense that it is sound and complete with respect to the semantics. This formula can be exploited in an interactive theorem prover to establish that the program satisfies a specification expressed in the style of separation logic, with respect to total correctness. The characteristic formula of a program is automatically generated from its source code alone. In particular, there is no need to annotate the source code with specifications or loop invariants, as such information can be given in interactive proof scripts. One key feature of characteristic formulae is that they are of linear size and that they can be pretty-printed in a way that closely resembles the source code they describe, even though they do not refer to the syntax of the programming language. Characteristic formulae serve as a basis for a tool, called CFML, that supports the verification of Caml programs using the Coq proof assistant. CFML has been employed to verify about half of the content of Okasaki's book on purely functional data structures, and to verify several imperative data structures such as mutable lists, sparse arrays and union-find. CFML also supports reasoning about higher-order imperative functions, such as functions in CPS form and higher-order iterators.
Ammar-Boudjelal, Farid. „Analyse des structures symboliques manipulées dans les langages de spécification : proposition et évaluation de stratégies adaptées au contrôle de la qualité : application au langage LDS“. La Rochelle, 1999. http://www.theses.fr/1999LAROS029.
Der volle Inhalt der QuelleJacquemin, Maxime. „Arithmétiques relationnelles pour l'analyse par interprétation abstraite de propriétés de précision numérique“. Electronic Thesis or Diss., université Paris-Saclay, 2021. http://www.theses.fr/2021UPASG050.
Der volle Inhalt der QuelleFloating point arithmetic is the most widely used approach to performing mathematical computations over real numbers on a computer. However, this approach has a drawback: each operation can introduce an error, that is, a difference from the result we would have obtained with real numbers. Even though these errors are very small, they can accumulate and cause serious bugs, particularly in critical domains such as aeronautics or nuclear energy production. We therefore have to be able to guarantee that the errors introduced by floating point arithmetic do not cause problems, in other words, that they are small enough for the program to behave as expected. To answer this need, we propose a static analysis based on abstract interpretation, along with a new abstract domain, that computes an over-approximation of the errors introduced by floating point arithmetic. The analysis is based on the interaction, performed through a reduced product, between two notions of error: absolute error, which is intuitive and helps understand the analyzed program, and relative error, which is closer to how floating point arithmetic actually behaves. Our analysis relies on a combination of affine and interval arithmetic and thus has relational reasoning capabilities. However, this combination struggles with non-linear operations, whose precision has a huge impact on the evaluation of relative errors. We therefore propose two approaches to tackle this problem. The first consists of several improvements to this combination that evaluate multiplications and divisions more precisely without significantly impacting performance. The second consists of a new relational arithmetic specifically designed to represent relative errors. In addition, we have implemented a prototype of our analysis within the Frama-C/Eva tool. The first experimental results highlight the advantages of our analysis over state-of-the-art tools.
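A minimal sketch of the underlying idea, assuming one simply carries an over-approximation of the absolute error alongside each computed value (plain error bookkeeping, not the thesis's affine or relational domains): in IEEE-754 double precision, each rounded operation satisfies |fl(x op y) - (x op y)| <= eps * |x op y| with eps = 2^-53, so a bound can be propagated through additions and multiplications.

    EPS = 2.0 ** -53  # unit roundoff for IEEE-754 binary64

    class ErrVal:
        """A float paired with an over-approximation of its absolute error."""
        def __init__(self, value, err=0.0):
            self.value = value
            self.err = err

        def __add__(self, other):
            v = self.value + other.value
            # propagated input errors + fresh rounding error of this addition
            return ErrVal(v, self.err + other.err + EPS * abs(v))

        def __mul__(self, other):
            v = self.value * other.value
            propagated = (abs(self.value) * other.err
                          + abs(other.value) * self.err
                          + self.err * other.err)
            return ErrVal(v, propagated + EPS * abs(v))

    x = ErrVal(0.1, EPS * 0.1)   # 0.1 is itself rounded when parsed
    y = ErrVal(0.2, EPS * 0.2)
    z = x + y
    print(z.value, "absolute error bounded by", z.err)

This purely value-by-value bookkeeping is exactly what loses precision on non-linear operations and correlated variables, which is the motivation for the relational domains developed in the thesis.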
Rene, Amandine. „Conception d'une méthodologie d'évaluation et de validation cliniques d'un dispositif médical logiciel d'aide au diagnostic en imagerie : application au suivi lésionnel en oncologie“. Thesis, Montpellier 1, 2014. http://www.theses.fr/2014MON1T009.
Der volle Inhalt der QuelleAided-diagnosis software in imaging is now integrated into the radiological workflow, but it is also a key element in medical research. Since such software is defined as a medical device, recent regulatory changes now impose clinical evaluations on manufacturers. The pharmaceutical industry benefits from a proven method for drug evaluation, yet its transposition to medical devices is not fully effective and is even more complex in the case of software. The aim of this thesis is to propose a clinical evaluation and validation methodology for these devices. The first part introduces the normative and regulatory framework as well as methodologies from various areas. The synthesis of these data allows the presentation of the first methodology item, enabling the clinical evaluation of software performance. To further the analysis, the second part of the methodology is dedicated to the evaluation and validation of software ergonomics, a sensitive issue in medical software. Finally, to restore these devices to their true place in health care, the last item highlights their impact on clinical practice and patient management, through their role in the search for new imaging biomarkers. These various methods comply with and go beyond the regulatory framework in order to meet the expectations of all the stakeholders involved in the life cycle of aided-diagnosis software in imaging. To conclude, an example of its application is presented, showing the impact of a dedicated software tool on the evaluation of oncological response in imaging.
Cheramy, Maxime. „Etude et évaluation de politiques d'ordonnancement temps réel multiprocesseur“. Thesis, Toulouse, INSA, 2014. http://www.theses.fr/2014ISAT0025/document.
Der volle Inhalt der QuelleNumerous algorithms have been proposed to address the scheduling of real-time tasks for multiprocessor architectures. Yet, new scheduling algorithms have been defined very recently. Therefore, and without any guarantee of completeness, we have identified more than fifty of them. This large diversity makes the comparison of their behavior and performance difficult. This research aims at allowing the study and the evaluation of key scheduling algorithms. The first contribution is SimSo, a new simulation tool dedicated to the evaluation of scheduling algorithms. Using this tool, we were able to compare the performance of twenty algorithms. The second contribution is the consideration, in the simulation, of temporal overheads related to the execution of the scheduler and the impact of memory caches on the computation time of the jobs. This is done by the introduction of statistical models evaluating the cache miss ratios.
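As a toy illustration of what such a simulator evaluates (this is not SimSo's actual API, whose classes and configuration differ), the sketch below runs a naive tick-by-tick simulation of global EDF on two processors with hypothetical periodic tasks and reports whether any job misses its deadline.

    from dataclasses import dataclass

    @dataclass
    class Task:
        name: str
        period: int   # also used as the implicit deadline
        wcet: int     # worst-case execution time, in ticks

    def simulate_global_edf(tasks, n_cpus, horizon):
        """Naive global EDF simulation; returns True if no deadline is missed."""
        jobs = []  # each job: [absolute_deadline, remaining_time, task_name]
        for t in range(horizon):
            for task in tasks:
                if t % task.period == 0:
                    jobs.append([t + task.period, task.wcet, task.name])
            # a pending job whose deadline has passed means a miss
            if any(deadline <= t and remaining > 0 for deadline, remaining, _ in jobs):
                return False
            # run the n_cpus jobs with the earliest deadlines during this tick
            jobs.sort(key=lambda j: j[0])
            for job in jobs[:n_cpus]:
                job[1] -= 1
            jobs = [j for j in jobs if j[1] > 0]
        return True

    tasks = [Task("A", period=5, wcet=2), Task("B", period=7, wcet=3), Task("C", period=10, wcet=4)]
    print("schedulable on 2 CPUs over 70 ticks:", simulate_global_edf(tasks, 2, 70))

A research-grade simulator such as SimSo additionally accounts for scheduler overheads and cache effects, which this sketch ignores entirely.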
Belley, Denis. „Évaluation du volume et des pertes de qualité causées par les principaux défauts des tiges d'épinette blanche et de pin gris“. Doctoral thesis, Université Laval, 2014. http://hdl.handle.net/20.500.11794/25295.
Der volle Inhalt der QuelleThe first objective of this work is to characterize the properties of jack pine and white spruce and to develop a lumber volume correction factor accounting for stem shape in both species, so that lumber volume can be predicted more precisely from forest inventory data. The second objective is to model the presence of knots and evaluate their impact on lumber yield using Optitek, a sawing simulation software. To achieve this goal, a new software tool had to be developed to extract CT image information and make it compatible with Optitek, so that lumber sawing could be simulated while taking knot dimensions and locations into account. The trees come from a Nelder (1962) type plantation of white spruce (Picea glauca (Moench) Voss) and jack pine (Pinus banksiana Lamb.). This type of plantation has a circular layout in which stand density varies from the center to the periphery of the circle, making it possible to study two different species growing under similar conditions. Several field variables were analyzed, such as diameter at breast height (DBH), curvature, taper, total tree length and live crown size. First, the results show that tree characteristics are strongly influenced by stand density: DBH, total height, taper, length and width of the live crown, and the diameter of the five largest dead and live branches generally increased with greater distance between trees, for both jack pine and white spruce. The simulations that used the knot information yielded significantly higher lumber volume and value. Both jack pine and white spruce produced more No. 2 & Better pieces when knots were considered in the sawing simulations (15% for white spruce and 40% for jack pine). As for lumber value, the increase ranged from 9.5% to 15.1% for white spruce and from 15.2% to 23.0% for jack pine. Again, the larger knot size of jack pine could explain its greater potential for improvement.
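For readers unfamiliar with how log volumes are conventionally computed before any stem-shape correction is applied, the sketch below uses Smalian's formula, a standard forestry formula (it is not the correction method developed in the thesis), to estimate a log's volume from its two end diameters; the variable names and values are illustrative.

    import math

    def smalian_log_volume(d_small_cm, d_large_cm, length_m):
        """Smalian's formula: V = (A_small + A_large) / 2 * L, areas from end diameters."""
        a_small = math.pi * (d_small_cm / 200.0) ** 2   # diameter in cm -> radius in m
        a_large = math.pi * (d_large_cm / 200.0) ** 2
        return (a_small + a_large) / 2.0 * length_m      # volume in cubic metres

    print(f"{smalian_log_volume(18.0, 24.0, 4.9):.4f} m^3")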
Benarif, Samir. „Plate-forme multi-agent pour la reconfiguration dynamique des architectures logicielles“. Versailles-St Quentin en Yvelines, 2006. http://www.theses.fr/2006VERS0038.
Der volle Inhalt der QuelleDynamic architectural change is an active area of research within the software architecture community. The objective of all architecture reconfiguration, adaptation and evolution is to improve the quality attributes of the software architecture. Runtime reconfiguration of software architectures has only recently gained considerable importance for the construction of reliable evolutionary systems. The structure of these systems is dynamic and continuously changing; consequently, architectures must be able to react to events and perform architectural changes autonomously. In this thesis, we provide a new approach based on a multi-agent software platform. The agents are used to supervise the architecture, gather information from it and its environment, capture dynamic changes, and manage them. They monitor the components dynamically, adapt them to structural changes in the architecture, and dynamically evaluate the quality attributes of the architecture. This evaluation ensures correctness, robustness, security, availability, etc., as the changes take place, so that the system conforms to its architecture and remains in conformance throughout its lifetime.
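A minimal sketch of this supervision pattern, with hypothetical component and agent names that are not drawn from the platform described in the thesis: an agent observes a quality attribute of a component and triggers a reconfiguration action when a threshold is violated.

    class Component:
        """A managed architectural component exposing a simple health metric."""
        def __init__(self, name):
            self.name = name
            self.error_rate = 0.0
            self.active = True

    class MonitoringAgent:
        """Supervises a component and reconfigures when a quality threshold is violated."""
        def __init__(self, component, spare, max_error_rate=0.1):
            self.component = component
            self.spare = spare                  # replacement component kept on standby
            self.max_error_rate = max_error_rate

        def observe_and_adapt(self):
            if self.component.error_rate > self.max_error_rate:
                # reconfiguration: deactivate the faulty component, activate the spare
                self.component.active = False
                self.spare.active = True
                print(f"reconfigured: {self.component.name} -> {self.spare.name}")

    primary, backup = Component("billing-v1"), Component("billing-v2")
    backup.active = False
    agent = MonitoringAgent(primary, backup)
    primary.error_rate = 0.25    # simulated degradation observed by the agent
    agent.observe_and_adapt()

In the thesis's setting, such agents run continuously at the architectural level and also assess attributes like robustness, security and availability rather than a single error rate.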
Glory, Anne-Cécile. „Vérification de propriétés de programmes flots de données synchrones“. Grenoble 1, 1989. http://tel.archives-ouvertes.fr/tel-00335630.
Der volle Inhalt der Quelle