Dissertations / Theses on the topic 'Exigences des modèles de confiance'
Abidi, Rihab. "Smart Road Signs based trust management models for cooperative Intelligent Transportation Systems." Electronic Thesis or Diss., Normandie, 2024. http://www.theses.fr/2024NORMR111.
The increasing complexity of urban traffic systems has made congestion a significant challenge, with severe economic, environmental, and social impacts. Intelligent Transportation Systems (ITSs) have emerged as a promising solution, enabling dynamic traffic management. However, the reliability of data within ITSs is an increasingly significant challenge: erroneous data introduced by defective or malicious sensors can lead to malfunctions or intentional disruptions of the system. In this context, trust management models become crucial. Most existing trust models propose vehicle-centric approaches, but the high mobility and dynamic nature of ITS environments affect the stability and scalability of such systems. Accordingly, the overall goal of this thesis is to propose novel trust models designed specifically for ITSs that enhance the accuracy, security, scalability and stability of traffic information dissemination. First, we proposed a generic architecture for a trust framework leveraging Smart Road Signs (SRSs), designed on the basis of a thorough investigation of the state of the art. This framework was then developed into two novel trust models. The first model considers contextual information and multi-source data aggregation to assess the trustworthiness of reported traffic events and of the different nodes of the network. It applies a bi-level trust evaluation combining Bayesian inference and a dynamic weighted-sum approach, and a predictive Bayesian inference scheme is proposed to further improve the accuracy of trust evaluation. Thereafter, a communication trust model was proposed to complement the previous contribution, using Quality of Service (QoS) metrics to evaluate SRS behaviour.
This model introduces a self-organizing trust scheme to track the SRSs' behaviour and establish stable environments using fuzzy-based Dempster-Shafer Theory (DST). We consider a realistic scenario where all nodes are vulnerable to attacks and failures; the main objective is therefore to keep the system operational even in hostile environments, mitigating the single-point-of-failure vulnerability inherent to centralized network architectures. The proposed models were validated through simulations, showing their effectiveness in identifying malicious nodes and mitigating erroneous traffic reports. The results demonstrate that multi-source data aggregation and context-aware information increase the accuracy of trust evaluation. Furthermore, an infrastructure-based framework leveraging a decentralized, hierarchical architecture enhances the scalability and stability of the trust models, which suits such environments.
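As a rough illustration of the kind of bi-level evaluation this abstract names (Bayesian inference combined with a dynamic weighted sum), the following sketch maintains a Beta-distribution trust score per node and blends it with a contextual score. All names, parameters and the 0.7 weight are hypothetical, not taken from the thesis.

```python
# Illustrative sketch (not the thesis's actual model): a Beta-Bernoulli
# trust score per node, combined with contextual evidence in a weighted sum.
from dataclasses import dataclass

@dataclass
class NodeTrust:
    alpha: float = 1.0  # pseudo-count of confirmed (truthful) reports
    beta: float = 1.0   # pseudo-count of refuted (false) reports

    def update(self, report_confirmed: bool) -> None:
        # Bayesian update of the Beta posterior after one observed report.
        if report_confirmed:
            self.alpha += 1.0
        else:
            self.beta += 1.0

    def direct_trust(self) -> float:
        # Posterior mean of the Beta distribution.
        return self.alpha / (self.alpha + self.beta)

def combined_trust(node: NodeTrust, context_score: float, w_direct: float = 0.7) -> float:
    """Weighted sum of direct (Bayesian) trust and a contextual score in [0, 1]."""
    return w_direct * node.direct_trust() + (1.0 - w_direct) * context_score

node = NodeTrust()
for confirmed in [True, True, False, True]:
    node.update(confirmed)
print(round(node.direct_trust(), 3))      # → 0.667 (posterior mean 4/6)
print(round(combined_trust(node, 0.5), 3))  # → 0.617
```

In a full model the weight `w_direct` would itself be dynamic (e.g. depending on how much direct evidence has accumulated), which is one plausible reading of the "dynamic weighted sum" the abstract mentions.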
Humbert, Sophie. "Déclinaison d'exigences de sécurité, du niveau système vers le niveau logiciel, assistée par des modèles formels." Bordeaux 1, 2008. http://www.theses.fr/2008BOR13580.
Brottier, Erwan. "Acquisition et analyse des exigences pour le développement logiciel : une approche dirigée par les modèles." Phd thesis, Université Rennes 1, 2009. http://tel.archives-ouvertes.fr/tel-00512174.
Gam, El Golli Inès. "Ingéniérie des exigences pour les systèmes d'information décisionnels : concepts, modèles et processus - la méthode CADWE." Phd thesis, Université Panthéon-Sorbonne - Paris I, 2008. http://tel.archives-ouvertes.fr/tel-00363878.
Gam, Inès. "Ingénierie des exigences pour les systèmes d'information décisionnels : concepts, modèles et processus : la méthode CADWE." Paris 1, 2008. http://www.theses.fr/2008PA010032.
Bensimhon, Larry. "Excès de confiance et mimétisme informationnel sur les marchés d'actions." Paris 1, 2006. http://www.theses.fr/2006PA010081.
Dechartre, Philippe. "Les exigences de fonctionnement de l'entreprise : l'acteur doit-il devenir une donnée ou peut-il être un paramètre ?" Paris 9, 1993. https://portail.bu.dauphine.fr/fileviewer/index.php?doc=1993PA090038.
Full textLe, Pors Éric. "Interprétation sémantique des exigences pour l'enrichissement de la traçabilité et pour l'amélioration des architectures de systèmes complexes." Télécom Bretagne, 2010. http://www.theses.fr/2010TELB0127.
Full textThe increasing complexity of software-intensive systems produced in the industry is related to the complexity of engineering: reduction of production delays, activities organisation, managing subcontractors. . . This complexity is also due to the increasing number of new functions and data to be processed. Moreover, it depends on the number of physical and software components to interact, constrained in terms of functionality and quality, which implies ever more difficult designs for these systems. To control their production, a high level of expertise is required in more and more domains. Moreover, this knowledge must be shared by many teams collaborating on building specification and implementing separate and specific entities of the system. System Engineering process describes the different building phases, from operational need analysis, through system need analysis, to the establishment of an architectural solution that meets the functional need expressed by the customer. This need is formalized in requirements. This PhD work proposes an approach to model architectural solutions by expressing the required functions associated with their non-functional constraints using different viewpoints. These viewpoints can represent different facets of the architectural solution, separing thus the global complexity. Additionally, controls allow to ensure that the solution is compliant with coherence and design rules. We also propose a method coupled with the realization of a conceptual model of the system and its environment, in order to obtain better written requirements. The conceptual model allows us to make an interpretation of the semantics present in these requirements in order to extract control and verification elements. These items captured in requirements will annotate elements of different viewpoints, representing the architecture, by exploiting traceability links established by design engineers. 
The verification of these annotations consideration and the automatic generation of viewpoints from information contained in the requirements allows us to obtain architectural solutions closer to the customer needs. The conceptual model also gives us the opportunity to capitalize, in different expertise domains, on engineers knowledge. It also enables us to establish a reference model as a basis for discussion and training of new team members
Chabot, Martial. "Tests automatisés dirigés par les exigences pour systèmes cyber-physiques." Thesis, Université Grenoble Alpes (ComUE), 2018. http://www.theses.fr/2018GREAM056/document.
Nowadays, many major manufacturers in different fields are working towards the design of smart products to meet new market needs. The design of these systems is increasingly complex, as they are composed of many physical components controlled by applications running on processors. To support this multi-disciplinary design, the solution we propose in this thesis is to guide system modeling and design by taking into account the test scenarios that will be used to validate the system's requirements. Our method reasons at the system level and starts the design process by formalizing validation tests: in other words, by specifying the acceptance criteria for each requirement as well as the test scenario necessary to verify it. Formalizing the tests in this way makes it possible to analyse the formulation of the requirements themselves and to remove ambiguity. We propose a generic model of the structural view of the test infrastructure, with an associated UML profile; the behavioral view is modeled as SysML sequence diagrams. The test infrastructure interfaces impose testability constraints on the system to be designed. We have developed a tool, ARES (Automatic GeneRation of Executable Tests from SysML), which automatically transforms this structural/behavioral specification of the tests into simulatable or executable scenarios. These scenarios, analogous by construction, are used to validate simulatable models of the system (Matlab/Simulink) and then during final product verification (in a TestStand environment). We present the application of this tool to various case studies associated with Schneider Electric products.
Benson, Marie Anne. "Pouvoir prédictif des données d'enquête sur la confiance." Master's thesis, Université Laval, 2021. http://hdl.handle.net/20.500.11794/69497.
Confidence survey data are time series containing the responses to questions aiming to measure the confidence and expectations of economic agents about future economic activity. The richness of these data and their availability in real time attract the interest of many forecasters, who see them as a way to improve traditional forecasts. In this thesis, I assess the predictive power of survey data for the future evolution of Canadian GDP, comparing the forecasting performance of the Conference Board of Canada's own confidence indices with indicators I construct using principal component analysis. Using three simple linear models, I carry out an out-of-sample forecasting experiment with rolling windows over the period 1980 to 2019. The results show that principal component analysis provides better-performing indicators than the indices produced by the Conference Board. However, the results cannot clearly show that confidence unambiguously improves forecasts once the lagged growth rate of GDP is added to the analysis.
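As a minimal sketch of the indicator construction this abstract describes (an assumed setup, not the thesis's actual code or data), a confidence indicator can be extracted as the first principal component of the standardized survey-response series:

```python
# Illustrative sketch: build a confidence indicator as the first principal
# component (PC1) of k standardized survey series observed over T periods.
import numpy as np

def first_principal_component(X: np.ndarray) -> np.ndarray:
    """X: (T, k) matrix of k survey series over T periods; returns PC1 scores."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)       # standardize each series
    # Eigen-decomposition of the sample correlation matrix (eigh: ascending).
    eigvals, eigvecs = np.linalg.eigh(Z.T @ Z / len(Z))
    loading = eigvecs[:, -1]                        # eigenvector of largest eigenvalue
    return Z @ loading                              # indicator: scores on PC1

# Synthetic check: 5 noisy copies of one latent "confidence" factor.
rng = np.random.default_rng(0)
common = rng.normal(size=200)
X = np.column_stack([common + 0.3 * rng.normal(size=200) for _ in range(5)])
indicator = first_principal_component(X)
# PC1 should track the common factor up to sign and scale.
print(abs(np.corrcoef(indicator, common)[0, 1]) > 0.9)  # → True
```

In the forecasting experiment the abstract mentions, such an indicator would then enter a simple linear model of future GDP growth alongside lagged growth, re-estimated on each rolling window.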
Pétin, Jean-François. "Méthodes et modèles pour un processus sûr d'automatisation." Habilitation à diriger des recherches, Université Henri Poincaré - Nancy I, 2007. http://tel.archives-ouvertes.fr/tel-00202431.
Duffet, Carole. "Quelle confiance accorder au modèle solution de la tomographie de réflexion 3D ?" Montpellier 2, 2004. http://www.theses.fr/2004MON20222.
Sannier, Nicolas. "INCREMENT : une approche hybride pour modéliser et analyser dans le large les exigences réglementaires de sûreté." Phd thesis, Université Rennes 1, 2013. http://tel.archives-ouvertes.fr/tel-00941881.
Full textLacroix, Julien. "Vers un cloud de confiance : modèles et algorithmes pour une provenance basée sur les contrôles d'accès." Thesis, Aix-Marseille, 2015. http://www.theses.fr/2015AIXM4365.
This document is the culmination of three years of doctoral work. Having introduced and clarified the general issue of my thesis subject, i.e. "how to use provenance data to enforce trust in the Cloud?", I present the concepts, models and languages related to my thesis and the state of the art that can partially address this issue. Secondly, I present the provenance-based solution I bring to access control in distributed systems such as the Cloud: PBAC². It combines provenance models (PROV-DM) and access controls (generic rules of RBAC type with regimentation and regulation policies). The system uses a central execution engine, the mediator, to enforce security and foster trust in the Cloud by checking rules over the part of the retrospective provenance graph it receives. Furthermore, I describe three extensions of PBAC²: (1) the integration of the PROV-O ontology, with its pros and cons regarding the size of the (sub)graph received by the mediator; (2) the adaptation of PBAC² to the regulation security approach; (3) the translation of PBAC² rules into PROV-CONSTRAINTS constraints. PBAC² is applied to a realistic example from the healthcare sector, and a prototype with demonstrations on practical examples, on a local machine and on a real Cloud system, illustrates the scope of this work. I conclude the thesis by proposing four perspectives for future work.
Arlot, Sylvain. "Rééchantillonnage et Sélection de modèles." Phd thesis, Université Paris Sud - Paris XI, 2007. http://tel.archives-ouvertes.fr/tel-00198803.
Most of this thesis work concerns the precise calibration of model selection methods that are optimal in practice, for the prediction problem. We study V-fold cross-validation (very commonly used, but poorly understood in theory, notably regarding the choice of V) and several penalization methods. We propose methods for precisely calibrating penalties, both in their general form and in their multiplicative constants. The use of resampling makes it possible to solve difficult problems, notably regression with a variable noise level. We validate these methods theoretically from a non-asymptotic viewpoint, by proving oracle inequalities and adaptation properties; these results rest, among other things, on concentration inequalities.
A second problem we address is that of confidence regions and multiple testing, given high-dimensional observations with general, unknown correlations. Resampling methods make it possible to escape the curse of dimensionality and to "learn" these correlations. We propose two main methods and prove a non-asymptotic control of the level of each.
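The V-fold cross-validation studied in this thesis can be sketched generically as follows (an illustration of the basic procedure only, not Arlot's calibrated penalties; the polynomial-regression example and all names are assumptions):

```python
# Generic V-fold cross-validation for model selection (illustration only;
# the thesis studies its calibration, e.g. the choice of V and penalty terms).
import numpy as np

def vfold_cv_error(X, y, fit, predict, V=5):
    """Estimate the prediction error of a learning rule by V-fold CV."""
    n = len(y)
    folds = np.array_split(np.random.permutation(n), V)
    errors = []
    for fold in folds:
        train = np.setdiff1d(np.arange(n), fold)
        model = fit(X[train], y[train])        # train on the other V-1 folds
        pred = predict(model, X[fold])         # predict on the held-out fold
        errors.append(np.mean((pred - y[fold]) ** 2))
    return float(np.mean(errors))              # average held-out squared error

# Example: choose a polynomial degree by CV on noisy quadratic data.
rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 300)
y = 1 + 2 * x + 3 * x**2 + 0.1 * rng.normal(size=300)
X = x[:, None]

def make_fit(degree):
    return lambda Xtr, ytr: np.polyfit(Xtr[:, 0], ytr, degree)

def poly_predict(coefs, Xte):
    return np.polyval(coefs, Xte[:, 0])

scores = {d: vfold_cv_error(X, y, make_fit(d), poly_predict) for d in (1, 2, 8)}
# The underfitting degree-1 model is clearly rejected; degree 2 (the truth)
# and the slightly overfitting degree 8 are close.
print(min(scores, key=scores.get))
```

The theoretical question the abstract raises is precisely how the choice of V and of the penalty trades off the bias and variance of such estimates.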
Rosas, Erika. "Services à base de communautés de confiance dans les réseaux P2P." Paris 6, 2011. http://www.theses.fr/2011PA066573.
Mekki, Ahmed. "Contribution à la Spécification et à la Vérification des Exigences Temporelles : Proposition d'une extension des SRS d'ERTMS niveau 2." Phd thesis, Ecole Centrale de Lille, 2012. http://tel.archives-ouvertes.fr/tel-00710674.
Full textBlouin, Dominique. "Modeling languages for requirements engineering and quantitative analysis of embedded systems." Lorient, 2013. http://www.theses.fr/2013LORIS313.
Model-driven engineering is an approach that aims to master the growing complexity of systems. A critical phase of this approach is requirements engineering, whose purpose is to correctly formulate the problem to be solved by the system under development. A requirements specification must be coupled with a design specification of the system representing a solution to the formulated problem. Several architecture description languages (ADLs) have been proposed for modeling systems and analysing their non-functional properties (NFPs). However, some of these languages lack means for modeling the problem domain and for estimating NFPs. To address these issues, this thesis proposes two new languages that can be combined with ADLs to fill these gaps. RDAL (Requirements Definition and Analysis Language) supports the modeling, analysis and verification of a system's requirements, including means to formalize good requirements engineering practices. QAML (Quantitative Analysis Modeling Language) represents NFP analysis models so that they can be integrated into a model of a given ADL; these models are then automatically interpreted to provide estimates of the NFPs concerned, ensuring their consistency with the evolving design model. QAML is also useful for representing component datasheets, easing the integration of their data into the design flow. Both languages were validated with AADL models, demonstrating their ability to detect design errors.
Samih, Hamza. "Test basé sur les modèles appliqué aux lignes de produits." Thesis, Rennes 1, 2014. http://www.theses.fr/2014REN1S109/document.
Software product line engineering is an approach that supports developing products in families. These products are described by common and variable features. Currently, the validation activity is disconnected from the product line development process; in a product line context, the effort and resources spent on test campaigns for each product can be optimized. Model-based testing is a technique for automatically generating a suite of test cases from requirements. In this thesis, we present an approach to test a software product line with model-based testing. The technique is based on an algorithm that establishes the relationship between the variability model, expressed with OVM, and the test model, using the traceability of functional requirements present in both formalisms. Our contribution is an algorithm that automatically extracts a product test model. It is illustrated on a real industrial case of automotive dashboards and evaluated by an industrial partner from the aeronautics domain in the context of the MBAT European project.
Mekki, Ahmed. "Contribution à la Spécification et à la Vérification des Exigences Temporelles : Proposition d’une extension des SRS d’ERTMS niveau 2." Thesis, Ecole centrale de Lille, 2012. http://www.theses.fr/2012ECLI0006/document.
The work developed in this thesis aims to assist the engineering of temporal requirements for time-constrained complex systems. Our contributions concern three phases: specification, behaviour modelling and verification. For the specification of temporal requirements, we introduce a new typology of temporal properties covering the common requirements met in requirements specification; to facilitate their expression, we propose a structured English grammar. Nevertheless, even if each requirement taken individually is correct, there is no guarantee that a set of temporal properties is consistent, so we propose an algorithm based on graph-theoretic techniques to check the consistency of sets of temporal requirements. For behaviour modelling, we propose an algorithm for transforming UML state machines with time annotations into timed automata (TA). The idea is to let the user manipulate a fairly intuitive notation (UML state machine diagrams) during the modelling phase while automatically generating formal models (TA) that can be used directly by the verification process. Finally, for the verification phase, we adopt an observer-based technique: we have developed a repository of observation patterns, each relative to a particular class of temporal requirements in our classification. The verification process is thereby reduced to a reachability analysis of the observers' KO states, which correspond to requirement violations.
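One simple instance of the kind of graph-based consistency check this abstract mentions (an assumed setup, not the thesis's actual algorithm): if requirements are reduced to precedence constraints "event A must occur before event B", a cycle in the induced directed graph proves the set is inconsistent.

```python
# Illustration: detect inconsistency in a set of precedence-style temporal
# requirements ("a must occur before b") by searching for a cycle in the
# induced directed graph. A cycle means no execution can satisfy all of them.
from collections import defaultdict

def consistent(requirements):
    """requirements: iterable of (before, after) event pairs."""
    graph = defaultdict(list)
    for before, after in requirements:
        graph[before].append(after)

    WHITE, GRAY, BLACK = 0, 1, 2
    color = defaultdict(int)

    def has_cycle(node):
        color[node] = GRAY                 # node is on the current DFS path
        for succ in graph[node]:
            if color[succ] == GRAY:
                return True                # back edge: contradiction found
            if color[succ] == WHITE and has_cycle(succ):
                return True
        color[node] = BLACK                # fully explored, no cycle through it
        return False

    return not any(color[n] == WHITE and has_cycle(n) for n in list(graph))

print(consistent([("start", "brake"), ("brake", "stop")]))  # → True
print(consistent([("a", "b"), ("b", "c"), ("c", "a")]))     # → False
```

Real temporal requirements (deadlines, delays, invariance windows) need richer machinery than pure precedence, but the cycle-detection idea conveys why a graph formulation makes inconsistency detectable mechanically.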
Taffo, Tiam Raoul. "Modèles opérationnels de processus métier et d'exigences variables pour le développement de lignes de produits logiciels." Thesis, Montpellier, 2015. http://www.theses.fr/2015MONTS268.
Any organization involved in software engineering has to reduce production time and cost in order to remain competitive. This imperative of software economics translates into the goal of better controlling developer productivity. Software reuse is a preferred way to increase productivity, particularly when it is systematized. Two types of activities should be considered to improve software reuse: development for reuse and development by reuse. Several solutions have been proposed to improve development for reuse. For its part, the product line approach is distinguished by its contribution to development by reuse, through support for and automation of the selection, configuration, and derivation of new products. However, although this approach positions reuse as a core activity of its engineering process, reuse remains difficult to achieve in many situations, for example owing to the lack of specification or management of the variability that may occur in the artifacts of every step of the engineering process. In this context, the general issue of this thesis is the industrialization of software product lines, by contributing to the systematization of reuse at each step and the automation of transitions between steps. To better support business agility, our first goal is the specification of variability within business process models, so that they become directly usable in a software factory. Our second goal is to introduce variability specification into requirements engineering, enabling systematic reuse of requirements models and establishing traceability links with the preceding variable business process models. A (service-oriented) architecture model can then be generated in the software factory as an implementation of the modeled business processes, in compliance with the specified requirements.
Revault, d'Allonnes Adrien. "Evaluation sémantique d'informations symboliques : la cotation." Paris 6, 2011. http://www.theses.fr/2011PA066395.
Full textLebeaupin, Benoit. "Vers un langage de haut niveau pour une ingénierie des exigences agile dans le domaine des systèmes embarqués avioniques." Thesis, Université Paris-Saclay (ComUE), 2017. http://www.theses.fr/2017SACLC078/document.
Systems are becoming more and more complex because, to stay competitive, the companies that design them seek to add ever more functionality. This competition also implies that design must be reactive, so that the system can evolve during its conception and follow the needs of the market. This capacity to flexibly design complex systems is hindered, or even prevented, by various elements, one of them being the system specifications. In particular, the use of natural language to specify systems has several drawbacks. First, natural language is inherently ambiguous, which can lead to non-conformity if the customer and the supplier of a system disagree on the meaning of its specification. Natural language is also hard to process automatically: for example, it is hard to determine, using only a computer program, that two natural-language requirements contradict each other. Yet natural language is currently unavoidable in the specifications we studied, because it remains very practical and is the most common medium of communication. We aim to complete natural-language requirements with elements that make them less ambiguous and ease automatic processing. These elements can be parts of models (architectural models, for example) and define the vocabulary and syntax of the requirements. We experimented with the proposed principles on real industrial specifications and developed a software prototype for testing a specification enhanced with these vocabulary and syntax elements.
Touzani, Mounir. "Extension de l’ingénierie des exigences à l’information spatio-temporelle : apports dans le contexte des systèmes d’information de gestion." Thesis, Montpellier, 2016. http://www.theses.fr/2016MONTT309/document.
In a world where people and objects are increasingly connected and localized, geographic information (GI) is very present in our daily life, and its inclusion in management information systems is becoming essential. Individuals and enterprises increasingly rely on it to orient themselves in space, access various georeferenced statistical data, plan travel... Current developments in mobile systems, for example, involve a spatio-temporal dimension that was traditionally reserved for geographic information systems (GIS). Many software systems are therefore required to maintain a very close and precise relationship with the real world, opening new fields of application such as smart cities, factories of the future or a new generation of logistics systems. The magnitude of this change is major: in terms of data, at least 80% is concerned (http://www.esrifrance.fr/sig1.aspx). However, analysts often face difficulties in capturing requirements in general, which calls for an organized and systematic approach. It is in this context that we direct our study to the field of requirements engineering (RE), in order to better take the spatio-temporal dimension into account. This is a key step in the elaboration of such requirements in a management information system development project. The proposed framework includes contributions in the fields of RE and geomatics. We build in particular on the KAOS method, which offers a goal-oriented requirements engineering approach supported by a tool named "Objectiver". First, we propose an extension of the KAOS methodology to the spatio-temporal dimension. KAOS already answers the questions "WHY", "HOW", "WHAT" and "WHO"; our research specifically addresses "WHEN" and "WHERE". We pursue two lines of research: the first explores the duality between the spatial and temporal dimensions in order to transpose to the spatial dimension requirements engineering techniques already defined for time.
The second considers notations widely used in GIS and integrates them into requirements engineering primitives, thereby facilitating the capture of spatio-temporal requirements. We built a prototype using the "Objectiver" tool, although the results presented are applicable to other methods and tools. To exploit existing systems as far as possible, we propose, as a second step, to examine strategies for integrating existing geomatics data and/or service components to meet the identified needs. We believe the users of these information systems must be able to integrate spatio-temporal aspects into their management or business rules. This raises the question of how to identify the spatio-temporal aspects of business rules through an RE process, which leads us to reflect on the construction of a management information system capable of separating the business view from the system view. We show specifically how business rules can be identified on the basis of spatio-temporal aspects. We have tooled our contribution and illustrate it through a real case study of the merger of two universities. Finally, we show through this same case study how to deploy such rules in the most appropriate components, ensuring a secure and open architecture.
Dragomir, Iulia. "Conception et vérification d'exigences de sûreté temporisées à base de contrats dans les modèles SysML." Toulouse 3, 2014. http://thesesups.ups-tlse.fr/2510/.
Nowadays computer systems grow larger and more complex. Embedded in devices from domains like avionics, aeronautics and consumer electronics, they are often considered critical with respect to human life, costs and environment. Developing safe and reliable critical real-time embedded systems is a challenging task, since errors are accidentally introduced during design. One way for system designers to tackle this issue is to use a compositional design technique based on components and driven by requirements: it allows one to infer, from global requirements, component properties that must hold locally. Contract-based reasoning derives correct components compositionally from global system requirements by interposing abstract, partial specifications for components. Informally, a contract models the abstract behavior a component exhibits from the point of view of the requirement to be satisfied (i.e. the guarantee) in a given context (i.e. the assumption). Contracts can be used to decompose and trace requirements during iterative design, but also to perform compositional verification of requirement satisfaction. In this thesis, we present a methodology for reasoning with contracts during system design and verification within SysML. We define the syntax for contracts in UML/SysML, as well as a set of refinement relations between contracts and/or components, in order to prove the system's correctness with respect to requirements. We then provide a formal framework that models the semantics of a UML/SysML model extended with contracts, mapping the language concepts to a variant of Timed Input/Output Automata. The refinement relations are formalized in terms of trace inclusion, and compositional properties are proved to hold, which ensures the soundness of the methodology. The approach is instantiated for the OMEGA Profile and the IFx2 toolset, with partial automatic generation of proof obligations.
Finally, the approach is applied to several case studies, including an industry-grade system model, which demonstrate its efficiency through comparative verification results.
Minier, Marine. "Quelques résultats en cryptographie symétrique, pour les modèles de confiance dans les réseaux ambiants et la sécurité dans les réseaux de capteurs sans fil." Habilitation à diriger des recherches, INSA de Lyon, 2012. http://tel.archives-ouvertes.fr/tel-00918938.
Full textHamdi, Sana. "Computational models of trust and reputation in online social networks." Thesis, Université Paris-Saclay (ComUE), 2016. http://www.theses.fr/2016SACLL001/document.
Full textOnline Social Networks (OSNs) have known a dramatic increase and they have been used as means for a rich variety of activities. In fact, within OSNs, usersare able to discover, extend, manage, and leverage their experiences and opinionsonline. However, the open and decentralized nature of the OSNs makes themvulnerable to the appearance of malicious users. Therefore, prospective users facemany problems related to trust. Thus, effective and efficient trust evaluation isvery crucial for users’ decision-making. It provides valuable information to OSNsusers, enabling them to make difference between trustworthy and untrustworthyones. This thesis aims to provide effective and efficient trust and reputationmanagement methods to evaluate trust and reputation of OSNs users, which canbe divided into the following four contributions.The first contribution presents a complex trust-oriented users’ contexts andinterests extraction, where the complex social contextual information is taken intoaccount in modelling, better reflecting the social networks in reality. In addition,we propose an enrichment of the Dbpedia ontology from conceptualizations offolksonomies.We second propose the IRIS (Interactions, Relationship types and Interest Similarity)trust management approach allowing the generation of the trust networkand the computation of direct trust. This model considers social activities of usersincluding their social relationships, preferences and interactions. The intentionhere is to form a solid basis for the reputation and indirect trust models.The third contribution of this thesis is trust inference in OSNs. In fact, it isnecessary and significant to evaluate the trust between two participants whomhave not direct interactions. We propose a trust inference model called TISON(Trust Inference in Social Networks) to evaluate Trust Inference within OSNs.The fourth contribution of this thesis consists on the reputation managementin OSNs. 
To manage reputation, we propose two new algorithms. We introduce a new exclusive algorithm for clustering users based on reputation, called RepC, which is based on the trust network. In addition, we propose a second algorithm, FCR, which is a fuzzy extension of RepC. For the proposed approaches, extensive experiments have been conducted on real and random datasets. The experimental results demonstrate that our proposed algorithms generate better results, in terms of the utility of the delivered results and efficiency, than the pioneering approaches in the literature.
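The abstract above describes RepC as an exclusive clustering of users driven by reputation scores derived from a trust network. A minimal illustrative sketch of that idea follows; the function names, the averaging of incoming trust, and the equal-width bucketing are assumptions for illustration, not the thesis's actual algorithm.

```python
# Hypothetical sketch: reputation as mean incoming trust, then exclusive
# clustering into k reputation bands. Trust values are assumed to lie in [0, 1].

def reputation_scores(trust_edges):
    """Estimate each user's reputation as the mean trust received from others."""
    received = {}
    for (_, target), trust in trust_edges.items():
        received.setdefault(target, []).append(trust)
    return {user: sum(vals) / len(vals) for user, vals in received.items()}

def cluster_by_reputation(scores, k=3):
    """Exclusively partition users into k equal-width reputation bands."""
    clusters = [[] for _ in range(k)]
    for user, score in scores.items():
        band = min(int(score * k), k - 1)  # clamp score == 1.0 into the top band
        clusters[band].append(user)
    return clusters

edges = {("a", "b"): 0.9, ("c", "b"): 0.8, ("a", "c"): 0.4, ("b", "d"): 0.1}
print(cluster_by_reputation(reputation_scores(edges), k=3))
```

A fuzzy variant in the spirit of FCR would replace the hard band assignment with a membership degree per band.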
Saba, Stéphanie. "Marché et réseaux : l'influence des liens interindividuels sur l'efficacité des échanges." Thesis, Paris 2, 2016. http://www.theses.fr/2016PA020008/document.
Full text
How to define and measure trust is still an enigma in economics, philosophy and sociology. This "three papers" thesis compares two different mechanisms, negotiated (decentralised submarket) and auction (centralised submarket), on the basis of trust. Through an empirical study, the level of trust is evaluated and its impact is analysed on the "Boulogne-sur-Mer" fish market, characterised by a stable coexistence of these two mechanisms. The three papers are preceded by a general introduction and a literature review. Paper one aims at comparing the nestedness and the robustness of both submarkets; social network tools from ecology are applied in order to provide an answer. Paper two models trust creation on both structures from the buyer side, using survival analysis and considering buyer size. Paper three studies the effect of a trust index on the outcomes of transactions. Bipartite and projected graphs reveal the difference between submarkets. This thesis shows that the negotiated market is marked by a higher level of trust, as agents interact and are not fully informed about the market situation, unlike the auction market where information is centralised. We believe that trust is a way out of risk when there is a lack of information.
Coudin, Élise. "Inférence exacte et non paramétrique dans les modèles de régression et les modèles structurels en présence d'hétéroscédasticité de forme arbitraire." Thèse, Paris, EHESS, 2007. http://hdl.handle.net/1866/1506.
Full text
Lerasle, Matthieu. "Rééchantillonnage et sélection de modèles optimale pour l'estimation de la densité." Toulouse, INSA, 2009. http://eprint.insa-toulouse.fr/archive/00000290/.
Full textGama, Batista João Da. "Phénomènes collectifs déstabilisateurs dans les systèmes socio-économiques." Thesis, Université Paris-Saclay (ComUE), 2015. http://www.theses.fr/2015SACLC026/document.
Full text
This thesis reports on two different research topics belonging to the same project. The first research avenue, which is thoroughly explained in chapter 3, is a modelling approach to the dynamics of trust in a networked society. The second, whose description can be found in chapter 4, is an experimental approach to studying human decisions when people trade an asset with a positive average growth per period in a controlled laboratory environment. One of the common links between these two topics is collective action, which is a key player in a number of phenomena, for example in the dynamics of panic, bankruptcies and, consequently, systemic risk. Therefore, the author hopes that this work will contribute to the study of collective action phenomena, especially in the field of quantitative finance, in which it is more likely that the specific findings from the above-mentioned trust model and trading experiment can be used in their present form.
Duong, Quoc bao. "Approche probabiliste pour l'estimation dynamique de la confiance accordée à un équipement de production : vers une contribution au diagnostic de services des SED." Phd thesis, Université de Grenoble, 2012. http://tel.archives-ouvertes.fr/tel-00910079.
Full text
Duong, Quoc Bao. "Approche probabiliste pour l'estimation dynamique de la confiance accordée à un équipement de production : vers une contribution au diagnostic de services des SED." Thesis, Grenoble, 2012. http://www.theses.fr/2012GRENT102/document.
Full text
The work that we present in this paper contributes to the field of supervision, monitoring and control of services of complex discrete event systems. It is placed in the context of random failure occurrence in operative parts, where we focus on providing tools to maintenance teams by locating the possible origin of potential product defects: better localisation means better maintenance, and thus effectively minimises equipment drift time. If the production equipment were able to detect such drifts, the problem could be considered simple; however, metrology equipment adds to the complexity. In addition, because it is impossible to equip the production equipment with a sensor system that comprehensively covers all the parameters to be observed, because sensor reliability varies over time, and because production environments are stressed, we propose a probabilistic approach based on a Bayesian network to estimate, in real time, the confidence that can be placed in the production equipment's operation.
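The core idea sketched in this abstract, updating confidence in a piece of equipment from imperfect sensor observations, can be illustrated with a simple sequential Bayesian update. This is a hedged sketch only: the thesis builds a full Bayesian network, and the sensor reliability figures below are made-up assumptions.

```python
# Illustrative Bayesian update of P(equipment OK) from boolean pass/fail
# readings taken by an imperfect sensor. The likelihoods are hypothetical:
# p_pass_given_ok = P(reading passes | equipment OK),
# p_pass_given_ko = P(reading passes | equipment faulty).

def update_confidence(prior_ok, readings, p_pass_given_ok=0.95, p_pass_given_ko=0.30):
    """Posterior P(equipment OK) after a sequence of boolean 'pass' readings."""
    belief = prior_ok
    for passed in readings:
        like_ok = p_pass_given_ok if passed else 1 - p_pass_given_ok
        like_ko = p_pass_given_ko if passed else 1 - p_pass_given_ko
        evidence = like_ok * belief + like_ko * (1 - belief)
        belief = like_ok * belief / evidence  # Bayes' rule
    return belief

# Two passing readings raise confidence; a subsequent failure lowers it again.
print(update_confidence(0.5, [True, True, False]))
```

In the thesis's setting, a Bayesian network would additionally model dependencies between equipment, sensors and products rather than a single binary state.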
Le, Quyet Thang. "Analyse de covariance généralisée et modélisation du dosage progressif en alimentation animale : application à l'étude du tourteau de colza." Paris 11, 1988. http://www.theses.fr/1988PA112019.
Full text
In a multivariate model where the variables are strongly correlated, an analysis of covariance (ANCOVA) is strongly advisable. The classical ANCOVA method may be used only if the covariate is not affected by the treatments in the experiment. F. Smith (1957) illustrated how an ANCOVA with a covariate affected by treatments leads to erroneous conclusions. We propose some generalized ANCOVA methods. Three methods are explained: the first two are used for the study of rapeseed-oil cake in poultry feeding, with experimental data from INRA; the third, concerning statistical synthesis, is presented only from a theoretical point of view. The introduction of a new feed can be hampered by the animals' lack of appetite for it. To increase the efficiency of the experiments, we propose a modelling of progressive dosage in animal feeding, using simultaneous confidence intervals for the "treatments vs. control" comparisons, with optimal allocations of treatments and control of the development of the experiment.
Amokrane, Nawel. "De l’ingénierie des besoins à l’ingénierie des exigences : vers une démarche méthodologique d’ingénierie de systèmes complexes, de vérification et de validation appliquée à l’informatisation des PME." Thesis, Montpellier, 2016. http://www.theses.fr/2016MONTT259/document.
Full text
Most enterprises, and especially SMEs, must develop their business in very competitive and rapidly changing fields, where they have to adapt to volatile customers who want products and services that are cheaper and better matched to their needs. The SME is then confronted with problems of responsiveness and flexibility in responding to these customers. As a result, it seeks to reduce costs and time to market and to provide high-quality, innovative goods and services. The SME's information system is an asset on which it can rely to implement this strategy, to maximize its responsiveness and flexibility, and to reach the sought-after profitability and quality. These are key qualities that guarantee autonomy and recognition, qualities that any SME badly needs. The information system is indeed the drive belt of information, not only inside the enterprise, between the decision and operational systems providing the enterprise's added value, but also within its environment, which includes its external partners. Part of this information system is computerized. It stores and processes the information needed by the different decision-making, business and support processes that serve the enterprise's strategy. It is crucial to understand the features, interfaces and data that make up this computerized system and to develop them according to the needs of the SME. The SME is therefore tempted to embark, alone or accompanied, on so-called computerization projects, i.e. projects for the development or improvement of its computerized system. We are interested in projects aimed at developing management applications for SMEs. The SME, then assuming the role of project owner, along with the development team, supporting the role of project management, have to share a common vision of the computerization needs. They are then called upon to jointly carry out requirements engineering (RE) activities.
RE guides the SME in describing and formalizing its needs. It then allows the development team to specify these needs more formally as requirements, which in turn define the required development work. RE is often carried out with the assistance of project-owner support. This crucial step remains difficult for SMEs. It is most often performed by the development team itself, to compensate for the SME's lack of resources, time and skills. However, the involvement of the SME's members is vital to the success of any computerization project, especially if it permanently affects the functioning of the enterprise. This work, developed through a collaboration with the company RESULIS, consisted in developing a requirements engineering method that offers SMEs concepts, simple languages, and modelling and verification means that are easily and intuitively manipulated and provide a sufficient and relevant formalization of the SME's requirements. This method is based on principles derived from both the enterprise modelling and systems engineering fields for requirements elicitation. Semi-formal verification and validation means are applied to guarantee some expected qualities of the resulting requirements. The method is also integrated into the model-driven development cycle to enable the a posteriori production of prototypes and to make the languages and tools used by the SME and the development team interoperable.
Chareton, Christophe. "Modélisation formelle d’exigences et logiques temporelles multi-agents." Thesis, Toulouse, ISAE, 2014. http://www.theses.fr/2014ESAE0020/document.
Full text
These studies relate to modelling languages for requirements engineering. The aim is to provide tools for evaluating agents' capacity to ensure the satisfaction of their assigned specifications (the "assignment problem"). To do so, we first develop a modelling language (Khi). Then we give a set of assignment correctness criteria for systems that are modelled as instances of Khi. We also develop a formal tool, USLKhi, for which we solve the model checking problem. We reduce the different correctness criteria for Khi instances to instances of the model checking problem for USLKhi. Thus, as a whole, our proposition makes it possible to express, formalise and solve the assignment problem. We also illustrate the presented concepts and tools with a case study featuring spatial observation missions.
Bouchard, Joëlle. "Prévision et réapprovisionnement dynamiques de produits de consommation à cycle rapide." Doctoral thesis, Université Laval, 2016. http://hdl.handle.net/20.500.11794/27421.
Full text
The retail industry is in upheaval. Many banners have recently announced their closure, such as Jacob, Mexx, Danier, Smart Set and Target in Canada, to name a few. To remain competitive and ensure their sustainability, companies have to adapt to new consumer buying habits. No doubt a better understanding of demand is needed. But how can demand be estimated and forecast in a context of constantly increasing volatility and ever shorter product lifecycles, where competitive pressure is pervasive and global? Managing demand is a difficult exercise, even in this age when numerous sophisticated tools have been developed. The dynamic environment in which organizations evolve explains the difficulty in part. Over the past 30 years, the customer has gone from passive spectator to leading actor, inevitably changing the consumer-business relationship. Technological development and the advent of e-commerce are also largely responsible for the profound changes experienced by businesses. The way of doing business is no longer the same, forcing companies to adapt to these new realities. The challenges are significant. Companies able to seize market signals will be better equipped to anticipate demand and make better decisions in response to customer needs. This thesis is articulated along three main lines of research around this theme, exploiting a real business testbed. The first concerns the development of a daily forecasting method adapted to sales data with double seasonality as well as special days. The second, twofold, line of research first presents a ratio-based forecasting method for quickly forecasting the sales or future demand of a very large number of products and their variants. It then proposes a method to determine cumulative sales forecasts and confidence intervals for products newly introduced in the chain's stores.
Finally, the third axis proposes a predictive method to support reorder launching and sizing decisions, based on the results of a predictive analysis, the deployment of targeted products, and the inventory of potential substitute products. Keywords: forecasting; seasonality; calendar effect; products without demand history; confidence intervals; replenishment decision; retail network.
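The ratio-based forecasting mentioned in this abstract can be sketched very simply: split an aggregate forecast across many products or variants in proportion to their historical sales. This is an illustrative assumption about the general technique, not the thesis's exact method.

```python
# Hypothetical sketch of ratio-based disaggregation: each variant receives a
# share of the aggregate forecast proportional to its past sales.

def ratio_forecasts(aggregate_forecast, history_by_variant):
    """Split a total forecast across variants proportionally to past sales."""
    total_history = sum(history_by_variant.values())
    return {variant: aggregate_forecast * past / total_history
            for variant, past in history_by_variant.items()}

# E.g. split a 1000-unit forecast over three sizes with past sales 120/300/180.
print(ratio_forecasts(1000, {"S": 120, "M": 300, "L": 180}))
```

For products without demand history, the abstract's second method instead builds cumulative forecasts with confidence intervals, which ratios alone cannot provide.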
Chevallereau, Benjamin. "Contribution des nouvelles approches de modélisation à la durabilité des applications." Phd thesis, Ecole centrale de nantes - ECN, 2011. http://tel.archives-ouvertes.fr/tel-00578443.
Full text
Louati, Amine. "Une approche multi-agents pour la composition de services Web fondée sur la confiance et les réseaux sociaux." Thesis, Paris 9, 2015. http://www.theses.fr/2015PA090035/document.
Full text
This thesis deals with service discovery, selection and composition problems. The aim is to fulfill a complex requester query. To do that, we propose a multi-agent approach based on trust and social networks. We define a trust model as a compositional concept that includes social, expert, recommender and cooperation-based components. The social-based component judges whether or not the provider is worth pursuing before using his services. The expert-based component estimates whether or not the service behaves well and as expected. The recommender-based component checks whether or not an agent is reliable and whether we can rely on its recommendations. The cooperation-based component allows agents to decide with whom to interact in a service composition. We propose a distributed algorithm for service discovery using trust between agents and referral systems in social networks. We also develop a new method based on a probabilistic model to infer trust between non-adjacent agents, while taking into account the roles of intermediate agents. Finally, we present an original coalition formation process, which is incremental, dynamic and overlapping, for service composition in social networks. Experimental results show that our multi-agent approaches are efficient, outperform existing similar ones, and can deliver more trustworthy results at a low communication cost.
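Inferring trust between non-adjacent agents, as this abstract describes, is often sketched as propagating direct trust along referral chains. The following is a minimal illustration under loudly stated assumptions: the multiplicative path aggregation and the choice of the best chain are common simplifications, not the thesis's actual probabilistic model, which also weights the roles of intermediate agents.

```python
# Hypothetical sketch of path-based trust inference between non-adjacent
# agents. Direct trust values are assumed to lie in [0, 1].

def path_trust(path, direct_trust):
    """Trust along one referral chain: the product of direct trust values."""
    trust = 1.0
    for source, target in zip(path, path[1:]):
        trust *= direct_trust[(source, target)]
    return trust

def inferred_trust(paths, direct_trust):
    """Keep the most trustworthy referral chain between the two agents."""
    return max(path_trust(p, direct_trust) for p in paths)

direct = {("a", "b"): 0.9, ("b", "d"): 0.8, ("a", "c"): 0.6, ("c", "d"): 0.95}
# Two chains from "a" to "d": via "b" (0.9 * 0.8) or via "c" (0.6 * 0.95).
print(inferred_trust([["a", "b", "d"], ["a", "c", "d"]], direct))
```

Note that the multiplicative rule makes long chains less trusted, which matches the intuition that trust decays with each referral hop.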
Kamdem Simo, Freddy. "Model-based federation of systems of modelling." Thesis, Compiègne, 2017. http://www.theses.fr/2017COMP2374.
Full text
The engineering of complex systems and systems of systems often leads to complex modelling activities (MA). Some challenges exhibited by MA are: understanding the context in which they are carried out and their impact on the lifecycles of the models they produce, and ultimately providing support for mastering them. How to address these challenges with a formal approach is the central challenge of this thesis. After discussing related work from systems engineering in general, and the co-engineering of the system to be made (the product) and the system for making it (the project) specifically, we position and develop a methodology named MODEF that aims to master the operation of MA. MODEF consists in: (1) characterizing MA as a system (and more globally as a federation of systems) in its own right; (2) iteratively architecting this system through the modelling of the conceptual content of the models produced by MA and their lifecycles, and of the tasks carried out within MA and their effects on these lifecycles; (3) specifying expectations over these lifecycles; and (4) analysing models (of MA) against expectations (and possibly task constraints), to check how far expectations are achievable, via the synthesis of the acceptable behaviours. From a practical perspective, exploiting the results of the analysis makes it possible to figure out what could happen with the modelling tasks and their impact on the whole state of the models they handle. We show on two case studies (the operation of a supermarket and the modelling of the functional coverage of a system) how this exploitation provides insightful data on how the system is operated end-to-end and how it can behave. Based on this information, it is possible to take preventive or corrective actions on how the MA are carried out. From a foundational perspective, the formal semantics of the three kinds of models involved and the expectations formalism are first discussed.
Then the analysis and exploitation algorithms are presented. Finally, this approach is broadly compared with model checking and system synthesis approaches. Last but not least, two enablers whose primary objective is to ease the implementation of MODEF are presented. The first one is a modular implementation of MODEF's building blocks. The second one is a federated architecture (FA) of models, which aims to ease working with formal models in practice. Although FA is formalised within the abstract framework of category theory, an attempt to bridge the gap between abstraction and implementation is sketched via some basic data structures and base algorithms. Several perspectives related to the different components of MODEF conclude this work.
Hérault, Stéphanie. "Étude des processus de formation de l'attitude envers la marque : un essai de modélisation intégrant une variable psychologique, la confiance en soi : une application en situation de pré-test publicitaire pour des produits de grande consommation." Paris 1, 1999. http://www.theses.fr/1999PA010065.
Full text
Ropaul, Maïva. "Essais sur l'analyse économique de la responsabilité civile des entreprises." Thesis, Paris 2, 2015. http://www.theses.fr/2015PA020071/document.
Full text
The accelerating pace of technological innovation and pressure from civil society present tort law with new challenges. This thesis studies the incentive effects of tort law on corporate investment in prevention in this context. In particular, this study deepens the traditional economic analysis of corporate civil liability and assesses the effects of combining non-legal sanctions with the legal framework. First, we highlight the evolution of the economic analysis of liability and responsibility. Then, we study the incentive effects of civil liability in a theoretical model, with a particular emphasis on the role of the legal notion of causality. Next, we examine to what extent the difficulty of predicting accident risks affects the incentives provided by liability, both with a theoretical model and with a lab experiment. In a theoretical model, we then develop an analysis of the role of non-legal sanctions from civil society alongside tort law. We show that the incentive effects of consumer boycotts on corporate investment in prevention are limited. Finally, through an empirical study, we complete this analysis by studying the magnitude and determinants of consumer boycotts in Europe.
Moussa, Kaouther. "Estimation de domaines d'attraction et contrôle basé sur l'optimisation : application à des modèles de croissance tumorale." Thesis, Université Grenoble Alpes, 2020. http://www.theses.fr/2020GRALT078.
Full text
The main objective of this thesis is to propose frameworks and algorithms, based on advanced control approaches, to guide cancer treatment scheduling. It also points out the importance of handling stochastic uncertainties in drug scheduling design, since cancer dynamical systems are considered to be highly uncertain phenomena. Cancer dynamical interactions are still an open research topic that is not yet fully understood. The complexity of such dynamics comes from their partially unknown behavior and their uncertain nature. Additionally, they are often described by complex nonlinear dynamics and require taking into consideration many constraints related to physiology as well as biology. In terms of control design, this topic gathers many ingredients of complexity, such as nonlinear dynamics, constraint handling and optimality issues. Therefore, in this thesis, we propose to use a recent optimal control approach based on moment optimization. This framework has the advantage of considering all the state and input variables as probability densities, therefore allowing parametric as well as initial-state uncertainties to be explicitly considered in the optimal control problem. We use this framework in Part II to design robust optimal control schedules that represent cancer drug injection profiles. The second problem that we address, in Part III, is the estimation of regions of attraction for cancer interaction models. This problem is interesting in the context of cancer treatment design, since it provides the set of all possible initial conditions (tumor and patient health indicators) that can be driven to a desired safe target region, where the patient is considered to be healed. Furthermore, we focus on the assessment of methodologies that take into consideration the parametric uncertainties that can affect the dynamical model.
Matoussi, Abderrahman. "Construction de spécifications formelles abstraites dirigée par les buts." Thesis, Paris Est, 2011. http://www.theses.fr/2011PEST1036/document.
Full text
With most formal methods, an initial formal model can be refined in multiple steps, until the final refinement contains enough details for an implementation. Most of the time, this initial model is built from the description obtained by the requirements analysis. Unfortunately, this transition from the requirements phase to the formal specification phase is one of the most painful steps in the formal development chain. In fact, building this initial model requires a high level of competence and a lot of practice, especially as there is no well-defined process to assist designers. In parallel to this problem, it appears that non-functional requirements are largely marginalized in the software development process. Current industrial practice generally consists in specifying only functional requirements during the first levels of this process and leaving the consideration of non-functional requirements to the implementation level. To overcome these problems, this thesis aims to define a coupling between a requirements model expressed in SysML/KAOS and an abstract formal specification, while ensuring a distinction between functional and non-functional requirements from the requirements analysis phase onwards. For that purpose, this thesis first proposes two different approaches (one dedicated to classical B and the other to Event-B) in which abstract formal models are built incrementally from the SysML/KAOS functional goal model. Afterwards, the thesis focuses on the approach dedicated to Event-B in order to complete and enrich it by using the two other SysML/KAOS models describing the non-functional goals and their impact on functional goals. We present different ways to inject these non-functional goals and their impact into the abstract Event-B models obtained. Correspondence links between the non-functional goals and the different Event-B elements are also defined in order to improve the management of the evolution of these goals.
The different approaches proposed in this thesis have been applied to the specification of a localization component, which is a critical part of a land transportation system. The approach dedicated to Event-B is implemented in the SysKAOS2EventB tool, hence allowing the generation of an Event-B refinement architecture from a SysML/KAOS functional goal model. This implementation is mainly based on model-to-model transformation technologies.
Ahmad, Manzoor. "Modeling and verification of functional and non functional requirements of ambient, self adaptative systems." Phd thesis, Université Toulouse le Mirail - Toulouse II, 2013. http://tel.archives-ouvertes.fr/tel-00965934.
Full text
Lieber, Romain. "Spécification d'exigences physico-physiologiques en ingénierie d'un système support de maintenance aéronautique." Thesis, Université de Lorraine, 2013. http://www.theses.fr/2013LORR0197.
Full text
The current Systems Engineering framework must evolve to take into account the critical interactions of human-machine systems from the specification phase onwards. The objective is to ensure that the behavior of such systems is kept within an accepted domain of performance whatever the context of use. This performance depends on the synergies of the different interactions that take place between technical and human systems when operating a common object. Human Factors Integration in Systems Engineering, also known as Human Systems Integration, implies starting to work on the overall performance of all the interfaces of a human-machine system. These different interfaces exhibit emerging complex interactions. Some of them are required to ease whole-system performance and to facilitate the system's resilience capabilities in disruptive, unanticipated environments. Others are designed to finalize the system mission according to the purpose of its context of use. The paradigm we have explored in our work is based on the hypothesis of possible inter-operations between physiological and technical processes for human-machine interaction specification, coupling a System Modeling Framework with that of the Mathematical Theory of Integrative Physiology. Our work focuses on the physical and physiological requirements specification (modeled with SysML) of a visual perceptive interaction that lets humans correctly perceive the meaning of the symbolic properties technical objects afford when they are being maintained in variable contextualized situations. Our specification results lead us to propose a Model-Based Support Systems Engineering organization.
Matoussi, Abderrahman. "Construction de spécifications formelles abstraites dirigée par les buts." Phd thesis, Université Paris-Est, 2011. http://tel.archives-ouvertes.fr/tel-00680736.
Full textMachara, Marquez Samer. "Models and algorithms for managing quality of context and respect for privacy in the Internet of Things." Thesis, Université Paris-Saclay (ComUE), 2015. http://www.theses.fr/2015SACLL005/document.
Full text
The Internet of Things (IoT) is a novel paradigm whose basic idea is the pervasive presence around us of a variety of things or objects that are able to interact with each other and cooperate with their neighbors by sharing data, directly acquired by measuring some facts, in order to reach common goals. This information not only represents the state of users but also the processes in which they are involved; this is called the Context. The context informs both recognition and mapping operations by providing a structured, unified view of the world in which a system operates. With the IoT, many applications consume context information concerning users (context owners), such as daily routines, behaviors and health, offering many benefits to users but compromising their privacy. The research problematic of this thesis lies within the "semantic-oriented" IoT vision. In this context, semantic technologies play a key role in exploiting appropriate modelling solutions for integrating privacy security into the IoT. Context-aware applications and services (context consumers) expect correct and reliable context data to adapt their functionalities. In this thesis, Quality of Context (QoC) is meta-data attached to context information, describing a range of criteria that express context information quality. These meta-data can be used to determine the worth of the information for a particular application in a particular situation. We explore middleware and framework solutions to integrate the management of privacy and QoC in the IoT. This thesis is distinguished from other research in the context management domain by bearing in mind the decoupling of the IoT participants, i.e., the owners of context information and the consumers of this context information. Moreover, we consider QoC as a factor affecting the privacy of individuals. This thesis provides the following contributions along two axes: 1.
Designing a Context Contract Meta-model to define the privacy and QoC concerns of decoupled context owners and context consumers, based on reciprocal trust. This design rests on two points. First, we consider privacy to be the capacity of context owners to control what, how, when, where and with whom they share information. We therefore identify four privacy dimensions (purpose, visibility, retention, QoC) and use them in the definition of access policies and obligations. Second, context consumers expect a certain QoC level in order to perform their tasks. We then propose two kinds of context contract, for the producer and consumer sides, as follows: a context producer contract, whose clauses are expressions of the production of context data, of privacy requirements, and of QoC guarantees; and a context consumer contract, whose clauses are expressions of the consumption of context data, of QoC requirements, and of privacy guarantees. Each context contract is created without the knowledge of its counterparty. 2. Proposing an algorithm to create agreements between context producers and context consumers by evaluating and comparing requirements against guarantees stated in their respective context contracts. As both IoT participants have symmetric contracts, when one participant defines its requirements, the other one defines its guarantees. The matching process of these context contracts verifies whether the requirements of one party are covered by the guarantees offered by the other party. A decision based on this compatibility match is, from the producer's point of view, to permit or deny access to context data; complementarily, from the consumer's point of view, the consumption of context data is permitted or denied. From this definition, we designed algorithms to determine whether access and consumption are authorized or not, according to the matching of the context contracts.
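The symmetric contract-matching idea above can be sketched compactly: access is granted only when the consumer's QoC requirements are covered by the producer's guarantees and, symmetrically, the producer's privacy requirements are covered by the consumer's guarantees. The dimension names, numeric levels, and the "required level must be met or exceeded" rule below are illustrative assumptions, not the thesis's actual meta-model.

```python
# Hypothetical sketch of symmetric contract matching. Each clause maps a
# dimension name to a numeric level, where a higher level is assumed to
# satisfy a lower requirement.

def matches(requirements, guarantees):
    """True if every required level is met or exceeded by the guaranteed one."""
    return all(guarantees.get(dim, 0) >= level
               for dim, level in requirements.items())

producer = {"privacy_required": {"retention_days": 30},
            "qoc_guaranteed": {"freshness": 0.9, "precision": 0.8}}
consumer = {"qoc_required": {"freshness": 0.7},
            "privacy_guaranteed": {"retention_days": 30}}

# Both directions must hold before data flows from producer to consumer.
grant = (matches(consumer["qoc_required"], producer["qoc_guaranteed"]) and
         matches(producer["privacy_required"], consumer["privacy_guaranteed"]))
print(grant)
```

A real model would give each privacy dimension its own comparison semantics (e.g. retention bounded from above rather than below); a single ordering is used here only to keep the sketch short.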
Sango, Marc. "Traceability of concerns and observer-based verification for railway safety-critical software." Thesis, Lille 1, 2015. http://www.theses.fr/2015LIL10067/document.
Full text
In recent years, the development of critical systems has demanded more and more software. In order to reduce development and verification costs, actors in critical domains, such as the avionics and automotive domains, are moving more and more towards model-driven engineering. In contrast, in the railway domain, for strategic and organizational reasons, actors remain faithful to traditional methods that allow them to take advantage of their knowledge. However, these conventional approaches suffer from a lack of abstraction and do not provide support for the traceability of concerns or formal verification, which are highly recommended for the development of railway safety-critical software. To address these shortcomings, we present in this thesis a systematic approach based on model-driven engineering and a component-based model, in order to better manage software complexity and the traceability of concerns. In this dissertation, we provide three major contributions. First, we provide an integrated set of meta-models for describing the concerns of software requirements, software components, and the traceability between concerns and software components. With the second contribution, we propose a formal support for our model to allow the formal verification of temporal properties. Finally, with the last contribution, we propose a software component-based development and verification approach, called SARA, integrated into the V-lifecycle widely used in the railway domain. Experiments conducted to validate our approach, through a few case studies of the new European train control system ERTMS/ETCS, show that by using a component model that explicitly includes requirement traceability, we are able to provide a practical, scalable and reliable approach.
Lieber, Romain. "Spécification d'exigences physico-physiologiques en ingénierie d'un système support de maintenance aéronautique." Electronic Thesis or Diss., Université de Lorraine, 2013. http://www.theses.fr/2013LORR0197.
Full text