Academic literature on the topic 'Proxys (logiciels)'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Proxys (logiciels).'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Proxys (logiciels)"

1

Merghit, Rachid, Mouloud Ait Athmane, and Abdelhak Lakehal. "Weak diagnostic performance of resting ankle ABI compared with lower limb's arterial Doppler in the diagnosis of PAD in coronary patients: about a hospital serial study." Batna Journal of Medical Sciences (BJMS) 8, no. 2 (December 28, 2021): 105–9. http://dx.doi.org/10.48087/bjmsoa.2021.8203.

Full text
Abstract:
Introduction. The ankle systolic pressure index (ABI) is considered an indispensable tool for the management of lower-limb peripheral artery disease (PAD); however, complementary exploration with the other physiological tests, the toe pressure index and the post-exercise ABI, is needed to reduce the number of false negatives. Objective. To demonstrate the weak contribution of the resting ankle ABI compared with arterial Doppler ultrasound of the lower limbs in the diagnosis of PAD. Materials and methods. In a series of 300 consecutive coronary patients hospitalized during 2016 in the cardiology department of the University Hospital of Constantine, PAD screening was carried out with the following investigations: measurement of the ankle ABI, supplemented by the toe pressure index in case of arterial incompressibility and by the post-exercise ABI when the resting ABI was borderline. Arterial Doppler ultrasound of the lower limbs was performed for all patients on a General Electric Vivid E9 scanner with a 12L linear-array probe designed for peripheral vascular exploration, allowing targeted screening. Data were processed and analyzed with SPSS 22. Results. The resting ankle ABI showed moderate sensitivity, around 50%, against a high specificity approaching 100%, when compared with arterial Doppler ultrasound of the lower limbs. Sensitivity improved markedly after supplementing it with the other physiological tests, namely the post-exercise ankle ABI and the measurement of the toe systolic pressure index. Conclusion. Vascular examination of the lower limbs combined with measurement of the ankle ABI, coupled with the other physiological tests (toe index and post-exercise ABI), ensures good diagnostic sensitivity and specificity for PAD.
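As a quick aside for readers unfamiliar with the metrics quoted above, here is a minimal sketch (in Python) of how sensitivity and specificity are derived from a confusion matrix. The counts are entirely hypothetical, chosen only so the output mirrors the abstract's reported ~50% sensitivity and near-100% specificity; they are not the study's data.

```python
# Hypothetical counts for a 300-patient cohort (not the study's data):
# tp/fn/tn/fp are true/false positives/negatives of the resting ankle ABI
# judged against the Doppler reference standard.
def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
    """Return (sensitivity, specificity) from confusion-matrix counts."""
    return tp / (tp + fn), tn / (tn + fp)

sens, spec = sensitivity_specificity(tp=30, fn=30, tn=238, fp=2)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.1%}")
# -> sensitivity = 50%, specificity = 99.2%
```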
APA, Harvard, Vancouver, ISO, and other styles
2

Dieudonné, Awo A. "L’ENSEIGNEMENT DE L’HISTOIRE ÉCONOMIQUE ET SOCIALE AU DÉPARTEMENT D’HISTOIRE ET D’ARCHÉOLOGIE DE L’UNIVERSITÉ D’ABOMEY-CALAVI : ÉTAT DES LIEUX ET DÉFIS (1976-2021) / THE TEACHING OF ECONOMIC AND SOCIAL HISTORY IN THE DEPARTMENT OF HISTORY AND ARCHAEOLOGY OF ABOMEY-CALAVI UNIVERSITY: SITUATIONAL ANALYSIS AND CHALLENGES (1976-2021)." European Journal of Social Sciences Studies 7, no. 4 (April 29, 2022). http://dx.doi.org/10.46827/ejsss.v7i4.1257.

Full text
Abstract:
This study aims to assess the impact of the teaching of economic and social history on the evolution of the historiography of the Department of History and Archaeology of the University of Abomey-Calavi, from 1976 to 2021. To that end, the catalogues of Master's theses in history, archaeology and art history (1976-2016) and of Bachelor's theses (2016-2021) were examined. The corpus of 86 theses (49 Master's and 37 Bachelor's), listed according to various criteria, was checked against the defense reports and against interviews with students, teachers and former heads of the department, then analyzed with quantitative and qualitative techniques. It emerges that the research conducted in this sector of history over the long term (86 theses in 43 years) covers the national territory unevenly and is essentially "Benin-centric". Its limited visibility results from insufficient supervisory staff, from the unsuitability of students' profiles on entry to the programme, and from the dispersion of the field's scientific output across low-impact generalist journals. An urgent recruitment of junior lecturers, within a redesign of the curriculum that links the methods of the historical discipline to an initiation to statistics, to mastery of database software, and to knowledge of certain concepts, theories and approaches in economics, all combined with a solid command of English and the creation of a specialist journal, could give this sector of history the momentum it needs to take its full place in Beninese historiography.
APA, Harvard, Vancouver, ISO, and other styles
3

Silva, Christina, and Ligia Elliot. "Avaliação da Hipermídia para Uso em Educação: uma Abordagem Alternativa." Revista Brasileira de Estudos Pedagógicos 78, no. 188-89-90 (June 18, 2019). http://dx.doi.org/10.24109/2176-6681.rbep.78i188-89-90.1057.

Full text
Abstract:
Because of its specific characteristics, hypermedia requires a non-traditional approach and adequate criteria to be evaluated. Such an approach must include product evaluation as well as process evaluation, and must take place in real learning environments. This article presents the development and application of an alternative approach to evaluating the use of hypermedia in higher education, which emphasizes formative evaluation and the completion of complex, meaningful tasks by students working cooperatively, from a constructivist perspective. The results of applying the approach suggest that product-oriented evaluation carried out by specialists, although necessary, was not sufficient to estimate the efficacy of an educational hypermedia software package that was an integral part of the approach. Evaluation of the effects of hypermedia use showed it to be a cognitive tool of great educational value.
APA, Harvard, Vancouver, ISO, and other styles

Dissertations / Theses on the topic "Proxys (logiciels)"

1

Chiapponi, Elisa. "Detecting and Mitigating the New Generation of Scraping Bots." Electronic Thesis or Diss., Sorbonne université, 2023. http://www.theses.fr/2023SORUS490.

Full text
Abstract:
Every day an invisible war for data takes place between e-commerce websites and web scrapers. E-commerce websites own the data at the heart of the conflict and would like to provide it only to genuine users. Web scrapers aim to have unlimited and continuous access to the above-mentioned data to capitalize on it. To achieve this goal, scrapers send large amounts of requests to e-commerce websites, causing them financial problems. This led the security industry to engage in an arms race against scrapers to create better systems to detect and mitigate their requests. At present, the battle continues, but scrapers appear to have the upper hand, thanks to their use of Residential IP Proxies (RESIPs). In this thesis, we aim to shift the balance by introducing novel detection and mitigation techniques that overcome the limitations of current state-of-the-art methods. We propose a deceptive mitigation technique that lures scrapers into believing they have obtained their target data while they receive modified information. We present two new detection techniques based on network measurements that identify scraping requests proxied through RESIPs. Thanks to an ongoing collaboration with Amadeus IT Group, we validate our results on real-world operational data. Being aware that scrapers will not stop looking for new ways to avoid detection and mitigation, this thesis provides additional contributions that can help in building the next defensive weapons for fighting scrapers. We propose a comprehensive characterization of RESIPs, the strongest weapon currently at the disposal of scrapers. Moreover, we investigate the possibility of acquiring threat intelligence on the scrapers by geolocating them when they send requests through a RESIP.
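The abstract does not spell the detection techniques out, but one network-measurement idea in this spirit is to compare transport-level and application-level round-trip times: when a relay such as a RESIP sits on the path, the TCP handshake terminates at the proxy while application messages traverse the full path to the real client, so a large gap between the two delays is suspicious. Below is a minimal client-side sketch in Python; the hostname, port, and threshold are all illustrative, and this is not the thesis's actual system.

```python
import socket
import time

def handshake_vs_app_rtt(host: str, port: int = 80) -> tuple[float, float]:
    """Return (tcp_rtt, app_rtt) in seconds for a single HTTP probe."""
    t0 = time.monotonic()
    sock = socket.create_connection((host, port), timeout=5)  # TCP handshake
    tcp_rtt = time.monotonic() - t0
    try:
        t1 = time.monotonic()
        request = (f"HEAD / HTTP/1.1\r\nHost: {host}\r\n"
                   "Connection: close\r\n\r\n").encode()
        sock.sendall(request)
        sock.recv(1024)            # wait for the first application-layer bytes
        app_rtt = time.monotonic() - t1
    finally:
        sock.close()
    return tcp_rtt, app_rtt

tcp_rtt, app_rtt = handshake_vs_app_rtt("example.com")
# The factor of 3 is illustrative; a real deployment calibrates it empirically.
print("relay suspected" if app_rtt > 3 * tcp_rtt else "direct path plausible")
```

Server-side, the analogous comparison is between the handshake RTT and application-layer round trips (for example TLS messages) observed on the same connection.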
APA, Harvard, Vancouver, ISO, and other styles
2

Chicoulaa, Bruno. "Réflexion sur l'informatisation d'un service d'urgence : à propos d'un logiciel." Toulouse 3, 1994. http://www.theses.fr/1994TOU31030.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Blanchard, Yann. "Le dossier résumé de séjour à l'hôpital général : à propos du logiciel RSSPLUS : première étape du système d'information médical des centres hospitaliers de Dax et de Bayonne." Bordeaux 2, 1989. http://www.theses.fr/1989BOR23044.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Girka, Thibaut. "Differential program semantics." Thesis, Sorbonne Paris Cité, 2018. http://www.theses.fr/2018USPCC147/document.

Full text
Abstract:
Computer programs are rarely written in one fell swoop. Instead, they are written in a series of incremental changes. It is also frequent for software to get updated after its initial release. Such changes can occur for various reasons, such as adding features, fixing bugs, or improving performance, for instance. It is therefore important to be able to represent and reason about those changes, making sure that they indeed implement the intended modifications. In practice, program differences are very commonly represented as textual differences between a pair of source files, listing text lines that have been deleted, inserted or modified. This representation, while exact, does not address the semantic implications of those textual changes. Therefore, there is a need for better representations of the semantics of program differences. Our first contribution is an algorithm for the construction of a correlating program, that is, a program interleaving the instructions of two input programs in such a way that it simulates their semantics. Further static analysis can be performed on such correlating programs to compute an over-approximation of the semantic differences between the two input programs. This work draws direct inspiration from an article by Partush and Yahav, which describes a correlating program construction algorithm that we show to be unsound on loops that include `break` or `continue` statements. To guarantee its soundness, our alternative algorithm is formalized and mechanically checked within the Coq proof assistant. Our second and most important contribution is a formal framework allowing to precisely describe and formally verify semantic changes. This framework, fully formalized in Coq, represents the difference between two programs by a third program called an oracle. Unlike a correlating program, such an oracle is not required to interleave instructions of the programs under comparison, and may “skip” intermediate computation steps. In fact, such an oracle is typically written in a different programming language than the programs it relates, which allows designing correlating oracle languages specific to certain classes of program differences, and capable of relating crashing programs with non-crashing ones. We design such oracle languages to cover a wide range of program differences on a toy imperative language. We also prove that our framework is at least as expressive as Relational Hoare Logic by encoding several variants as correlating oracle languages, proving their soundness in the process.
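To make the correlating-program idea concrete, here is a toy Python sketch, an analogy only: the thesis's development is in Coq over an imperative language. Two versions of a function run in lockstep on the same inputs, and a relational property between them is checked.

```python
def v1(n: int) -> int:
    """Original version: sum 0..n-1 with a loop."""
    s = 0
    for i in range(n):
        s += i
    return s

def v2(n: int) -> int:
    """Patched version: closed-form formula."""
    return n * (n - 1) // 2

def correlated(n: int) -> tuple[int, int]:
    """Carry both executions together and relate their results."""
    r1, r2 = v1(n), v2(n)
    assert r1 == r2, f"semantic difference at input {n}: {r1} != {r2}"
    return r1, r2

for n in range(100):   # bounded testing; a Coq proof covers all inputs
    correlated(n)
```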
APA, Harvard, Vancouver, ISO, and other styles
5

Lardon, Jérémy. "Proxy d'interface Homme-Machine : apport des algorithmes génétiques pour l'adaptation automatique de la présentation de documents Web." PhD thesis, Université Jean Monnet - Saint-Etienne, 2010. http://tel.archives-ouvertes.fr/tel-00573036.

Full text
Abstract:
Pervasive computing, the spearhead paradigm of "anytime/anywhere" services, increasingly calls for the re-engineering of Web sites, and the contemporary cloud-computing approach will generate further needs in this direction. Today, Web sites are still designed for display on a traditional (desktop) computer. Alternative versions are, however, increasingly offered for access via smartphones, or via formerly passive viewing devices such as the television (interactive IPTV). This development work is often relegated to ad hoc tasks in the project management of a Web site's development. Genetic algorithms allow us to approximate the problem of optimizing the composition and sequencing of such adaptation transformations: candidate compositions are put in competition, crossed over, and evaluated, so that a solution close to an optimum emerges in finite time. This algorithm is at the heart of the proposed adaptation engine. The thesis reports on the implementation of the complete model and the results obtained experimentally, and offers an interpretation of the system's performance. The first chapter is devoted to the context of our study and to the literature review: we first present the field of automatic adaptation of Web documents in general, then survey the scientific contributions and the systems already developed for automatic adaptation; a comparative study of this work allowed us to identify the directions pursued in the thesis. The second chapter presents our model, and in particular its architectural decomposition; we introduce the concepts of estimating characteristic values and simulating transformations on which the genetic algorithm at the core of our adaptive-hypermedia engine is built, and we describe the implementation of that algorithm. Finally, chapter 3 presents the observed results and the conclusions drawn from our implementation, together with a study of the effects of varying the model's parameters and the available computational resources; from this analysis, we outline the perspectives our work opens.
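As a rough illustration of the kind of search loop described above, here is a minimal genetic-algorithm skeleton in Python. The transformation names and the stand-in fitness function are invented for the example; in the thesis, fitness comes from simulating the transformations and estimating characteristic values of the resulting page.

```python
import random

TRANSFORMS = ["resize_images", "linearize_tables", "drop_sidebar",
              "summarize_text", "recompress_media"]

def fitness(seq: list[str]) -> float:
    # Stand-in objective: reward varied, short sequences that begin by
    # resizing images. A real engine would score the simulated adapted
    # page against the target device's capabilities.
    bonus = 1.0 if seq and seq[0] == "resize_images" else 0.0
    return len(set(seq)) - 0.1 * len(seq) + bonus

def crossover(a: list[str], b: list[str]) -> list[str]:
    cut = random.randrange(1, min(len(a), len(b)))  # one-point crossover
    return a[:cut] + b[cut:]

def mutate(seq: list[str]) -> list[str]:
    s = list(seq)
    s[random.randrange(len(s))] = random.choice(TRANSFORMS)
    return s

population = [[random.choice(TRANSFORMS) for _ in range(4)] for _ in range(20)]
for _ in range(50):                                 # generations
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                       # truncation selection
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(10)]
    population = parents + children

best = max(population, key=fitness)
print(best, round(fitness(best), 2))
```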
APA, Harvard, Vancouver, ISO, and other styles
6

Nigron, Pierre. "Effectful programs and their proofs in type theory : application to certified compilation and certified packet processing." Electronic Thesis or Diss., Sorbonne université, 2022. http://www.theses.fr/2022SORUS480.

Full text
Abstract:
One way to reason about our programs is to write them directly in a proof assistant. Through the Curry-Howard correspondence, programs and proofs are then one and the same. In order not to undermine the logical consistency of the proof assistant, the system is forced to restrict programs to having no side effects. However, side effects are ubiquitous and essential in programming. Different techniques such as monads or algebraic effects have emerged to model them, offering a way to write imperative programs in purely functional languages. It is therefore quite natural that the results of decades of research invested in reasoning about imperative programs are being adapted to reasoning about programs with effects. In this thesis, we are first interested in the use of separation logic to reason about effectful programs implemented in a proof assistant. We study an approach that describes the behaviour of effects using a predicate transformer, focusing first on freshness, then on packet processing and zero-copy. To study our approach, we rely on two concrete examples: the SimplExpr module of CompCert and the Nom decoder library. Finally, in order to compile the packet parsers produced to C, we propose a refinement method that removes the continuations introduced by the use of a free monad and performs some optimizations.
APA, Harvard, Vancouver, ISO, and other styles
7

Nguyen, Thi Minh Tuyen. "Taking architecture and compiler into account in formal proofs of numerical programs." PhD thesis, Université Paris Sud - Paris XI, 2012. http://tel.archives-ouvertes.fr/tel-00710193.

Full text
Abstract:
On some recently developed architectures, a numerical program may give different answers depending on the execution hardware and the compilation. These discrepancies in the results come from the fact that each floating-point computation may be carried out with a different precision. The goal of this thesis is to formally prove properties of numerical programs while taking the architecture and the compiler into account. In order to do that, we propose two different approaches. The first approach is to prove properties of floating-point programs that hold for multiple architectures and compilers. It states the rounding error of each floating-point computation whatever the environment and the compiler choices, and is implemented in the Frama-C platform for static analysis of C code. The second approach is to prove behavioral properties of numerical programs by analyzing their compiled assembly code. We focus on the issues and traps that may arise in floating-point computations. Direct analysis of the assembly code allows us to take into account architecture- or compiler-dependent features such as the possible use of extended-precision registers. It is implemented on top of the Why platform for deductive verification.
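A two-line illustration of the underlying phenomenon may help. Floating-point addition is not associative, so a compiler that reorders operations (or keeps intermediates in wider registers, as on x87) can legitimately change the rounded result; the sketch below shows the reordering effect in Python, whose floats are IEEE 754 doubles.

```python
# Reassociating a sum changes its rounding.
left = (0.1 + 0.2) + 0.3
right = 0.1 + (0.2 + 0.3)
print(left, right, left == right)   # 0.6000000000000001 0.6 False

import math
print(math.fsum([0.1, 0.2, 0.3]))   # correctly rounded sum of the operands: 0.6
```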
APA, Harvard, Vancouver, ISO, and other styles
8

Dénès, Maxime. "Étude formelle d'algorithmes efficaces en algèbre linéaire." PhD thesis, Université Nice Sophia Antipolis, 2013. http://tel.archives-ouvertes.fr/tel-00945775.

Full text
Abstract:
Formal methods have reached a degree of maturity leading to the design of general-purpose proof systems, making it possible both to verify the correctness of complex software systems and to formalize advanced mathematics. But the emphasis is often placed more on the ease of reasoning about programs than on their efficient execution. The tension between these two aspects is particularly acute for computer-algebra algorithms, whose correctness usually rests on elaborate mathematical concepts but whose practical efficiency is an important concern. This thesis develops approaches to the formal study and efficient execution of programs in type theory, and more precisely in the Coq proof assistant. We first present a runtime environment allowing such programs to be compiled to native code while preserving the generality and expressiveness of the formalism. We then turn to data representations and, in particular, to the formally verified and automated link between representations suited to proofs and those suited to computation. We put these techniques to work in the study of linear-algebra algorithms, such as Strassen's matrix product, Gaussian elimination, and the reduction of matrices to canonical forms, notably the Smith normal form for matrices over a Euclidean ring. Finally, we extend the field of applications to the formalization and certified computation of the homology groups of simplicial complexes arising from digital images.
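The proof-oriented/computation-oriented link the abstract mentions can be pictured with a small Python analogy (illustrative only; the thesis does this formally in Coq): a matrix defined as a function of its indices is convenient to reason about, a nested-list table is efficient to compute with, and a refinement relation connects the two.

```python
def as_table(f, n: int, m: int) -> list[list[int]]:
    """Refine a functional (proof-friendly) matrix into concrete data."""
    return [[f(i, j) for j in range(m)] for i in range(n)]

def agree(f, table) -> bool:
    """The refinement relation: both views give the same entries."""
    return all(f(i, j) == v
               for i, row in enumerate(table)
               for j, v in enumerate(row))

identity = lambda i, j: 1 if i == j else 0   # definition used in proofs
I3 = as_table(identity, 3, 3)                # data used in computations
assert agree(identity, I3)
print(I3)                                    # [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```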
APA, Harvard, Vancouver, ISO, and other styles
9

Perron, Sébastien. "Détection d'erreurs et confinement logiciel : une évaluation empirique." Thèse, 2021. http://depot-e.uqtr.ca/id/eprint/9664/1/eprint9664.pdf%20.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Proxys (logiciels)"

1

Lohr, Steve, and Joel Brinkley. U.S. v. Microsoft: The inside story of the landmark case. New York: McGraw-Hill, 2001.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Lohr, Steve, ed. U.S. v. Microsoft. New York: McGraw-Hill, 2001.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

McConnell, Steve. Code complete: A practical handbook of software construction. Redmond, Wash: Microsoft Press, 1993.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

McConnell, Steve. Code complete. 2nd ed. Redmond, Wash: Microsoft Press, 2004.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

McConnell, Steve. Code complete. 2nd ed. Redmond, Wash: Microsoft Press, 2004.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

McConnell, Steve. Code complete: [a practical handbook of software construction]. 2nd ed. Redmond, Wash: Microsoft Press, 2004.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

McConnell, Steve. Code Complete, Second Edition. Redmond: Microsoft Press, 2007.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

McConnell, Steve. Code complete. Redmond, WA: Microsoft Press, 1993.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Auletta, Ken. World War 3.0: Microsoft and its enemies. New York: Random House, 2001.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
10

Abrahamsson, Pekka, and Jürgen Münch. Product-Focused Software Process Improvement: 8th International Conference, PROFES 2007, Riga, Latvia, July 2-4, 2007, Proceedings. Springer London, Limited, 2007.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Proxys (logiciels)"

1

Khoury, Raphaël. "Avant-Propos." In La sécurité logicielle: une approche défensive, XIII–XVI. Presses de l'Université Laval, 2021. http://dx.doi.org/10.2307/j.ctv1qp9gsh.3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Mignucci, Mario. "Expository Proofs in Aristotle's Syllogistic." In Oxford Studies in Ancient Philosophy, 9–28. Oxford: Oxford University Press, 1991. http://dx.doi.org/10.1093/oso/9780198239659.003.0002.

Full text
Abstract:
… which exposition appears to be a perfectly admissible way of proving logical theses. Nowadays two interpretations seem to have a following among scholars. One is the Łukasiewicz interpretation, which has been considerably improved by Patzig. The other view is shared by a rather heterogeneous group of scholars, some of them pure logicians rather than historians of logic.
APA, Harvard, Vancouver, ISO, and other styles
3

Grattan-Guinness, Ivor. "Turing’s mentor, Max Newman." In The Turing Guide. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780198747826.003.0052.

Full text
Abstract:
The interaction between mathematicians and mathematical logicians has always been much slighter than one might imagine. This chapter examines the case of Turing’s mentor, Maxwell Hermann Alexander Newman (1897–1984). The young Turing attended a course of lectures on logical matters that Newman gave at Cambridge University in 1935. After briefly discussing examples of the very limited contact between mathematicians and logicians in the period 1850–1930, I describe the rather surprising origins and development of Newman’s own interest in logic. One might expect that the importance to many mathematicians of means of proving theorems, and their desire in many contexts to improve the level of rigour of proofs, would motivate them to examine and refine the logic that they were using. However, inattention to logic has long been common among mathematicians. A very important source of the cleft between mathematics and logic during the 19th century was the founding, from the late 1810s onwards, of the ‘mathematical analysis’ of real variables, grounded on a theory of limits, by the French mathematician Augustin-Louis Cauchy. He and his followers extolled rigour—most especially, careful definitions of major concepts and detailed proofs of theorems. From the 1850s onwards, this project was enriched by the German mathematician Karl Weierstrass and his many followers, who introduced (for example) multiple limit theory, definitions of irrational numbers, and an increasing use of symbols, and then from the early 1870s by Georg Cantor with his set theory. However, absent from all these developments was explicit attention to any kind of logic. This silence continued among the many set theorists who participated in the inauguration of measure theory, functional analysis, and integral equations. The mathematicians Artur Schoenflies and Felix Hausdorff were particularly hostile to logic, targeting the famous 20th-century logician Bertrand Russell. (Even the extensive dispute over the axiom of choice focused mostly on its legitimacy as an assumption in set theory and its use of higher-order quantification: its ability to state an infinitude of independent choices within finitary logic constituted a special difficulty for ‘logicists’ such as Russell.) Russell, George Boole, and other creators of symbolic logics were exceptional among mathematicians in attending to logic, but they made little impact on their colleagues.
APA, Harvard, Vancouver, ISO, and other styles
4

Amsler, Mark. "Margery Kempe's Strategic Vague Language." In The Medieval Life of Language. Amsterdam: Amsterdam University Press, 2021. http://dx.doi.org/10.5117/9789463721929_ch06.

Full text
Abstract:
This chapter continues the previous analysis of heretics’ speech from the perspective of Conversation Analysis. Bakhtin’s theory of dialogism sets Kempe’s pragmatic thinking in a sociolinguistic frame. The narrative of her examinations at the Archbishop of York’s court suggests that people’s thinking about how language defines, expresses, controls, and resists also informed how they pragmatically and metapragmatically constructed their speech for social survival, subjective authority, or agency in asymmetric or hostile interactions. Medieval grammarians’ and logicians’ concerns with reference and equivocatio (ambiguity, polysemy, vagueness) were reinterpreted in controversies about how heretics and nonconformists talk in hostile institutional situations. Kempe’s sophisticated use of evasive, vague, hedged, and recontextualized speech and situational pragmatics proves more than a match for the Archbishop and his clerks.
APA, Harvard, Vancouver, ISO, and other styles
5

Balibar, Etienne. "'What Is Man' in Seventeenth-Century Philosophy? Subject, Individual, Citizen." In The Individual in Political Theory and Practice, 215–42. Oxford: Oxford University Press, 1996. http://dx.doi.org/10.1093/oso/9780198205494.003.0010.

Full text
Abstract:
My aim in this chapter is to read again the metaphysicians of the classical period (I have selected Descartes, Hobbes, Spinoza, Leibniz, and Locke) by focusing on their conceptions of individuality, inasmuch as these can be compared with contemporary developments in law and political institutions. A judgement by Friedrich Nietzsche in The Will to Power, however questionable it may appear, can introduce us to the interest and difficulty of the task: It proves difficult indeed to classify the authors of the great 'systems' in the history of ideas. Many of them are mathematicians or physicians, but they are no longer primarily theologians, logicians, moralists as were their predecessors, nor are they lawyers (with the remarkable exception of Leibniz), or writers, or natural scientists as would be their successors.
APA, Harvard, Vancouver, ISO, and other styles
6

Dummett, Michael. "Some Further Topics." In Elements of Intuitionism, 211–49. Oxford: Oxford University Press, 2000. http://dx.doi.org/10.1093/oso/9780198505242.003.0007.

Full text
Abstract:
So far we have discussed various principles which may be taken as axiomatic in intuitionistic mathematics, without attempting to delineate any actual formalizations of intuitionistic theories. This is appropriate to the subject. For the Hilbert school, and for formalists properly so called, formalization is integral to an exact treatment of mathematics; but the original impulse to formalization did not come from them, but from the logicists, for whom the formalization of a theory was a necessary means of identifying its basic principles, so that they could then show these to be derivable from pure logic. The intuitionists, on the other hand, were from the start hostile to formalization: for them, it is highly unlikely that the mental constructions intuitively recognizable as proving a statement of a given theory should be isomorphic to the formal proofs of any calculus, recognizable as such by a mechanical procedure making no appeal to meaning.
APA, Harvard, Vancouver, ISO, and other styles
7

Feferman, Solomon. "What Does Logic Have to Tell Us about Mathematical Proofs?" In In the Light of Logic, 177–86. New York: Oxford University Press, 1998. http://dx.doi.org/10.1093/oso/9780195080308.003.0009.

Full text
Abstract:
Modern logic provides a theoretical analysis of mathematics which cuts across its traditional subdivisions into algebra, geometry, analysis, etc. Actually, there is no single logical theory but a variety of such whose aim is to model the reasoning of, say, an idealized platonistic (set-theoretical) mathematician or an idealized constructivist mathematician at work in one of these areas of mathematics. What is common to the various theories is the use of formal systems to describe that activity, namely, as being one of drawing consequences within a specified class of symbolic expressions called formulas from certain formulas called axioms by use of certain specified rules of inference. Some logicians call themselves formalists and think that there is nothing more to mathematics than what is pictured as the production of derivations within formal systems. But those who take seriously the platonistic, constructivist, or other such views (for example, finitist, predicativist, etc.) also concern themselves with the meaning of what is expressed in formal languages. There is then the question as to which choices of axioms and rules are legitimate and, in case the systems are incomplete, how they might be legitimately extended. This leads one into controversial areas of the foundations of mathematics. I stress here the syntactic (formal) aspect as opposed to the semantic (and/or foundational) aspect of the logical description of mathematical activity. This is more neutral territory, but one within which there are a number of notions and results concerning the logical structure of mathematical proofs that are relevant to the remarks by Y. Manin in "Digression: proof" (Manin 1977, pp. 48-51). It should be added that I found those remarks both refreshing and stimulating.
APA, Harvard, Vancouver, ISO, and other styles
8

Barwise, Jon, and John Etchemendy. "Visual Information and Valid Reasoning." In Logical Reasoning with Diagrams. Oxford University Press, 1996. http://dx.doi.org/10.1093/oso/9780195104271.003.0005.

Full text
Abstract:
Psychologists have long been interested in the relationship between visualization and the mechanisms of human reasoning. Mathematicians have been aware of the value of diagrams and other visual tools both for teaching and as heuristics for mathematical discovery. As the chapters in this volume show, such tools are gaining even greater value, thanks in large part to the graphical potential of modern computers. But despite the obvious importance of visual images in human cognitive activities, visual representation remains a second-class citizen in both the theory and practice of mathematics. In particular, we are all taught to look askance at proofs that make crucial use of diagrams, graphs, or other nonlinguistic forms of representation, and we pass on this disdain to our students. In this chapter, we claim that visual forms of representation can be important, not just as heuristic and pedagogic tools, but as legitimate elements of mathematical proofs. As logicians, we recognize that this is a heretical claim, running counter to centuries of logical and mathematical tradition. This tradition finds its roots in the use of diagrams in geometry. The modern attitude is that diagrams are at best a heuristic in aid of finding a real, formal proof of a theorem of geometry, and at worst a breeding ground for fallacious inferences. For example, in a recent article, the logician Neil Tennant endorses this standard view: . . . [The diagram] is only an heuristic to prompt certain trains of inference; . . . it is dispensable as a proof-theoretic device; indeed, . . . it has no proper place in the proof as such. For the proof is a syntactic object consisting only of sentences arranged in a finite and inspectable array (Tennant [1984]). . . . It is this dogma that we want to challenge. We are by no means the first to question, directly or indirectly, the logocentricity of mathematics arid logic. The mathematicians Euler and Venn are well known for their development of diagrammatic tools for solving mathematical problems, and the logician C. S. Peirce developed an extensive diagrammatic calculus, which he intended as a general reasoning tool.
APA, Harvard, Vancouver, ISO, and other styles
9

Fisler, Kathi D. "Exploiting the Potential of Diagrams in Guiding Hardware Reasoning." In Logical Reasoning with Diagrams. Oxford University Press, 1996. http://dx.doi.org/10.1093/oso/9780195104271.003.0016.

Full text
Abstract:
"Formal methods offer much more to computer science than just 'proofs of correctness' for programs and digital circuits. Many of the problems in software and hardware design are due to imprecision, ambiguity, incompleteness, misunderstanding, and just plain mistakes in the statement of top-level requirements, in the description of intermediate designs, or in the specification of components and interfaces." (Rushby [1993])

Desire for correctness proofs of systems spawned the research area known as "formal methods". Today's systems are of sufficient complexity that testing is infeasible, both computationally and financially. As an alternative, formal methods promote mathematical analysis of a system as a means of locating inconsistencies and other design errors. Techniques used can range from writing system descriptions in a formal notation to verification that the designed system satisfies a particular behavioral specification. A good general introduction to formal methods appears in Rushby [1993]. Ideally, using formal methods increases our assurance in and understanding of our designs. Assurance results from proof, while understanding results from the process of producing the proof. Successful use of formal methods therefore requires powerful proof techniques and clear logical notations. The verification research community has paid considerable attention to the former. Current techniques, many of which can be fully automated, handle sufficiently complex systems that formal methods are now being adopted (albeit slowly) in industry. In our drive to provide powerful proof methods, however, we have overlooked the latter requirement. Research has focused on proof without paying sufficient attention to reasoning. Current tools are often criticized as too hard to use, despite their computational power. Most designers, not having been trained as logicians, find the methodologies and notations very unnatural. Industrial sites, starting out with formal methods, must often rely on external verification professionals to help them use these tools effectively (NASA [1995]). Tools that are not supportive of reasoning therefore fail to provide the full benefits of formal methods. We can augment our current methodologies to address this problem, but we first need to understand reasoning and its role in hardware design.
APA, Harvard, Vancouver, ISO, and other styles
10

Barwise, Jon, and John Etchemendy. "Heterogeneous Logic." In Logical Reasoning with Diagrams. Oxford University Press, 1996. http://dx.doi.org/10.1093/oso/9780195104271.003.0014.

Full text
Abstract:
A major concern to the founders of modern logic—Frege, Peirce, Russell, and Hilbert—was to give an account of the logical structure of valid reasoning. Taking valid reasoning in mathematics as paradigmatic, these pioneers led the way in developing the accounts of logic which we teach today and that underwrite the work in model theory, proof theory, and definability theory. The resulting notions of proof, model, formal system, soundness, and completeness are things that no one claiming familiarity with logic can fail to understand, and they have also played an enormous role in the revolution known as computer science. The success of this model of inference led to an explosion of results and applications. But it also led most logicians—and those computer scientists most influenced by the logic tradition—to neglect forms of reasoning that did not fit well within this model. We are thinking, of course, of reasoning that uses devices like diagrams, graphs, charts, frames, nets, maps, and pictures. The attitude of the traditional logician to these forms of representation is evident in the quotation of Neil Tennant in Chapter I, which expresses the standard view of the role of diagrams in geometrical proofs. One aim of our work, as explained there, is to demonstrate that this dogma is misguided. We believe that many of the problems people have putting their knowledge of logic to work, whether in machines or in their own lives, stems from the logocentricity that has pervaded its study for the past hundred years. Recently, some researchers outside the logic tradition have explored uses of diagrams in knowledge representation and automated reasoning, finding inspiration in the work of Euler, Venn, and especially C. S. Peirce. This volume is a testament to this resurgence of interest in nonlinguistic representations in reasoning. While we applaud this resurgence, the aim of this chapter is to strike a cautionary note or two. Enchanted by the potential of nonlinguistic representations, it is all too easy to overreact and so to repeat the errors of the past.
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Proxys (logiciels)"

1

Costa Mendes, L., and A. G. Bodard. "Apport d’un logiciel de reconstruction 3D dans la prise en charge chirurgico-orthodontique de la dysplasie cléido-crânienne : à propos d’un cas clinique." In 63ème Congrès de la SFCO, edited by S. Boisramé, S. Cousty, J. C. Deschaumes, V. Descroix, L. Devoize, P. Lesclous, C. Mauprivez, and T. Fortin. Les Ulis, France: EDP Sciences, 2015. http://dx.doi.org/10.1051/sfco/20156302002.

Full text
APA, Harvard, Vancouver, ISO, and other styles
