Academic literature on the topic 'Hypothèse transhumaniste'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Hypothèse transhumaniste.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Hypothèse transhumaniste"

1

Silva Souza, Renata, Edna Alves de Souza, Maria Eunice Quilici Gonzalez, and Tatiane Pereira da Silva. "The Transhumanist conception of body: a critical analysis from a complex systems perspective." Revista Natureza Humana 22, no. 1 (July 6, 2020): 17. http://dx.doi.org/10.17648/2175-2834-v22n1-431.

Abstract:
What is the conception of the human body underlying the Transhumanism project? This question guides the present analysis, undertaken from a philosophical interdisciplinary perspective. Inspired by Le Breton's (2008) and Morin's (2014) hypotheses about the complexity of the human body, we criticize the mechanistic conception of the living body underlying the Transhumanism project. Implications of the Transhumanism project for personal identity are proposed based on hypotheses of complex systems theory as a starting point for critical reflection on a possible gloomy future envisioned by the unnatural/artificial development of the transhuman body.
2

Vujačić, Sanja, and Yasser Alayli. "Transhumanism — Robotics Interactions." Journal of Energy - Energija 70, no. 2 (June 30, 2021): 7–10. http://dx.doi.org/10.37798/20217029.

Abstract:
Wrongly assuming that the humanistic ideal has already been realized, transhumanism promotes the liberation of man through the progressive overcoming of his own biological boundaries, in order to improve his physical and mental abilities and to extend his life. The augmented man was born from the interaction between biomimetic robotics and transhumanism. But a smart management application is still missing: one for a future society in which men and supermen will live together, which would make technological improvement available to everyone and halt the accelerating deepening of socio-economic inequalities. That is why transhumanism is often experienced as an opponent and competitor of humanism, just as new technological development is often perceived as a goal in itself, a phenomenon not in the service of the real needs of social development. Consequently, it is urgent to move from simple thinking (speculations, preferences, and beliefs) to complex thinking: proposing hypotheses and solutions, creating relationships and observing interactions, seeking criteria, and relying on verified facts and self-correction practices.
3

FEDOTOVA, Marina, Elena KOZLOVA, and Yin BÌN. "Artificial Intelligence Methods in Predicting the Productivity of Project Teams: Transhumanism and Experience in Practical Research." WISDOM 2, no. 1 (May 26, 2022): 43–50. http://dx.doi.org/10.24234/wisdom.v2i1.769.

Abstract:
The article considers issues related to the use of artificial intelligence methods within the technoscience concept, in the move from personnel management to human resource management using elements of artificial intelligence. The authors consider the development of human resources at the expense of the cognitive-communicative resources of personnel in specific (transformed) conditions of consciousness, when a synergy of neurocognitively enhanced human capabilities and artificial intelligence occurs. Such situations are considered in predicting the productivity of project teamwork, characterized by various aspects: organizational, cognitive-communicative, socio-psychological, etc. A specific example of predictive analytics is analyzed, related to the assessment of the future results of newly created teams (shift teams) and of results corrected on the basis of already existing teams (V. K. Finn's DSM method of automatic hypothesis generation, a way of organizing knowledge using a non-Aristotelian structure of concepts, 2009). Some difficulties of using the shift form of labour organization are considered. The methodology for predicting team assessments is based on the results of express diagnostics of their work on specific test cases and on a general database of the characteristics and results of already existing successful and unsuccessful teams.
4

Bainbridge, William Sims. "Nemesis in Mordor: The Possibility of Posthuman Savagery." Journal of Posthuman Studies 5, no. 2 (December 1, 2021): 169–89. http://dx.doi.org/10.5325/jpoststud.5.2.0169.

Abstract:
Innovative computer games can sometimes serve as valid simulations of real sociocultural processes, exploring hypotheses about the possible impact of future technology on civilization. In February 2021, Warner Brothers was granted a patent for an artificial social intelligence system that it first used in Shadow of Mordor, a very popular computer game based on the legendarium of J.R.R. Tolkien, but reversing his humanistic and precautionary values. The main theme of Tolkien’s The Lord of the Rings was development of genuine friendship in a crusade to destroy a technology that gave its user superhuman powers at the cost of replacing sympathy with total selfishness. Shadow of Mordor and its successor Shadow of War promoted sadism and enslavement as tools for transcending human limitations, implicitly slandering transhumanism. This article surveys this troubling dynamic in four parts: (1) a conceptual introduction drawing upon a diverse literature about the human dimensions of current technological progress, (2) an overview of recent developments in the genre of Tolkien computer games, (3) a close examination of how the Nemesis multiagent system was designed, and (4) an initial assessment of public reactions to the Shadows expressed through videos and text comments on YouTube.
5

Filinich, Renzo, and Tamara Jesus Chibey. "Becoming and Individuation on the Encounter between Technical Apparatus and Natural System." M/C Journal 23, no. 4 (August 12, 2020). http://dx.doi.org/10.5204/mcj.1651.

Abstract:
This essay sheds light on the framing process in research on the crossing between natural and artificial systems. To approach this, we must outline the relation between the machine and the natural system. From this notion, technology is not seen as an external thing, nor even as a contrast to an imaginary of nature, but as an effect that emerges from our thinking and revealing being, one that, in many cases, may be reduced to an issue of knowledge and action. Here, we want to consider Gilbert Simondon's concept of transduction as one possible framework for the socio-technological actions at stake. His thought offers a detailed conceptual vocabulary for the question of individuation as a "revelation process", a concern with how things come into existence and proceed temporally as projective entities. Moreover, our approach to the work of the philosopher Simondon marked the starting point of our interest in the issue of technique and its politics. From this perspective, the reflection given by Simondon in his theses on individuation and on the mode of existence of technical objects traces certain reasons that are necessary for the development of this project and helps to explain it. First, Simondon does not posit a specific regime of "human individuation". The possibility of a psychic and collective individuation is produced, as is manifest in the structure of his main thesis, at the heart of biological individuation; Simondon strongly attacks the anthropocentric tendencies that attempt to establish a defining boundary between biological and psychic reality. We may presume, then, that the issue of language as a defining and differentiating element of the human does not interest him; it is at this point that our project begins to focus on employing the transduction of the téchnē as a metaphor of life (Espinoza Lolas et al.), regarding the limits that language may imply for the formation and expression of psychic reality.
Second, this critique of the economy of attention, present across our research and in Simondon's thinking, seeks to introduce a hypothesis raised in another direction: towards the issue of technique. In the introduction to his Mode of Existence of Technical Objects, Simondon shows some urgency in the need to approach the reality of technical objects as an autonomous reality and as a configuring reality of psychic and collective individuation. Against the general importance granted to language as a key element of the historical, hermeneutical, and even ontological aspects of the human being, Simondon considers that technique is the reality that plays the fundamental role of mediating between the human being and the world. Following these observations, a possible question that guides our research arises: how do the technologisation and informatisation of cultural techniques alter the very nature of knowing and of the affection of being with others (people, things, animals)? In the hypothesis of this investigation we claim that, insofar as we deliver an approach to the technologisation of the world as a process of individuation (considering Simondon's concept of this becoming, in which an artificial agent and its medium may get out of phase to solve their tensions and give rise to physical or living individuals that constitute their system and go through a series of metastable equilibria), it is possible to prove this capacity of invention as a clear example of a form of transindividual individuation (referring to the human being): thanks to the information that the artificial agent acquires and recovers by means of its "imagination", which it integrates into its perception and affectivity, it enables the creation of new norms or artifacts installed in its becoming, as is the case of bioeconomy and cognitive capitalism (Fumagalli 219).
It is imperative to observe and analyse the fact that the concept of nature must be integrated along with the concept of cosmotechnics (Hui 3) to avoid the opposition between nature and technique in conceptual terms, which is why the following section mentions a third memory that is inscribed in this concept. There is no linear development in human history from nature to technique, from nature to politics.

The Extended Mind

The idea of memory as something transmissible is important when thinking of the present: there is no humanity outside the technical, nor prior to the technical, and it is important to safeguard this idea to highlight the phýsis/téchnē dichotomy presented by Simondon and Stiegler. It is erroneous to think that some entity may exceed the human, that it has any exteriority, when it is the materialisation of human forms, or, even more, that the human is crossed by it and is not separable from it. For the French philosopher Bernard Stiegler there is no human nature without technique, and vice versa (Stiegler 223). Here appears the issue of knowing the limits at which "the body of the human me might stop" (Hutinel 44). A first glimpse of externalised memory was the flint axe, which is made by using other tools, even when its use is unknown. Its mere existence preserves a knowledge that goes beyond whoever made it, and its transmission, whether genetic or epigenetic, is preserved beyond the organic. We raise the question of a phýsis coming from the téchnē; it is a central topic that dominates the discussion nowadays about technology and its ability to have a transforming effect on every area of contemporary life and on human beings themselves. It is being "revealed" that the true qualitative novelty of the technological improvements happening before our eyes resides not only in the appearance of new practices related to any particular scientific research.
We must point out the evident tension between bíos and zôê during this process of adaptation, which is an ontological one, but we also witness how recursivity becomes a modus operandi during this process, which is both social and technological. Just like the philosophy of nature, the philosophy of biology confronts its own limit under the light shed by recursive algorithms implemented as a dominant mode of adaptation, which is what Deleuze called societies of control (Deleuze 165). At the same time, there is an artificial selection (instead of a natural selection) imposed by the politics of transhumanism (for example, human enhancement and genetic engineering). In this direction, a first aspect to consider is that life, held as an object of power and politics, does not constitute a "natural life", but the result of a technical production from which its "nature" develops, as well as the possibilities of its deployment. Now then, it is precisely due to this gesture that Stiegler strives to distinguish between what is originary in mankind and its artefactual or artificial becoming: "the prosthesis is not a simple extension of the human body, it is the constitution of said body insofar as 'human' (the quotes belong to the constitution). It is not a 'medium' for mankind, but its end, and the essential mistakenness of the expression 'the end of mankind' is known" (Stiegler 9). Before such phenomena, it is appropriate to lay out a reflexive methodology centered on observing and analysing the aforementioned idea of Stiegler's that there is no mankind without techniques, and no technique without mankind (Stiegler 223). This implies that the idea of téchnē comprises both the techniques needed to create things and the technical products resulting from these techniques.
The word "techniques" is also ambiguous between the modern technology of machines and the primitive "tools" and their techniques, whether they have become art or craft, things that we would not necessarily think of as "technology". What Stiegler suggests here is to describe the scope of the term téchnē within an ontogenetic and phylogenetic process of the human being, offering a reflection on what we "possess as a fundamental thing" for our being as humans, which is also fundamental to how "we experience time" since the externalisation of our memory into our tools. Stiegler understands this as a "third kind" of memory, separate from the internal memory that is individually acquired by our brain (epigenetic) and from the biological evolutive memory that is inherited from our ancestors (phylogenetic); Stiegler calls this kind of evolutive process epiphylogenetic, or epiphylogenesis. Therefore, we could argue that we are defined by this process of epiphylogenesis, and that we are constituted by a past that we ourselves, as individuals, have not lived; this past is delivered to us through culture, which is the fusion of the "technical objects that embody the knowledge of our ancestors, tools that we adopt to transform our surroundings" (Stiegler 177). These supports of external memory (that is, exteriorisations of consciousness) provide a new collectivisation of consciousness that exists beyond the individual. The current trend of investigation of ontogeny and phylogeny is driven by the growing consensus, both in the sciences and in the humanities, that the living world in every one of its aspects (biological, semiotic, economic, affective, social, etc.) escapes any finite scheme of description and representation.
It is for this reason that authors such as Matteo Pasquinelli refer, more modestly, to the idea of "augmented intelligence" (9), reminding us that there is a posthuman legacy between human and machine that is still problematic, "though the machines manifest different degrees of autonomous agency" (Pasquinelli 11). For Simondon, and this is his revolutionary contribution to philosophy, one should think individuation not from the perspective of the individual, but from the point of view of the process that originated it. In other words, individuation must be thought of in terms of a process that does not take the individual for granted but understands it as a result. In Simondon's words:

If, on the contrary, one supposes that individuation does not only produce the individual, one would not attempt to pass quickly through the stage of individuation in order to arrive at the final reality that is the individual; one would attempt to grasp the ontogenesis in the entire progression of its reality, and to know the individual through the individuation, rather than the individuation through the individual. (5)

Therefore, the epistemological problem does not lie in how the téchnē flees the human domain in its course to become technologies, but in how these "exteriorization" processes (Stiegler 213) alter the very concepts of number, image, comparison, space, time, or city, to give a few examples. However, the anthropological category of "exteriorization" does not do full justice to these processes, as they work in a retroactive and recursive manner on the original techniques. Along with the concepts of text and book, the practice of reading has changed in the course of the digitalisation and algorithmisation of the processing of knowledge; along with the concept of comparison, the practice of comparison has changed, since comparison (e.g. of images) has become an operation based on the extraction of data and automatic learning.
On the other side, in reverse, we must consider, in an archaeological and media-theoretical fashion, the technological state of life as a starting point from which to ask what cultural techniques were employed in the first place. We ask: how does the informatisation of cultural techniques produce new forms of subjectivity? How does the concept of cultural techniques already imply the idea of "chains of operations" and, therefore, a permanent (retro)coupling between living and non-living agency? This reveals that classical cultural techniques such as indexation or labelling, for example, have acquired ontological powers in the Google era: only what is labelled exists; only what can be searched is absolute. At the same time, in the fantasies of the media corporations, the variety of objects that can be labelled (including people) tends to be coextensive with the world of phenomena itself (if not the real world), which will then always be only an augmented version of itself. Technology became important for contemporary knowledge only through mediation; therefore, the use of tools could not be the consequence of an extremely well-developed brain. On the contrary, the development of increasingly sophisticated tools took place at the same pace as the development of the brain, as Leroi-Gourhan attempts to prove by studying the history of tools together with the history of the human skeleton and brain. What he managed to demonstrate is that the history of technique and the history of the human being run in parallel lines; they are, if not identical, at least inextricable. Even today, the progress of knowledge is still not completely subordinated to the technological inversion (Lyotard 37). In short, human evolution is inseparable from the evolution of the téchnē, the evolution of technology. One cannot simply think of the human being as a natural animal, isolated from the external material world.
What he becomes and what he is are essentially bonded to techniques, from the very beginning. Leroi-Gourhan puts it this way in his text Gesture and Speech: "the apparition of tools as a species ... feature that marks the boundary between animals and humans" (90). Understanding the behaviour of technological systems is essential to our ability to control their actions, to harvest their benefits, and to minimise their damage. Here it is argued that this requires a wide agenda of scientific investigation to study machine behaviour, one that incorporates and broadens the biotechnological discipline and includes knowledge coming from all the sciences. In some way, Simondon sensed this encounter of knowledges and proposed the concept of the Allagmatic, or theory of operations, "constituted by a systematized set of particular knowledges" (Simondon 469). We could begin by describing a set of questions that are fundamental for this emerging field, and then explore the technical, legal, and institutional limitations on the study of technological agency.

Information, Communication and Signification

To establish the relation between information and communication, we will speak from two perspectives: first with Norbert Wiener, then with Simondon. We will see how the concept of information is essential to begin understanding communication in an artificial agent. On one side, we have Wiener's notion of information, demarcated in his project of cybernetics. Cybernetics is the study of communication and control through the inquiry into messages in animals, human beings, and machines. This idea of information arises from the interrelation with the surroundings. Wiener defines it as the "content of what is an interchange object with the external world, while we adjust to it and make it adjust to us" (Wiener 17-18). In other words, we receive and use information as we interact with the world in which we live.
It is in this sense that information is connected to the idea of feedback, defined as the exchange and interaction of information within our systems or other systems. In Wiener's own words, feedback is "the property of adjusting the future behavior to facts of the past" (31). Information, for Wiener, is influenced at the same time by the mathematical and probabilistic idea from the theory of information. Wiener refers to the amount of information, which finds its starting point in statistical mechanics, along with the concept of entropy, inasmuch as information is opposed to it. Therefore, information, by supplying a set of messages, indicates a measure of organisation. The Argentinian philosopher Pablo Rodríguez adds that "information [for Wiener] is a new physical category of the universe. [It is] the measure of organization of any entity, an organization without which the material and energetic systems wouldn't be able to survive" (2-3). In this way, information corresponds to the measure of organisation and self-regulation of a given system. Moreover, and almost in complete contrast, we have the concept given by Simondon, where information is applicable to the whole possible range: animals, machines, human beings, molecules, crystals, etc. In this sense it is more versatile, as it exceeds the domain of the technical. To understand the scope of this concept well, we will approach it through two definitions. First, in his conference Amplification in the Process of Information, in the book Communication and Information, Simondon claims that information "is not a thing, but the operation of a thing that arrives to a system and produces a transformation in there. The information can't be defined beyond this act of transformative incidence, and the operation of receiving" (Simondon 139).
From this definition follows the idea of modulation: when he refers to the "transformation" and the "act of transformative incidence", modulation corresponds to the energy that flows, amplified, during the transformation that occurs within a system. There is a second definition of information that Simondon provides in his thesis Individuation in Light of Notions of Form and Information, in which he claims that "the information signal is not just what is to be transmitted … it is also that which must be received, this is, what must adopt a signification" (Simondon 281). In this definition Simondon clearly distances himself from Wiener's cybernetics, insofar as he deals with information as that which must be received, not that which is to be transmitted. Although Simondon refers to a link between information and signification, this last aspect is not measured in linguistic terms. It rather expresses the decodification of a given code. That is, signification, and information as well, are the result of a disparity of energies, namely, of the overlaying of two possible states (0 and 1, or on and off). This is a central point of divergence from Wiener, who refers to information in terms of the transference of messages, while Simondon does so in terms of the transformation of energies. In this way, Simondon adds an energetic element to the traditional definition of information, which now works as an operation, based on the transformation of energies as a result of a disparity, or the overlaying of two possible elements, within a system (the recipient). It is according to this innovative element that modulation operates in a metastable system. And this is precisely the last concept we need to clarify: the idea of metastability and its relationship with the recipient-system. Metastability is an expression that finds its origins in thermodynamics.
Philosophy traditionally operates around the idea of the stability of being, while Simondon's proposal states that the being is its becoming. Thus, metastability is the condition of possibility of individuation insofar as the metastable medium leaves behind a remainder of energy for future individuation processes. Metastability refers to the temporal equilibrium of a system that persists in time, as it maintains within itself potential energy useful for other, future individuations. Returning to the conference Amplification in the Process of Information, Simondon points out that "the recipient metastability is the condition of efficiency of the incident information" (139). In this sense, we may claim that there is no information if the signal is not received; the recipient is therefore a necessary condition for information to be given. Simondon understands the recipient as a mixed system (a quasi-system): on one hand, it must be isolated in terms of energy, and it must have a membrane that allows it not to spend all its energy at once; on the other hand, it must be heteronomous, as it depends on an external input of information to activate the system (recipient). The metastable medium is the one suited to understanding the artificial agent, as it leaves open the possibility for potential energy to manifest and not be spent all at once, leaving a remainder useful for future modulations, so that new transformations may occur. At the same time, Simondon's concept of information is the most convenient when referring to communication and the relationship with the medium, primarily for its property of modulating potential energy. Nevertheless, it is also necessary to retrieve Wiener's idea of feedback, as it is in the relationship of the artificial agent with its surroundings (and the world) that information is given and may flow, amplified, through its system.
By this, significations manage to decode the internal code of the artificial agent, which represents the first gesture towards the opening of communication.

Conclusion

The hypotheses on extended cognition are the subject of a huge amount of debate in artistic, philosophical, and cognitive-science circles nowadays, but their implications extend far beyond metaphysics and the sciences of the mind. It is apparent that we have only just begun to scratch the surface of the social sphere in a broader way. If our minds are partially poured into our smartphones and even into our homes, then this is not a transformation of human nature, but the latest manifestation of an ancient human ontology of dynamically assembled organic cognitive and informative systems. It is to this condition that the critical digital humanities, and every form of critique, should answer. This means attempting to dig out the delays and ruptures within the systems of mass media, against the relentless belief in real time as the future, to remind us that systems always involve an encounter with a radical "strangeness" or "alienity", an incommensurability between the future and the desire that turns into the radical potential of many of our contemporary social movements and politics. Our challenge in our critical work is to dismantle the practice of representation and to reincorporate it into different forms of space and experience that are not reactionary but imaginary. What we attempt to bring to light here is the need to get every spectator to notice the limits of machinic vision and to acknowledge the role of the image in the recruitment of liminal energies for capital.
The final objective of this essay is to see that nature possesses the technique of an artist who renders contingency into necessity and inscribes the infinite within the finite; in the arts, it is not the figure of nature that corresponds to individuation but rather the artist, whose task is not only to render contingency necessary as its operation, but also to aim for an elevation of the audience as a form of revelation. The artist is the one who opens up, through his or her work, a process of transindividuation, meaning a psychic and collective individuation.

References

Deleuze, Gilles. "Post-Script on Control Societies." Polis 13 (2006): 1-7. 14 Feb. 2020 <http://journals.openedition.org/polis/5509>.
Espinoza Lolas, Ricardo, et al. "On Technology and Life: Fundamental Concepts of Georges Canguilhem and Xavier Zubiri's Thought." Ideas y Valores 67.167 (2018): 127-47. 14 Feb. 2020 <http://dx.doi.org/10.15446/ideasyvalores.v67n167.59430>.
Fumagalli, Andrea. Bioeconomía y Capitalismo Cognitivo: Hacia un Nuevo Paradigma de Acumulación. Madrid: Traficantes de Sueños, 2010.
Hui, Yuk. "On Cosmotechnics: For a Renewed Relation between Technology and Nature in the Anthropocene." Techné: Research in Philosophy and Technology 21.2/3 (2017): 319-41. 14 Feb. 2020 <https://www.pdcnet.org/techne/content/techne_2017_0021_42769_0319_0341>.
Leroi-Gourhan, André. El Gesto y la Palabra. Venezuela: Universidad Central de Venezuela, 1971.
———. El Hombre y la Materia: Evolución y Técnica I. Madrid: Taurus, 1989.
———. El Medio y la Técnica: Evolución y Técnica II. Madrid: Taurus, 1989.
Lyotard, Jean-François. La Condición Postmoderna: Informe sobre el Saber. Madrid: Cátedra, 2006.
Pasquinelli, Matteo. "The Spike: On the Growth and Form of Pattern Police." Nervous Systems 18.5 (2016): 213-20. 14 Feb. 2020 <http://matteopasquinelli.com/spike-pattern-police/>.
Rivera Hutinel, Marcela. "Techno-Genesis and Anthropo-Genesis in the Work of Bernard Stiegler: Or How the Hand Invents the Human." Liminales, Escritos Sobre Psicología y Sociedad 2.3 (2013): 43-58. 15 Dec. 2019 <http://revistafacso.ucentral.cl/index.php/liminales/article/view/228>.
Rodríguez, Pablo. "El Signo de la 'Sociedad de la Información': De Cómo la Cibernética y el Estructuralismo Reinventaron la Comunicación." Question 1.28 (2010): 1-17. 14 Feb. 2020 <https://perio.unlp.edu.ar/ojs/index.php/question/article/view/1064>.
Simondon, Gilbert. Comunicación e Información. Buenos Aires: Editorial Cactus, 2015.
———. La Individuación: A la Luz de las Nociones de Forma y de Información. Buenos Aires: La Cebra/Cactus, 2009/2015.
———. El Modo de Existencia de los Objetos Técnicos. Buenos Aires: Prometeo, 2007.
———. "The Position of the Problem of Ontogenesis." Parrhesia 7 (2009): 4-16. 4 Nov. 2019 <http://parrhesiajournal.org/parrhesia07/parrhesia07_simondon1.pdf>.
Stiegler, Bernard. La Técnica y el Tiempo I. Guipúzcoa: Argitaletxe Hiru, 2002.
———. "Temporality and Technical, Psychic and Collective Individuation in the Work of Simondon." Revista Trilogía Ciencia Tecnología Sociedad 4.6 (2012): 133-46.
Wiener, Norbert. Cibernética y Sociedad. Buenos Aires: Editorial Sudamericana, 1958.
APA, Harvard, Vancouver, ISO, and other styles

Dissertations / Theses on the topic "Hypothèse transhumaniste"

1

Dieltiens, Baptiste. "Contributions à la gestion des risques en assurance vie." Electronic Thesis or Diss., Lyon, 2021. http://www.theses.fr/2021LYSE1135.

Full text
Abstract:
Risk management is a major issue in the steering of an insurance company. Ever-larger volumes of data, increasingly sophisticated models and growing computing power now allow actuaries, risk managers and data scientists to refine their knowledge of their policyholder portfolios and of the underlying risks. This is the context of this thesis, which aims to contribute to the understanding and modeling of biometric and behavioral risks in life insurance, through three chapters introduced and contextualized in a general introduction. Chapter 1 focuses on free payments into life insurance contracts. We propose a machine-learning methodology to manage them efficiently: the model, built with the Gradient Boosting algorithm, relies on variables related to past payments as well as on variables related to the product in question and its business plan, and we show that it gives better results than a more classical methodology based on time series. In addition, analyzing the model within the SHAP (Shapley Additive Explanations) framework highlights certain stylized facts; finally, a study at a finer scale completes the work and questions the relationship between payments and surrenders or arbitrages. Chapter 2 deals with life insurance transfers, which allow a saver to invest money in a new contract while retaining some of the advantages of the original contract. In particular, we examine the Fourgous and PACTE transfers, which we present and whose main common points and major differences we highlight. We then propose a model of the Fourgous amendment using dynamic logistic regression and analyze, given the initial observations, to what extent the lessons drawn from it apply to the PACTE law. We then broaden the reflection by discussing the legislative framework and its potential impact on policyholder behavior.
Chapter 3 is devoted to longevity risk, and focuses in particular on an extreme assumption rarely considered in actuarial science: transhumanism. This assumption envisages a potentially enormous improvement in longevity through science and technology. After reviewing the state of the art on longevity and its related subjects (life expectancy and maximum biological age in particular) and the main hypotheses on its future evolution, thereby highlighting the lack of consensus and the complexity of the subject, we analyze the transhumanist assumption in more detail and discuss its implications.
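The Chapter 1 abstract names Gradient Boosting and SHAP without further detail. As a purely illustrative sketch, not the thesis's model, here is the core gradient-boosting idea in plain Python: repeatedly fit a depth-1 regression stump to the current residuals and add it to the ensemble. All feature names and data below are invented for the example.

```python
def fit_stump(X, residuals):
    """Find the (feature, threshold) split minimising squared error
    on the residuals; return a depth-1 predictor."""
    best = None
    for j in range(len(X[0])):
        for t in sorted(set(row[j] for row in X)):
            left = [r for row, r in zip(X, residuals) if row[j] <= t]
            right = [r for row, r in zip(X, residuals) if row[j] > t]
            if not left or not right:
                continue
            lm, rm = sum(left) / len(left), sum(right) / len(right)
            err = (sum((r - lm) ** 2 for r in left)
                   + sum((r - rm) ** 2 for r in right))
            if best is None or err < best[0]:
                best = (err, j, t, lm, rm)
    _, j, t, lm, rm = best
    return lambda row, j=j, t=t, lm=lm, rm=rm: lm if row[j] <= t else rm

def gradient_boost(X, y, n_rounds=100, lr=0.5):
    """Boosted sum of stumps for squared-error loss."""
    base = sum(y) / len(y)          # initial prediction: the mean target
    stumps = []
    def predict(row):
        return base + lr * sum(s(row) for s in stumps)
    for _ in range(n_rounds):
        # Each round fits the residuals left by the current ensemble.
        residuals = [yi - predict(row) for row, yi in zip(X, y)]
        stumps.append(fit_stump(X, residuals))
    return predict

# Invented toy portfolio: features = (past payment amount, contract age),
# target = next free payment.
X = [[1.0, 1], [2.0, 1], [3.0, 2], [4.0, 2], [5.0, 3], [6.0, 3]]
y = [10.0, 12.0, 20.0, 22.0, 30.0, 32.0]
model = gradient_boost(X, y)
```

In practice one would use a library implementation (e.g. scikit-learn or XGBoost) and compute SHAP values with the `shap` package for interpretation, as the abstract indicates; the sketch above only shows the boosting mechanism itself.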
APA, Harvard, Vancouver, ISO, and other styles
