Theses on the topic "Climat – Simulation par ordinateur"
Create an accurate citation in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 theses for your research on the topic "Climat – Simulation par ordinateur".
Next to every source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.
You can also download the full text of the academic publication in PDF format and read its abstract online whenever it is available in the metadata.
Browse theses on a wide variety of disciplines and organize your bibliography correctly.
Labarre, Vincent. "Représentation des processus sous-mailles dans les modèles simplifiés de climat : maximisation de la production d'entropie et modélisation stochastique". Electronic Thesis or Diss., université Paris-Saclay, 2020. http://www.theses.fr/2020UPASJ001.
Like any numerical model, the models usually used to simulate the climate system have a finite resolution, linked to the size of the grid cells on which the model equations are integrated. For a large part of the problems raised in climatology, the mesh size remains, and will surely remain, much larger than the smallest scale necessary for the integration of the equations. This problem appears in particular in the representation of the smallest vortices of atmospheric and oceanic flows. To correct the undesirable effects linked to the lack of resolution, it is necessary to introduce a model for the sub-grid phenomena. The usual approaches generally introduce parameters which are difficult to calibrate in the context of climate studies and do not always manage to agree with the observations. One of the central questions of the numerical simulation of the climate system is therefore to find the right sub-grid modeling method(s). Two approaches are investigated to resolve this problem. The first is the construction of simplified climate models based on the energy balance and the Maximization of Entropy Production (MEP) as the closure hypothesis. This approach has the particularity of not introducing additional parameters to represent the processes modeled with MEP. We show that a minimal description of the dynamics in a radiative-convective model using the MEP hypothesis allows us to represent the most important aspects of atmospheric convection. The MEP hypothesis is extended to represent time-dependent problems by applying it only to the fast components of a system. This result opens the possibility of using this type of model to represent the seasonal cycle. The second approach consists in implementing a sub-grid analysis and modeling method in a simple system: a diffusive lattice gas. The temporal variation of the coarse-grained flow of particles at different scales is analyzed before being modeled by statistical-physics methods.
The sub-grid model, constructed without any additional parameters, is based on a local stochastic relaxation equation for the particle current. The analysis method is then used to study the dynamics of vorticity in the von Kármán flow.
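The local stochastic relaxation described in the abstract can be illustrated schematically. The sketch below is not the thesis's actual model: the drift form (an Ornstein-Uhlenbeck-type relaxation of a coarse-grained current toward an equilibrium value) and all parameter values are assumptions chosen for illustration only.

```python
import numpy as np

def relax_current(j, j_eq, tau, sigma, dt, rng):
    # One Euler-Maruyama step of the illustrative relaxation equation
    # dj = -(j - j_eq)/tau * dt + sigma * dW
    return j + (j_eq - j) / tau * dt + sigma * np.sqrt(dt) * rng.normal(size=j.shape)

rng = np.random.default_rng(0)
j = np.zeros(100)            # coarse-grained particle current on a toy 1-D lattice
for _ in range(1000):        # 1000 steps of dt = 0.1, i.e. 20 relaxation times tau
    j = relax_current(j, j_eq=1.0, tau=5.0, sigma=0.1, dt=0.1, rng=rng)
```

After many relaxation times the current fluctuates around the imposed equilibrium value, which is the qualitative behavior such a closure is meant to provide at sub-grid scale.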
Houmed-Gaba, Abdourahman. "Hydrogéologie des milieux volcaniques sous climat aride : caractérisation sur site expérimental et modélisation numérique de l'aquifère basaltique de Djibouti (Corne de l'Afrique)". Poitiers, 2009. http://www.theses.fr/2009POIT2266.
The Djibouti aquifer consists of fractured basalts and scoria 1 to 9 Myr old, intercalated with sediment layers. It is located in a coastal area under semi-arid conditions. This aquifer is exploited at over 15 million m³/year for the water supply of Djibouti city. A hydrogeological research site, which now includes 11 wells, was set up on the aquifer over a 1-hectare area. Lithological logs of the wells show scoria underlying fractured basalt. Electrical conductivity profiles performed in the research-site wells show fresh groundwater overlying, in places, brackish water. Slug tests were conducted on the fractured basalts, using the Hvorslev (1951) and Bouwer & Rice (1976) solutions. Long-term pumping tests were conducted to characterize the scoria. An estimate of the average hydraulic conductivity is thus obtained for the fractured basalts (K = 10⁻⁸ m/s) and for the scoria (K = 10⁻² m/s). The long-term tests were interpreted using the semi-confined Hantush-Jacob (1955) model, which showed that the scoria are under leaky conditions. The results of the chemical analyses, using multivariate statistical tools (factor analysis), show three water types: chloride, sulphate and bicarbonate. A numerical model is developed using the pilot-points methodology in conjunction with the PEST non-linear parameter estimation and regularisation functionality. The water balance shows that the exploitation of this aquifer has reached a critical limit and cannot be increased without serious risk of degrading the resource.
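For context on the slug-test interpretation cited in this abstract, the Hvorslev (1951) solution estimates hydraulic conductivity from how quickly the water level recovers after a sudden head change in the well. The snippet below is a minimal sketch of that formula; the well geometry is hypothetical and not taken from the Djibouti research site.

```python
import math

def hvorslev_k(r_c, R, L_e, t_37):
    """Hvorslev (1951) slug-test estimate of hydraulic conductivity (m/s).
    r_c: well casing radius (m); R: well screen radius (m); L_e: screen
    length (m); t_37: basic time lag, i.e. the time for the head to recover
    to 37% of the initial displacement (s). Valid for L_e/R > 8."""
    return r_c**2 * math.log(L_e / R) / (2.0 * L_e * t_37)

# Hypothetical well geometry, NOT the Djibouti research-site values:
K = hvorslev_k(r_c=0.05, R=0.05, L_e=2.0, t_37=30.0)
```

A fast recovery (small t_37) yields a high conductivity, which is why the highly permeable scoria and the tight fractured basalts give estimates many orders of magnitude apart.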
Prianto, Eddy. "Modélisation des écoulements et analyse architecturale de performances de l'espace habitable en climat tropical humide". Nantes, 2002. http://www.theses.fr/2002NANT2037.
Bossuet, Cécile. "Étude du transport vertical de quantité de mouvement dans le modèle troposphérique-stratosphérique ARPEGE-climat". Toulouse, INPT, 1998. http://www.theses.fr/1998INPT004H.
Texto completoNahon, Raphaël. "Modélisation des échanges radiatifs à l'échelle urbaine pour un urbanisme bioclimatique". Thesis, Lille 1, 2017. http://www.theses.fr/2017LIL10130/document.
The main objective of this work is to evaluate the bioclimatism of an urban project at its early stages: its capacity to harness daylight, the energy efficiency of the projected buildings, and the thermal comfort of the outdoor environment. A first proposal for the exterior geometry is made in the mass plan. At this stage, buildings are commonly represented as mass blocks. Architectural details, such as window shapes, interior coatings or wall composition, are not defined, and daylighting inside the buildings or the thermal behavior of their envelopes cannot yet be modeled. Nevertheless, we show that it is possible to evaluate, as early as the mass-plan stage, the impact of the exterior radiative sources on the bioclimatism of the project. The concepts of sufficient and useful luminances and radiant temperatures are introduced. The first two criteria express the percentage of the year during which an outside luminous source induces a convenient illuminance inside the buildings; the third expresses the impact of an outside radiative source on the heating and cooling of the outside surfaces and on outdoor thermal comfort. We analyze their distribution on the sky vault and highlight its variability under different climates. The final objective of this thesis is to propose software to guide urban planners in their search for bioclimatic urban forms: ensuring daylight access inside the buildings, energy efficiency and outdoor thermal comfort.
Touzé-Peiffer, Ludovic. "Paramétrisation de la convection atmosphérique dans les modèles numériques de climat - Pratiques et enjeux épistémologiques". Electronic Thesis or Diss., Sorbonne université, 2021. https://accesdistant.sorbonne-universite.fr/login?url=https://theses-intra.sorbonne-universite.fr/2021SORUS539.pdf.
Historically, general circulation models have played a crucial role in warning policy makers and the general public of future climate change. However, in recent years, there has been a growing debate in the scientific community about the dominant paradigm on which these models have been developed: in particular, parameterizations, which are supposed to represent climate-relevant processes that are not resolved at the scale of the model grid, are sometimes questioned. The objective of our thesis is to conduct an epistemological analysis of parameterizations, focusing on those used to represent atmospheric convection. The latter are sometimes based on the distinction between an environment and certain coherent atmospheric structures that we will call "objects". We first look at the use of such objects in atmospheric science and at their role in our understanding of convection. We then focus on parameterizations themselves, and explain in which context convective parameterizations emerged, what their historical motivations were, and how their formalism can be interpreted and justified today. We also question the status and role of the tuning of free parameters contained in parameterizations. Finally, we expand our reflection to the comparison of several general circulation models in the Coupled Model Intercomparison Project (CMIP). The historical role and structural effects CMIP has had on climate research are analyzed.
Niane, Papa Massar. "Modélisation de la méningite bactérienne dans l'interface Environnement-Climat-Société par approche multi-agents : cas d'application au Sénégal". Electronic Thesis or Diss., Sorbonne université, 2023. http://www.theses.fr/2023SORUS535.
Bacterial meningitis presents a real modeling challenge for the scientific community due to its multiscalar nature and the factors of different kinds that must be taken into account (climatic, environmental, demographic, societal, and biological factors at the individual level). As part of this thesis, we developed a model called MenAfriSIM™, which takes temperature and dust into account through a COefficient of Meningitis Invasion and Development for Environmental eXposure (COMIDEX) built from spatialized environmental data from remote sensing and models. This is the first time that a coefficient able to encompass several climatic and environmental factors has been proposed with the aim of integrating them into a transmission model for meningitis. The model also includes a spatial interaction model that takes interurban mobility into account. MenAfriSIM™ is an explanatory model of meningitis cases at the interurban scale in a western Sahel context marked by a relative reduction in the risk of meningitis, as in the case of Senegal. The complexity of a system such as meningitis lends itself to a multi-agent approach. The model was tested on the 2012 season; the analysis of health data from the Ministry of Health confirmed the particular nature of this season, which had a record number of cases. Model evaluation showed good performance, with more than half of the total variability in meningitis cases explained by the model (R² = 0.53) and almost a third of the case variability (R² = 0.29) explained by temperature and dust alone. The model showed that the number of meningitis cases remains strongly correlated with demography: the most affected municipalities are the most populated ones, while the risk of meningitis is higher in areas where the climatic and environmental footprint is dominant. The areas of greatest meningitis risk are in the north of the country.
The "meningitis trizone" reflects the north-south gradient of meningitis risk, which decreases through the center of the country. These results are confirmed by the literature on the country's climatic domains and by the exploration of the model over the 2013 season; however, additional studies over a longer period should be considered. The results suggest increased surveillance of the northern part of the country, the starting point of meningitis risk, as well as consideration of zoning and of the spread time of meningitis from one area to another (2 to 3 weeks) in early warning systems.
Voldoire, Aurore. "Prise en compte des changements de végétation dans un scénario climatique du XXIème siècle". Toulouse 3, 2005. http://www.theses.fr/2005TOU30024.
The main objective of this work has been to run a climate simulation of the 21st century that includes not only greenhouse gases and aerosols emitted by human activity but also land-use and land-cover changes. To achieve this goal, the integrated impact model IMAGE 2.2 (developed at RIVM, The Netherlands) was used, which simulates the evolution of greenhouse gas concentrations as well as land-cover changes. This model was coupled to the general circulation model ARPEGE/OPA provided by the CNRM. Before coupling the models, sensitivity experiments with each model were performed to test their respective sensitivity to the forcing of the other. Ultimately, a simulation with the two models coupled together has shown that interactions between climate and vegetation are not of primary importance for century-scale studies.
Guillemot, Hélène. "La modélisation du climat en France des années 1970 aux années 2000 : histoire, pratiques, enjeux politiques". Paris, EHESS, 2007. http://www.theses.fr/2007EHES0149.
Our thesis relates to the history of climate modelling from the end of the nineteen-sixties to the beginning of this century, focusing on the modelling practices in the two centres developing a climate model in France: the Laboratoire de Météorologie Dynamique (LMD) of the CNRS (and the Institut Pierre Simon Laplace, a federation of laboratories in and around Paris) and the national weather forecasting organization, Météo-France. Starting with the first numerical climate models, we trace the evolution of modelling at LMD and Météo-France, and compare the institutions, the careers of the researchers and the very different ways of working in these two organisms, determined by their institutional cultures. We describe several modelling practices, in particular the parametrization and validation of models against data, and we analyse the specificities of the scientific practices related to the use of computers. Returning to a critical transitional period in the history of modelling in France, the beginning of the nineties, when institutional and scientific reconfigurations allowed the coupling of models and simulations of future climate, we analyse the way French modellers confronted the problem of climate change, especially their contribution to IPCC climate predictions. Finally, we address the expansion of climate modelling to the "Earth System", integrating other environments, cycles and interactions, and we discuss the changes this expansion is generating in the working practices of modellers.
Sicard, Marie. "Modéliser les évolutions du climat de l'Arctique et de la calotte groenlandaise pendant le dernier interglaciaire pour en comprendre les mécanismes". Electronic Thesis or Diss., université Paris-Saclay, 2021. http://www.theses.fr/2021UPASJ017.
The Last Interglacial (129–116 ka BP) is one of the warmest periods of the last 800 ka at many locations. This period is characterized by a strong orbital forcing leading to a different seasonal and latitudinal distribution of insolation compared to today. These changes in insolation result in a temperature increase in the high latitudes of the Northern Hemisphere and a rise in sea level of 6 to 9 m above present. The Last Interglacial therefore represents a good case study given the risks of melting ice sheets under current and future warming. It is also an opportunity to identify and quantify the mechanisms causing polar amplification in a climate warmer than today. Within the framework of the CMIP6-PMIP4 model intercomparison project, I analyzed the lig127k snapshot run with the IPSL-CM6A-LR climate model. In the Arctic region (60-90°N), the insolation variations induce an annual warming of 0.9°C compared to the pre-industrial period (1850), reaching up to 4.0°C in autumn. Investigating changes in the Arctic energy budget relative to the pre-industrial period highlights the crucial roles of changes in sea ice cover, ocean heat storage and cloud optical properties in the Last Interglacial Arctic warming. As a result of climate change over the Last Interglacial, the GRISLI ice sheet model simulates a Greenland ice loss of 10.7-57.1%, corresponding to a sea level rise of 0.83-4.35 m and a 0.2°C additional warming in the Arctic region. These estimates illustrate the crucial role of polar ice sheets in the climate system. To better assess ice sheet-climate feedbacks in the Arctic, I have therefore carried out a preliminary study using the ICOLMDZOR model, which includes the new dynamical core DYNAMICO developed at the IPSL.
This study shows that the use of high-resolution atmospheric fields improves the calculation of the surface mass balance in Greenland. Finally, the comparison between past and future Arctic energy budgets reveals that the processes causing Arctic warming during the Last Interglacial and in the near future are similar.
Torres, Olivier. "Représentation des flux turbulents à l’interface air-mer et impact sur les transports de chaleur et d’eau dans un modèle de climat". Electronic Thesis or Diss., Université Paris-Saclay (ComUE), 2019. http://www.theses.fr/2019SACLV002.
The turbulent fluxes at the air-sea interface represent the link between the ocean and the atmosphere and therefore play a major role in the climate system. In climate models, turbulent processes are subgrid-scale processes, not explicitly resolved, and must therefore be parameterized. They are estimated from atmospheric and oceanic state variables using mathematical models called "bulk parameterizations". This thesis aims to characterize and understand the links between the representation of turbulent fluxes at the air-sea interface and the behavior of a climate model at different time scales in tropical regions. To study these links, I developed a modeling strategy using an atmospheric 1D model (SCM), an oceanic (OGCM) or atmospheric (AGCM) general circulation model, and a coupled model (GCM). The analysis of SCM simulations allows us to study the direct response of a model to modifications of the turbulent flux parameterization. It is shown that this parameterization regulates the amount of water, energy and momentum available to the system, and therefore its behavior. It can represent more than 60% of the simulated latent heat flux differences between two climate models in convective periods. The spatial impact of the parameterization of turbulent fluxes is studied through AGCM simulations. These highlight the link between the parameterization, its effect on large-scale moisture and temperature gradients, and thus its influence on atmospheric circulation. The study of OGCM simulations underlines the leading role of the wind in the behavior of the tropical oceans. While the wind drives changes in SST through its impact on ocean dynamics, mainly on the equatorial undercurrent, humidity, temperature and radiative fluxes only influence the ocean surface and are therefore of lesser importance. Finally, the analysis of GCM simulations highlights the feedbacks and the adjustment generated by the modification of turbulent fluxes.
When coupling the two components, the ocean acts as a buffer and absorbs the modification of the turbulent fluxes, which leads to a modification of the SST. The resulting adjustment modifies the atmospheric variables, leading to a new equilibrium state of the system. The parameterization of surface turbulent fluxes acts at first order on the energy equilibrium of a coupled model and can therefore lead to different simulated climate states. Since this study is focused on the tropics, an interesting perspective would be to extend the study of the turbulent flux representation to other spatio-temporal scales (e.g. extra-tropical areas, daily frequency). This would make it possible to validate, on a global scale, the systematic behavior of the parameterizations identified in this thesis.
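For readers unfamiliar with the "bulk parameterizations" this abstract refers to, a minimal sketch follows. It uses the standard bulk aerodynamic form with constant exchange coefficients; the coefficient and input values are illustrative assumptions, not those of the schemes compared in the thesis (real schemes make the coefficients depend on stability, wind speed and sea state).

```python
RHO_AIR = 1.2     # surface air density, kg/m^3
CP = 1004.0       # specific heat of air at constant pressure, J/(kg K)
LV = 2.5e6        # latent heat of vaporization, J/kg
CH = CE = 1.2e-3  # illustrative constant exchange coefficients

def bulk_fluxes(wind, t_sea, t_air, q_sea, q_air):
    """Sensible (SH) and latent (LH) heat fluxes in W/m^2, positive upward,
    from the standard bulk aerodynamic formulas."""
    sh = RHO_AIR * CP * CH * wind * (t_sea - t_air)
    lh = RHO_AIR * LV * CE * wind * (q_sea - q_air)
    return sh, lh

# Plausible tropical-ocean values: 7 m/s wind, 1.5 K air-sea temperature
# difference, 6 g/kg air-sea humidity difference (all invented for illustration)
sh, lh = bulk_fluxes(wind=7.0, t_sea=302.0, t_air=300.5, q_sea=0.022, q_air=0.016)
```

Even this crude form makes the thesis's point visible: the fluxes scale linearly with the exchange coefficients, so two schemes with different coefficients hand the coupled system different amounts of water and energy.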
Abdelkader, di Carlo Isma. "Lien entre la variabilité tropicale aux échelles interannuelles à multi-décennales et l'évolution du climat moyen au cours de l'Holocène". Electronic Thesis or Diss., université Paris-Saclay, 2024. http://www.theses.fr/2024UPASJ005.
The center of action of ENSO, the main mode of tropical interannual variability, is located in the tropical Pacific Ocean. Despite the uncertainty regarding the impact of climate change on its evolution, indications suggest that the mean climate state influences its variability. Observations do not allow for the study of connections between changes in mean climate characteristics and long-term changes. I focused on the evolution of ENSO over the last 6000 years. For this period, numerous high-resolution climate archives (corals, speleothems, bivalves) allow for the study of these variability changes. Transient simulations from models, the same ones used for future projections, offer new possibilities to study long-term changes. I paid particular attention to the thermocline feedback, which could explain the reduced ENSO variability in the Holocene. I used a Lagrangian tool that allowed me to identify variations in three major sources feeding the equatorial Pacific thermocline, mainly from the South Pacific. Contrary to the hypotheses put forth, the temperature evolution of these sources did not seem sufficient to explain the observed change in the Pacific equatorial thermocline during the Holocene. Another part of my thesis focused on examining the diversity of ENSO. Using distinct methods to characterize El Niño events in the eastern (EP) and central (CP) Pacific, opposing results were obtained regarding the evolution over 6000 years of the EP-to-CP variance ratio. When properly interpreted, ENSO diversity metrics revealed an increase in ENSO variance during the Holocene, with a spatial pattern that expands to the west and east of the Pacific.
Differences between metrics are the source of contradictory results in the literature. Next, I explored teleconnection changes with the Indian Ocean, particularly examining the interaction between ENSO and the Indian Ocean Dipole (IOD), taking into account variations in Earth's orbit, greenhouse gases, and ice sheets. Analyzing a transient simulation over two million years, I showed a negative correlation between ENSO and IOD variability depending on the longitude of the perihelion. During periods of strong ENSO and IOD variability, the mean state, and primarily seasonality, influenced the development of each variability mode. Furthermore, I established that the western Indian Ocean had its own response to insolation, and that the eastern Indian Ocean drove IOD variability on orbital timescales. I returned to the Holocene in the last part of my thesis, with five transient simulations. I confirmed how ENSO variability evolves in connection with changes in the mean state and showed that results are more uncertain in the Indonesian sector. I found greater consistency in the results on atmospheric teleconnection changes over the last 6000 years, via the Walker circulation and the Intertropical Convergence Zone (ITCZ). Comparison with speleothems in the Indonesian sector showed agreement with changes in the position of the ITCZ. Interpretations of coral hydrological changes emphasized the regional complexity of the Indonesian sector. In conclusion, the various aspects addressed in this PhD thesis highlight the importance of considering variability in the context of a changing mean state, emphasizing the crucial role of transient simulations in understanding chaotic changes over time. The results open new avenues for analyzing long-term climate variability, taking into account the specificities of each climate reconstruction.
Quilcaille, Yann. "Retour sur les scénarios climatiques et d'émissions à l'aide d'un modèle compact du système Terre". Electronic Thesis or Diss., Université Paris-Saclay (ComUE), 2018. http://www.theses.fr/2018SACLV041.
This thesis puts into perspective different elements of socio-economic scenarios from a climate change modelling point of view. These elements contribute to improving the comprehension of the current state of climate science regarding the scenarios. At the same time, they demonstrate the potential of the recent reduced-form Earth System Model OSCAR v2.2. The first element concerns the uncertainty of emissions. Although emission inventories are uncertain, the impact of these uncertainties on climate change is unknown. We quantify this impact for fossil-fuel emissions, the major contributor to climate change. We show that the uncertainties in emissions are expected to increase with the use of non-conventional fuels, but that they do not significantly increase the uncertainty from Earth system modelling in variables such as the increase in global surface temperature. The second element is a climate assessment of the recent Shared Socio-economic Pathways (SSP) scenarios. We identify gaps in the SSP database, and we complete it to calculate the climate projections under these scenarios. Our conclusions suggest inconsistencies in CO2 emissions from Land Use Change (LUC) calculated by the Integrated Assessment Models and in the associated land variables. We identify trade-offs between greenhouse gases in the mitigation of climate change. Using a robust assessment, new carbon budgets are proposed. The uncertainties in increases in global surface temperature are discussed. The third element concerns negative emissions. Most climate scenarios limiting global warming to well below 2°C above preindustrial levels, thus respecting the Paris Agreement, use negative emissions. Using a developed version of OSCAR v2.2, we evaluate the implications for the Earth system of different aspects of Carbon Dioxide Removal (CDR) technologies.
We identify the reversibility of the different components of the Earth system and calculate the cooling potential of carbon dioxide removal technologies. We also show that the potential of afforestation/reforestation techniques may be impeded by the change in albedo, and that the potential of oceanic enhanced weathering may be lower than expected. Overall, this thesis identifies gaps in the current development of scenarios. Some do not undermine current conclusions regarding climate change, such as the uncertainties in emission inventories. Others call for further analysis, such as the inconsistencies in the use of CO2 emissions from LUC or the possible overestimation of the potential of some CDR technologies. It emphasizes the need for urgent mitigation of climate change.
Batton-Hubert, Mireille. "Intégration d'une simulation spatio-temporelle à un modèle topologique et numérique de terrain". Paris 6, 1993. http://www.theses.fr/1993PA066116.
Coron, Laurent. "Les modèles hydrologiques conceptuels sont-ils robustes face à un climat en évolution ? Diagnostic sur un échantillon de bassins versants français et australiens". Electronic Thesis or Diss., Paris, AgroParisTech, 2013. http://www.theses.fr/2013AGPT0030.
Hydrologists are asked to estimate the medium- and long-term evolution of water resources. To answer these questions, they commonly use conceptual models. In addition, they are often required to provide an estimate of the uncertainties associated with model projections. This raises the question of the robustness of conceptual models, especially in the context of climate evolution. Indeed, using a model in conditions different from those of calibration rests on the hypothesis of parameter transferability, i.e. the possibility of using model parameters in conditions different from those used for the model set-up. We focus on this issue with the aim of answering the following questions: What is the robustness level of conceptual hydrological models in the context of changing climatic conditions? What are the causes of the lack of robustness, and are there ways to prevent it? We answer these questions by studying the performance of conceptual models through multiple tests of temporal transfer of their parameters. Results show the existence of correlations between robustness problems and the difference in climate conditions between model calibration and validation periods. The analysis especially points out situations of systematic bias correlated with differences in air temperature. However, results are heterogeneous across our catchment set, and the climate variables or error types associated with the identified problems vary between catchments. The analysis of simulation biases on catchments where the models are not robust shows alternating phases of flow under- or overestimation, with a possible bias in the mean flow of up to 20% over a ten-year period. Our work reveals that very similar results can be obtained for various periods or calibration methods. The robustness issues faced by the conceptual models used in this study do not solely stem from inadequate calibrations leading to the selection of parameters unable to reproduce the catchment behavior.
They seem to be the consequence of overall difficulties for models to satisfactorily simulate water balances simultaneously on various periods.This work opens reflections on the limited capacity of some hydrological models to reproduce low-frequency dynamics and raises questions on the role of inputs estimates errors in model failures, especially the temporal variations of evapotranspiration
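The temporal-transfer (split-sample) testing described in this abstract can be illustrated with a deliberately minimal sketch: a toy one-parameter model and synthetic data (both invented here for illustration; the thesis uses real catchments and full conceptual models). The model is calibrated on a wet period and transferred to a drier one where the catchment loses extra water to evaporation that the model does not represent, producing exactly the kind of systematic mean-flow bias the abstract describes.

```python
import numpy as np

def linear_reservoir(precip, k):
    """Toy one-parameter rainfall-runoff model: a linear store drained at rate k."""
    storage, flow = 0.0, []
    for p in precip:
        storage += p
        q = k * storage
        storage -= q
        flow.append(q)
    return np.array(flow)

def calibrate(precip, obs):
    """Brute-force calibration of k over one climatic period."""
    ks = np.linspace(0.01, 0.99, 99)
    errors = [np.mean((linear_reservoir(precip, k) - obs) ** 2) for k in ks]
    return ks[int(np.argmin(errors))]

def mean_flow_bias(sim, obs):
    """Relative bias in mean flow over the evaluation period (%)."""
    return 100.0 * (sim.mean() - obs.mean()) / obs.mean()

# Synthetic experiment: calibrate under a wet climate, evaluate under a dry one
# where 20% of rainfall is lost to extra evaporation the model cannot represent.
rng = np.random.default_rng(0)
wet = rng.gamma(2.0, 5.0, 3650)          # ten years of daily rainfall (mm)
dry = 0.7 * rng.gamma(2.0, 5.0, 3650)    # drier validation climate
obs_wet = linear_reservoir(wet, 0.3)
obs_dry = linear_reservoir(0.8 * dry, 0.3)

k_wet = calibrate(wet, obs_wet)          # parameters identified on the wet period
sim_dry = linear_reservoir(dry, k_wet)   # transferred to the dry period
print(f"k = {k_wet:.2f}, mean-flow bias = {mean_flow_bias(sim_dry, obs_dry):+.1f}%")
```

Because the missing evaporative loss is absorbed nowhere in the transferred parameter, the simulated mean flow overestimates the observed one on the validation period.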
Truong, Chi Quang. "Integrating cognitive models of human decision-making in agent-based models : an application to land use planning under climate change in the Mekong river delta". Electronic Thesis or Diss., Paris 6, 2016. http://www.theses.fr/2016PA066580.
Texto completo
The initial goal of this thesis has then been to address this problem by proposing, on the one hand, a cognitive approach based on the Belief-Desire-Intention (BDI) paradigm to represent the decision-making processes of human actors in agent-based models and, on the other hand, a validation of this approach in a complete land-use change model in which most of the factors cited above have also been simulated. The outcome of this work is a generic approach, which has been validated in a complex integrated land-use change model of a small region of the Vietnamese Mekong Delta. Our main contributions have been:
• the integration of the BDI architecture within an agent-based modeling platform (GAMA);
• the design of the Multi-Agent Based Land-Use Change (MAB-LUC) framework, which can take into account the farmers' decision-making in land-use change processes;
• the proposal of a solution to assess the socio-economic and environmental factors in land-use planning and to integrate the MAB-LUC framework into the land-use planning process.
I conclude by showing that this work, designed in a generic fashion, can be reused and generalized for the modeling of complex socio-ecological systems where individual human factors need to be represented accurately.
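The perceive-deliberate-act cycle at the heart of the BDI paradigm can be sketched in a few lines. This is only an illustrative Python toy (GAMA models are written in GAML, and the class, land uses and profit figures below are invented, not the MAB-LUC model):

```python
class FarmerBDI:
    """Minimal belief-desire-intention loop for a land-use decision.
    Illustrative only: names and rules are invented placeholders."""

    def __init__(self, parcel_profit):
        self.beliefs = {"profit": dict(parcel_profit)}  # perceived profit per land use
        self.desires = ["maximize_income"]
        self.intention = None

    def perceive(self, observed_prices):
        # Update beliefs from the (simulated) environment.
        self.beliefs["profit"].update(observed_prices)

    def deliberate(self):
        # Commit to the intention that best serves the active desire.
        if "maximize_income" in self.desires:
            self.intention = max(self.beliefs["profit"],
                                 key=self.beliefs["profit"].get)

    def act(self):
        return self.intention  # the land use adopted this season

farmer = FarmerBDI({"rice": 100, "shrimp": 80})
farmer.perceive({"shrimp": 150})   # a market shift is observed
farmer.deliberate()
print(farmer.act())  # -> 'shrimp'
```

The point of the architecture is that beliefs can change between cycles without rewriting the decision rule, which is what makes the agents' behaviour adaptive.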
Feng, Yang. "Study of the climate variability and the role of volcanism in the North Atlantic-Mediterranean sector during the last millennium". Electronic Thesis or Diss., Sorbonne université, 2022. http://www.theses.fr/2022SORUS038.
Texto completo
This PhD work aims at studying the role of volcanism in influencing winter climate variability (especially the North Atlantic Oscillation, NAO) over the North Atlantic-Mediterranean sector at the inter-annual scale. The first part is devoted to characterizing the simulated NAO signal in winters following stratospheric volcanic eruptions, using three long transient simulations of the past millennium (500-1849 CE) by IPSL-CM6A-LR in the frame of PMIP4. The robustness and sensitivity of the response with respect to the latitude, season and strength of the eruptions are also explored. The second part goes further, decrypting the physical mechanisms associated with the different components of volcanic radiative forcing (surface cooling and stratospheric warming). The work focuses on three 25-member ensemble simulations by IPSL-CM6A-LR following the VolMIP protocol for the well-observed Mt. Pinatubo tropical eruption (Philippines, June 1991). Sensitivity experiments indicate that the surface positive NAO signature in our model experiments is primarily attributable to heating in the lower tropical stratosphere, which generates stronger subtropical zonal winds through thermal wind balance and accelerates the polar vortex. Stationary planetary wave propagation also plays an indispensable modulating role.
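The winter NAO signal discussed here is commonly measured as a normalized sea-level-pressure difference between the Azores high and the Icelandic low. The sketch below computes such a station-based index for a synthetic 25-member post-eruption ensemble; all pressure values are invented toy numbers, not IPSL-CM6A-LR output:

```python
import numpy as np

# Station-based NAO index: SLP difference (Azores minus Iceland), normalized
# by a control climatology. Toy values throughout.
rng = np.random.default_rng(1)

# Control climatology: 500 winters without eruptions.
ctrl = (1021 + rng.normal(0, 2, 500)) - (1005 + rng.normal(0, 3, 500))

# 25-member post-eruption ensemble, with an imposed strengthening of the
# meridional pressure gradient standing in for the positive-NAO-like response.
azores  = 1021 + rng.normal(0, 2, 25) + 1.5
iceland = 1005 + rng.normal(0, 3, 25) - 1.5
index = (azores - iceland - ctrl.mean()) / ctrl.std()

print(f"ensemble-mean winter NAO index: {index.mean():+.2f}")
```

Averaging the index over the ensemble members, as done for the VolMIP ensembles, separates the forced response from internal variability.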
Coron, Laurent. "Les modèles hydrologiques conceptuels sont-ils robustes face à un climat en évolution ? Diagnostic sur un échantillon de bassins versants français et australiens". PhD thesis, AgroParisTech, 2013. http://pastel.archives-ouvertes.fr/pastel-00879090.
Texto completo
Béraud, Nicolas. "Fabrication assistée par ordinateur pour le procédé EBM". Thesis, Université Grenoble Alpes (ComUE), 2016. http://www.theses.fr/2016GREAI052/document.
Texto completo
The Electron Beam Melting (EBM) process builds metallic parts from powder. Thanks to the geometric and mechanical quality of the parts produced, the EBM process can be used to manufacture functional parts and not only prototypes. This process, together with other metal additive processes, makes it possible to consider a transition from 3D printing to metal additive manufacturing. The use of additive manufacturing in an industrial environment requires compliance with quality, cost and time criteria for the parts produced. The production of manufactured parts involves a series of numerical stages called the numerical chain, and this chain has a significant impact on the three criteria mentioned above. Thus, this thesis provides an answer to the following question: how can Computer Aided Manufacturing improve the quality, cost and time of the EBM manufacturing process? This problem is addressed through an underlying question: what are the required characteristics of a Computer Aided Manufacturing system adapted to the EBM process? In order to answer it, the current numerical chain is analyzed. Three main limitations are found:
- the use of the STL file format;
- the process cannot be optimized at different scales;
- the process cannot be simulated.
To solve these issues, a CAM environment is proposed. It allows the centralization of all numerical operations in a single environment, in which all supported formats can be used, such as native CAD file formats or the STEP format. Software developments were carried out to prove the feasibility of such an environment. The CAM environment implementation reveals the crucial role of simulation in this system. It is therefore necessary to answer a second question: how can an EBM process simulation be obtained that allows process parameters to be developed virtually? Although EBM simulation is a recurrent subject in the scientific literature, existing studies are based on the finite element method, whose calculation time is too long for use in a CAM environment. Thus, an alternative type of simulation is created in this thesis: a simulation based on abacuses (lookup tables). It is composed of a finite element model that generates heat maps for standard cases of heating and cooling. These heat maps are then transformed into abacuses. The simulation algorithm searches for the abacus nearest to the simulated situation in order to estimate the temperatures at the next time step. This simulation method reduces the calculation time while keeping sufficient precision to optimize process parameters. With the abacus-based simulation, a tool for the optimization of melting strategies is developed. This tool improves the quality of the produced parts through the calculation of melting strategies according to thermal criteria. To summarize, the main contributions of this work are:
- the definition of the requirements specification of an efficient numerical chain for the EBM process;
- the development of a CAM environment adapted to the EBM process;
- the proposal of a fast, abacus-based simulation of the EBM process;
- the development of a tool for the optimization of melting strategies.
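The nearest-abacus lookup described above can be reduced to a few lines: precomputed (state, temperature-increment) pairs replace the expensive finite-element solve at run time. The states and increments below are invented 0-D toy values standing in for the thesis's finite-element heat maps:

```python
import numpy as np

# Each abacus entry pairs a coarse thermal-state descriptor with a
# precomputed temperature increment for one time step (toy values).
abacus_states = np.array([[20.0, 0.0],     # (temperature, beam power on/off)
                          [20.0, 1.0],
                          [500.0, 0.0],
                          [500.0, 1.0]])
abacus_deltas = np.array([-2.0, 40.0, -15.0, 25.0])   # dT per time step

def step_temperature(state):
    """Advance one time step by looking up the nearest precomputed case.
    A real implementation would normalize the state components so that
    temperature does not dominate the distance."""
    state = np.asarray(state, dtype=float)
    nearest = int(np.argmin(np.linalg.norm(abacus_states - state, axis=1)))
    return state[0] + abacus_deltas[nearest]

t = 20.0
t = step_temperature([t, 1.0])   # beam on from a cold state -> heats up
t = step_temperature([t, 0.0])   # beam off -> cools down
print(t)  # -> 58.0
```

The trade-off is exactly the one the abstract states: each step is a table lookup instead of a PDE solve, so speed is gained at the cost of the precision of the tabulated cases.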
Djian, Francis. "Modélisation thermique des thermostats pour oscillateurs à quartz et applications". Besançon, 1991. http://www.theses.fr/1991BESA2021.
Texto completo
Louin, Jean-Charles. "Effets d'hétérogénéités de teneur en carbone sur les cinétiques de transformations de phase et sur la genèse des contraintes internes lors du refroidissement d'aciers". Vandoeuvre-les-Nancy, INPL, 2003. http://www.theses.fr/2003INPL077N.
Texto completo
Heat treatment is a process that requires the control of both final microstructures and residual stresses and deformations. Numerical simulation is a useful tool for a better optimization of this process. The aim of our work was to contribute to the development of a numerical tool for the prediction of microstructures, stresses and strains during the cooling of pieces that may contain chemical heterogeneities, particularly carbon content heterogeneities. Firstly, an existing model for the prediction of transformation kinetics in steels has been further developed in order to take into account the effects of the carbon enrichment of austenite, due to a partial ferritic transformation, on the subsequent transformations. Coupled thermal, metallurgical and mechanical calculations have then been performed to study the effects of carbon content gradients on the microstructural evolutions and on the development of residual stresses during cooling. In particular, the possible effects of solidification macro- and mesosegregations have been quantified in massive cylinders with sizes close to that of an ingot. Secondly, experimental validations have been performed for homogeneous cylindrical specimens (40CrMnMo8 steel) and for a chemically heterogeneous specimen specifically designed for our study. The complete set of input data necessary for the simulations has been established from experimental characterizations of the steel. The role of chemical heterogeneity has been analysed through the experimental and calculated results. Finally, a good correlation has been obtained between measurements and calculations of the deformation during cooling of a 3D "croissant"-shaped specimen.
Sech, Nicolas Le. "Photocathodes à base de nanotubes de carbone sur substrats semi-conducteurs de type III-V. Application aux amplificateurs hyperfréquence". Palaiseau, Ecole polytechnique, 2010. http://pastel.archives-ouvertes.fr/docs/00/50/14/43/PDF/These_N_Le_Sech.pdf.
Texto completo
Fauchet, Gauthier. "Modélisation en deux points de la turbulence isotrope compressible et validation à l'aide de simulations numériques". Lyon 1, 1998. http://www.theses.fr/1998LYO10027.
Texto completo
Menezla, Rabea. "Réalisation d'un logiciel de résolution de l'équation de poisson à trois dimensions : Simulation numérique tridimensionnelle du claquage des composants à jonctions P-N". Ecully, Ecole centrale de Lyon, 1990. http://www.theses.fr/1990ECDL0027.
Texto completo
Joly, Cécile. "Simulations numériques d'un jet rond turbulent et de son interaction avec un tourbillon de sillage". Université de Marne-la-Vallée, 2002. http://www.theses.fr/2002MARN0147.
Texto completo
The general context of this study concerns the impact of contrails, the familiar white plumes frequently observed in aircraft wakes, on the atmosphere. From an aerodynamic point of view, the formation of contrails is characterised by the interaction between a turbulent jet and a wing-tip vortex. The aim of this thesis is to contribute to a better understanding of the thermal and dynamic phenomena involved in this flow. This work is based on the numerical resolution of the three-dimensional, unsteady and compressible Navier-Stokes and energy conservation equations. Two approaches are considered: direct numerical simulation and large eddy simulation. A temporal simulation of the transition to turbulence of a non-isothermal jet is performed without accounting for the vortex flow field. At the end of this simulation, a vortex model is superimposed on the jet flow field. The first part of this thesis describes the two approaches, the different subgrid models chosen for the large eddy simulations, and the numerical techniques employed. The second part is devoted to the jet flow simulation, where the objective is to determine the subgrid model appropriate to this flow configuration. The third part is dedicated to the simulation of the interaction between the jet and the vortex, and the results are compared to experimental data. The simulations have demonstrated the development of large-scale structures all around the vortex core, with the temperature field becoming concentrated within these structures.
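The initialization step mentioned above, superimposing a vortex model on the jet field, can be sketched with simple analytical profiles. Both profiles below (a Lamb-Oseen vortex, a common wake-vortex model, and a Gaussian jet) are illustrative stand-ins, not the DNS/LES fields of the thesis:

```python
import numpy as np

def lamb_oseen(x, y, gamma=1.0, rc=0.2):
    """In-plane velocity of a Lamb-Oseen vortex centred at the origin
    (circulation gamma, core radius rc)."""
    r = np.sqrt(x**2 + y**2) + 1e-12
    v_theta = gamma / (2 * np.pi * r) * (1 - np.exp(-(r / rc) ** 2))
    return -v_theta * y / r, v_theta * x / r

def round_jet_axial(x, y, u0=1.0, rj=0.5):
    """Gaussian axial-velocity profile of a round jet, seen in a cross-plane."""
    return u0 * np.exp(-(x**2 + y**2) / rj**2)

# Superimpose the vortex on the jet cross-plane at one sample point.
x, y = 0.3, 0.0
u = round_jet_axial(x, y)        # axial component from the jet
v, w = lamb_oseen(x, y)          # cross-plane components from the vortex
print(f"axial u = {u:.3f}, cross-plane (v, w) = ({v:.3f}, {w:.3f})")
```

Evaluating both fields on the simulation grid and adding them gives the combined initial condition from which the interaction is then integrated in time.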
Quatravaux, Thibault. "Évolution de la modélisation du procédé VAR : contribution à la description de la dispersion inclusionnaire dans le puits liquide et à la prévention de défauts de solidification". Vandoeuvre-les-Nancy, INPL, 2004. http://www.theses.fr/2004INPL037N.
Texto completo
This thesis deals with the modelling of the Vacuum Arc Remelting (VAR) process using the numerical software SOLAR. The first aim of the study is a better description of several physical phenomena which occur during melting, in order to extend the application of the software to simulate the remelting of steel. The evolution in the modelling of transport phenomena in the secondary ingot is based on three major improvements:
- lateral thermal transfer modelling, to take into account the formation of a gap between the ingot and the mould walls, a possible injection of a neutral gas, and the heating of water in the coolant circuit according to its flow,
- a better turbulence model, since the k-ε model previously implemented in the numerical code was not accurate enough to correctly describe the flow of the liquid metal in the pool,
- a new method to simulate the ingot growth, based on a cyclic operation of splitting and growth/migration of control volumes, which reproduces the continuous growth of the secondary ingot and allows for the refinement of the mesh close to the free surface.
Finally, the improved model has been validated by comparison with experimental results obtained from four remeltings carried out on full-scale furnaces. The second objective is the characterization of the quality of the manufactured products in terms of inclusion cleanliness and the risk of generating freckle-type segregated channels. In order to describe the behaviour of inclusions in the liquid pool, a trajectory model, adapted to account for turbulent flows, was validated and then implemented in the code, and various particle behaviours were distinguished. A study of the risk of freckle generation led to the establishment of a criterion particularly well adapted to the process. A generalization of this criterion, suggested in this work, would allow the prediction of the probable orientation of such segregated channels.
Ribot, Bénédicte. "Modélisation numérique d'un système de ventilation d'un tunnel routier par une trappe de désenfumage dans le cas d'un incendie". Lyon 1, 1999. http://www.theses.fr/1999LYO10195.
Texto completo
Pignolet-Tardan, Florence. "Milieux thermiques et conception urbaine en climat tropical humide : Modélisation thermo-aéraulique globale". Lyon, INSA, 1996. http://www.theses.fr/1996ISAL0066.
Texto completo
In order to reduce design errors in urban planning, a source of discomfort for users, designers of urban spaces are looking for the expert knowledge and practical rules that the presented calculation code can provide. Designed to become a helpful design tool, CODYFLOW simulates the microclimate in the vicinity of buildings. The first task of our work was to define the subjects studied, the urban fabric of Reunion Island presenting a cultural and historical diversity. Since the modelling of thermal and airflow exchanges leads to heavy computer code, instead of dealing with the urban area as a whole we focused our study on the elementary urban units that compose it (street, square). These urban units were described exhaustively, thanks to an urbanistic study. A systemic approach allowed us to build the general structure of the calculation code, which takes the form of an assembly of units, each of them describing the thermal behaviour of a part of the physical system. Each of these units is driven by climatic factors: air temperature, wind, sunshine and humidity. The observed response is the field of temperature, air speed and humidity characterizing the microclimate generated by the urban unit. These parameters allow us to predict the comfort or discomfort sensation felt by the user.
Yu, Qizhi. "Modèles de rivières animées pour l'exploration intéractive de paysages". Grenoble INPG, 2008. http://www.theses.fr/2008INPG0126.
Texto completo
Rivers are ubiquitous in nature, and thus are an important component of the visual simulation of natural scenes. In nature, rivers are dynamic, so their animation is necessary in these visual simulation applications. However, the animation of rivers is a challenging problem: it incorporates multi-scale surface details and flow motion, and many of the phenomena involved have complex underlying physical causes. River animation is particularly difficult in emerging interactive applications like Google Earth or games, which allow users to explore a very large scene and observe rivers at a very small or very large scale at any moment. Controlling the design of water simulations is another hard problem. The goal of this dissertation is to achieve real-time, scalable, and controllable river animation with a detailed and space-time continuous appearance. To achieve this goal, we break the river animation problem down into macro-, meso-, and micro-scale subproblems, from coarse to fine, and propose models for each scale that capture the relevant surface details and fluid motion. At the macro-scale, we propose a procedural method that can compute the velocities of rivers with curved banks, branchings and islands on the fly. At the meso-scale, we propose an improved feature-based simulation method to generate the crests of the quasi-stationary waves that obstacles make, together with a method for constructing an adaptive, feature-aligned water surface according to the given wave crests. At the micro-scale, we propose the use of wave sprites, a sprite-based texture model, to represent advected details with stationary spectrum properties on flow surfaces. Armed with wave sprites and a dynamic adaptive sampling scheme, we can texture the surface of a very large or even unbounded river with scene-independent performance. In addition, we propose a Lagrangian texture advection method that has applications beyond river animation. We demonstrate that combining our models at the three scales lets us incorporate visually convincing animated rivers into a very large terrain in real-time interactive applications.
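The bookkeeping behind sprite-based texture advection can be sketched in a few lines: sprites carrying texture detail are advected with the flow, and sprites that leave the visible domain are respawned so coverage stays stationary. The velocity field and domain below are invented toys, not the procedural macro-scale model of the thesis:

```python
import numpy as np

def river_velocity(p):
    """Toy procedural river velocity: fastest in mid-channel (y in [0, 1])."""
    x, y = p
    return np.array([1.0 + 4.0 * y * (1.0 - y), 0.0])

def advect_sprites(sprites, dt, length=10.0, rng=None):
    """Advect texture sprites with the flow; respawn those leaving the
    domain upstream so the surface stays covered."""
    rng = rng if rng is not None else np.random.default_rng(0)
    out = []
    for p in sprites:
        q = p + dt * river_velocity(p)                  # forward Euler advection
        if q[0] > length:                               # sprite left the river
            q = np.array([0.0, rng.uniform(0.0, 1.0)])  # respawn upstream
        out.append(q)
    return out

sprites = [np.array([9.9, 0.5]), np.array([1.0, 0.1])]
sprites = advect_sprites(sprites, dt=0.1)
print(sprites)
```

Because each sprite is updated independently from the local velocity, the cost scales with the number of visible sprites rather than with the size of the river, which is the scene-independent performance property claimed above.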
Diener, Julien. "Acquisition et génération du mouvement de plantes". Grenoble INPG, 2009. http://www.theses.fr/2009INPG0080.
Texto completo
The primary goal of my Ph.D. has been to develop innovative animation techniques for virtual plants. Two main approaches have been explored: the reproduction of real motion, and real-time simulation. First, I mixed vision algorithms and user interaction in order to extract reliable motion data from videos. An innovative method was then designed to estimate a valid hierarchical structure of branches using a statistical study, which is used to retarget the video movement onto a wide range of virtual plants. Second, I developed a new animation method that computes the response of plants to an interactively controllable wind. The main contribution has been to show that simple approximations of the wind load model lead to a drastic reduction of the run-time computations. The simulation algorithm takes full advantage of the parallel processing capabilities of graphics cards, allowing the animation of thousands of trees in real time.
Chen, Haifeng. "Système de simulation de spectres de masse assisté par ordinateur". Paris 7, 2003. http://www.theses.fr/2003PA077143.
Texto completo
Benchamma, Mérièm. "Réalisation d'un simulateur d'étude et de faisabilité pour la radiothérapie externe dynamique". Aix-Marseille 3, 1995. http://www.theses.fr/1995AIX30013.
Texto completo
Farissier, Pierre. "Etude d'un modèle cartographique adapté à la simulation des écoulements en rivières". Lyon 1, 1993. http://www.theses.fr/1993LYO10278.
Texto completo
Bonelli, Stéphane. "Contribution à la résolution de problèmes élastoplastiques de mécanique des sols et d'écoulements non saturés par la méthode des éléments finis". Aix-Marseille 2, 1993. http://www.theses.fr/1993AIX22038.
Texto completo
Reimeringer, Michael. "Une méthodologie et des outils pour concevoir en tenant compte de la simulation". Reims, 2009. http://www.theses.fr/2009REIMS001.
Texto completo
Simulation has become unavoidable thanks to the many advances made in models, software and hardware. The use of simulation tools offers many advantages: the study of alternative solutions, optimization of the product, reduction or elimination of physical prototypes, assessment of the manufacturing process, and reduction of cost and delay. Nowadays, however, design is often carried out without considering subsequent steps like simulation, even though this step is indispensable.
Chaillat, Stéphanie. "Méthode multipôle rapide pour les équations intégrales de frontière en élastodynamique 3-D : application à la propagation d’ondes sismiques". Paris Est, 2008. http://pastel.paristech.org/5233/01/these_chaillat.pdf.
Texto completo
Simulating wave propagation in 3D configurations is becoming a very active area of research. The main advantage of the boundary element method (BEM) is that only the domain boundaries are discretized; as a result, this method is well suited to dealing with unbounded domains. However, the standard BEM leads to fully-populated matrices, which results in high computational costs in CPU time and memory requirements. The Fast Multipole Method (FMM) has dramatically improved the capabilities of BEMs in many areas of application. In this thesis, the FMM is extended to 3D frequency-domain elastodynamics in homogeneous and piecewise-homogeneous media (using in the latter case an FMM-based BE-BE coupling). Improvements of the present FM-BEM are also presented: preconditioning, reduction of the number of moments, and the formulation of a multipole expansion for the half-space fundamental solutions. Seismological applications are given for canonical problems and the Grenoble valley case.
Charentenay, Julien de. "Simulation numérique d'écoulements réactifs instationnaires à faibles nombres de Mach". Châtenay-Malabry, Ecole centrale de Paris, 2002. http://www.theses.fr/2002ECAP0724.
Texto completo
Dubois, Jean-Luc. "L'abstraction fonctionnelle des parties contrôles des circuits pour l'accélération de simulateurs générés : une contribution au développement d'outils de C.A.O. de l'architecture matérielle". Lille 1, 1991. http://www.theses.fr/1991LIL10037.
Texto completo
Boyère, Emmanuel. "Contribution à la modélisation numérique thermomécanique tridimensionnelle du forgeage". Paris, ENMP, 1999. http://www.theses.fr/1999ENMP0915.
Texto completo
Jamme, Stéphane. "Étude de l'interaction entre une turbulence homogène isotrope et une onde de choc". Toulouse, INPT, 1998. http://www.theses.fr/1998INPT046H.
Texto completo
Albaki, Rachida. "Contribution à l'étude des propriétés dynamiques des métaux liquides simples par simulation numérique et modèles analytiques". Metz, 2002. http://docnum.univ-lorraine.fr/public/UPV-M/Theses/2002/Albaki.Rachida.SMZ0205.pdf.
Texto completo
Gassenbauer, Václav. "Illumination coherence for light transport simulation". Rennes 1, 2011. http://www.theses.fr/2011REN1S098.
Texto completo
Simulating the propagation of light in a scene is an essential task in realistic image synthesis. However, a correct simulation of light and its many bounces in the scene remains computationally expensive. First, we propose the spatial and directional radiance caching (SDRC) algorithm. The SDRC algorithm takes advantage of the fact that illumination varies smoothly on glossy surfaces. The illumination at a point of the scene is then computed by interpolating the indirect illumination known for a set of radiance samples that are close both in position and in direction. In the next part, we present an efficient and accurate local principal component analysis (LPCA) algorithm for reducing the dimension and compressing a large data set. To improve its efficiency, our new algorithm propagates information from one iteration to the next. By choosing a better initial seed for the cluster centroids in LPCA, the accuracy of the method is improved, yielding a better classification of the data. Finally, we describe work in progress on a method for interactively relighting an animated sequence while taking indirect illumination into account. The relighting problem is represented as a large 3D matrix describing the propagation of light in the scene over several frames of the sequence. An adaptive algorithm precomputes light propagation by exploiting potential coherence.
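The local PCA idea, alternating between assigning points to the cluster whose principal subspace reconstructs them best and refitting each cluster's subspace, can be sketched as below. This is a bare-bones illustration of the classical scheme on invented 2D data, not the accelerated LPCA algorithm of the thesis:

```python
import numpy as np

def local_pca(data, n_clusters=2, n_components=1, iters=10, seed=0):
    """Alternate cluster assignment (by subspace reconstruction error)
    and per-cluster PCA refit. Bare-bones sketch, invented toy setup."""
    rng = np.random.default_rng(seed)
    labels = rng.integers(0, n_clusters, len(data))
    for _ in range(iters):
        models = []
        for c in range(n_clusters):
            pts = data[labels == c]
            if len(pts) == 0:          # guard against an emptied cluster
                pts = data
            mean = pts.mean(axis=0)
            _, _, vt = np.linalg.svd(pts - mean, full_matrices=False)
            models.append((mean, vt[:n_components]))
        # Reassign each point to the subspace with least reconstruction error.
        errs = np.stack([
            np.linalg.norm((data - m) - (data - m) @ b.T @ b, axis=1)
            for m, b in models], axis=1)
        labels = errs.argmin(axis=1)
    return labels

# Two noisy line segments with different orientations and positions.
rng = np.random.default_rng(1)
t = rng.uniform(-1, 1, (100, 1))
seg_a = np.hstack([t, 0.02 * rng.normal(size=(100, 1))])
seg_b = np.hstack([0.02 * rng.normal(size=(100, 1)), t + 5.0])
labels = local_pca(np.vstack([seg_a, seg_b]))
print(np.bincount(labels, minlength=2))
```

Judging membership by reconstruction error rather than by distance to a centroid is what lets the method capture locally low-dimensional structure in the radiance data.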
Zhang, Xiao Hui. "Simulation avancée des circuits micro-ondes". Paris 12, 1989. http://www.theses.fr/1989PA120040.
Texto completo
Vettorel, Thomas. "Polymer crystallization studies by computer simulation". Université Louis Pasteur (Strasbourg) (1971-2008), 2005. https://publication-theses.unistra.fr/public/theses_doctorat/2005/VETTOREL_Thomas_2005.pdf.
Texto completo
Semi-crystalline polymers are of great interest for industrial purposes, and the complex structures they involve, as well as the mechanisms leading to the formation of crystals, make their study very challenging. We investigated polymer crystallization by computer simulation using different methods. An atomistically detailed model was used to reproduce the crystalline structure of short alkanes at low temperature, and continuous heating simulations gave rise to a transient phase that is well characterized in experiments. The same realistic model was used to simulate continuous cooling of the melt, but could not yield crystalline structures within a limited simulation time. In order to reproduce efficiently the characteristic features of semi-crystalline polymers, we used another simulation model which addresses larger length and time scales: this coarse-grained model allowed us to study the crystallization phenomenon in detail, with several order parameters to characterize the crystal and its time evolution. The structure factors of the high-temperature melt were also studied in detail, so as to determine the influence of the liquid-phase structure on crystal formation. These different studies yield a better understanding of the influence on crystallization of the various parameters entering the definition of the simulation models.
Brocail, Julien. "Analyse expérimentale et numérique du contact à l'interface outil-copeau lors de l'usinage à grande vitesse des métaux". Valenciennes, 2009. http://ged.univ-valenciennes.fr/nuxeo/site/esupversions/6c282378-ea86-4bf0-8c06-48498e37e0da.
Texto completoThe study relates to the characterization of the tool-chip interface during the high-speed machining of metals. The existing numerical approaches do not generate good correlations of the process variables, such as the cutting forces and the shape of the chip. Recent studies show that the determination of an interfacial law according to the contact parameters (contact pressure, sliding velocity and interfacial temperature) is necessary to describe more precisely the process parameters. Experiments were carried out on the upsetting sliding test that reproduces the mechanics and thermals contact conditions of the HSM process at the tool tip. This specific device has been adapted and the antagonists have been modified for this study. A friction law according to the contact pressure, the sliding velocity, and the interfacial temperature was defined for the tribological system AISI 1045 steel / uncoated carbide. This law implemented in a numerical model of orthogonal cutting (developed in Abaqus) offers interesting improvements
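A friction law of the kind described, a coefficient depending on contact pressure, sliding velocity and interfacial temperature, might look as follows. The functional form and every constant here are invented placeholders for illustration, not the law actually identified in the thesis:

```python
def interfacial_friction(pressure, velocity, temperature,
                         mu0=0.4, p_ref=1000.0, v_ref=1.0, t_melt=1500.0):
    """Illustrative friction coefficient decreasing with contact pressure,
    sliding velocity and interfacial temperature. All constants are
    hypothetical placeholders."""
    softening = max(0.0, 1.0 - temperature / t_melt)   # thermal softening term
    return mu0 * softening / ((1.0 + pressure / p_ref) * (1.0 + velocity / v_ref))

# Friction drops as the contact becomes hotter, faster and more heavily loaded.
print(interfacial_friction(pressure=500, velocity=2.0, temperature=300))    # mild contact
print(interfacial_friction(pressure=2000, velocity=10.0, temperature=1200)) # severe contact
```

In a cutting simulation such a function would be evaluated at each contact node from the local solution fields, which is what couples the friction model to the thermo-mechanical state of the interface.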
Mohamed, Kamel. "Simulation numérique en volume finis, de problèmes d'écoulements multidimensionnels raides, par un schéma de flux à deux pas". Paris 13, 2005. http://www.theses.fr/2005PA132020.
Texto completoThis thesis is devoted to the numerical simulation of stiff fluid flows, governed by sys¬tems of conservation laws with source terms (non homogeneous systems). Both one dimensional and two-dimensional configurations are considered. The numerical method used is an extension of the two steps flux scheme (SRNH), which depends on a local adjustable parameter aj+i and which has been proposed by professor F. Benkhaldoun in the one dimensional framework. In a first part of the work, aiming to extend the scheme to the two-dimensional case, we introduce an alternative scheme (SRNHR), which is obtained from SRNH by replacing the numerical velocity, by the local physical Rusanov velocity. Thereafter, the stability analysis of the scheme, shows that the new scheme can be of order 1 or 2 according to the value of the parameter 0j+1. A strategy of variation of this parameter, based on limiters theory was then adopted. The scheme can thus be turned to order 1 in the regions where the flow has a strong variation, and to order 2 in the regions where the flow is regular. After this step, we established the conditions so that this scheme respects the exact C-property introduced by Bermudez and Vazquez. A study of boundary conditions, adapted to this kind of two steps schemes, has also been carried out using the Riemann invariants. In the second part of the thesis, we applied this new scheme to homogeneous and non¬homogeneous monophasic systems. For example, we performed the numerical simulation of shallow water phenomena with bottom topography in both one and two dimensions. We also carried out a numerical convergence study by plotting the error curves. Finally, we used the scheme for the numerical simulation of two phase flow models (Ransom ID and 2D)
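The "order 1 where the flow varies strongly, order 2 where it is regular" strategy can be illustrated with a classical flux-limited finite-volume scheme for scalar advection. This is a textbook Sweby-type construction with a minmod limiter, shown for intuition only, not the SRNH/SRNHR scheme of the thesis:

```python
import numpy as np

def minmod_phi(r):
    """Minmod flux limiter: phi(r) = max(0, min(1, r)).
    phi -> 0 at sharp fronts (first-order upwind), phi -> 1 where smooth."""
    return np.maximum(0.0, np.minimum(1.0, r))

def limited_step(u, nu):
    """One forward-Euler step of a flux-limited finite-volume scheme for
    u_t + a u_x = 0 with a > 0 and CFL number nu = a*dt/dx (periodic grid)."""
    du = np.roll(u, -1) - u                      # u_{i+1} - u_i
    dm = u - np.roll(u, 1)                       # u_i - u_{i-1}
    with np.errstate(divide="ignore", invalid="ignore"):
        r = np.where(du != 0, dm / du, 0.0)      # local smoothness ratio
    # Interface flux (divided by a): upwind value plus limited correction.
    flux = u + 0.5 * (1 - nu) * minmod_phi(r) * du
    return u - nu * (flux - np.roll(flux, 1))

# Advect a square pulse once around a periodic domain.
n = 200
x = np.arange(n) / n
u = np.where((x > 0.4) & (x < 0.6), 1.0, 0.0)
nu = 0.4
for _ in range(int(n / nu)):                     # exactly one full period
    u = limited_step(u, nu)
print(u.min(), u.max())                          # solution stays within [0, 1]
```

The limiter plays the same role as the adjustable parameter of the SRNH scheme: it selects, cell by cell, between a robust low-order update near steep variations and an accurate high-order one in smooth regions, while the conservative flux form keeps the total mass exact.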
Dridi, Samia. "Essais de caractérisation des structures tissées". Lyon, INSA, 2010. http://theses.insa-lyon.fr/publication/2010ISAL0041/these.pdf.
Texto completoIn this work, we interested to the study and the modelling of mechanical behaviour of weave structure. We begin by presenting the properties of tested materials thanks to tests of characterization. An experimental data base is established to analyze the mechanical behaviour of fabric under some solicitations, in particular the shear, using the technique of digital image correlation. Then, by adopting a hyperelastic approach, a simplified model is developed allowing to study numerically the influence of the report of tensile and shearing rigidities on the mechanical behaviour of woven fabric, further to an extension in 45. Finally, by basing on phenomenological approach, a hyperelastic behaviour law is proposed. This model is implanted in a routine Vumat. It is identified from the tensile and the shearing tests and validated by certain cases of composite reinforcement forming
Barrero, Daniel. "Simulation et visualisation de phénomènes naturels pour la synthèse d'images". Toulouse 3, 2001. http://www.theses.fr/2001TOU30001.
Texto completo
Guilminot, Virginie. "La synthèse d'image animée : au-delà d'une simulation réaliste". Paris 8, 1996. http://www.theses.fr/1996PA081159.
Texto completo
A majority of 3D films and computer graphics respect a realistic aesthetic. On the one hand, I analyse why such entirely synthetic images imitate film or reality; on the other hand, I try to offer alternatives in order to create another, non-realistic aesthetic. Software relies on numerous scientific formulas; therefore the author, while manipulating it, is the main actor able to change the situation. This is the main point. Realism exists in every 3D application: research, television, cinema, fiction, education and even, often, creation. A few artists have managed to escape from realism, e.g. Joan Stavely, Tamas Waliczky or Michel Bret. Each has their own way of working, but they all aim at the same thing: not to reach realistic simulation. I agree with this position and, to illustrate my thinking, I made three 3D films guided more by sensitivity than by technical challenge. It is possible to avoid the trap of realism: indeed, by manipulating and diverting the software, one can obtain different kinds of 3D computer graphics. Step by step, a new approach to handling and developing tools appears, and thanks to it, authors can create films other than realistic ones.