Dissertations / Theses on the topic 'Bloc modèle'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 dissertations / theses for your research on the topic 'Bloc modèle.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Browse dissertations / theses from a wide variety of disciplines and organise your bibliography correctly.
Voiglio, Eric Joseph. "Conservation aérobie des organes : développement d'un modèle de bloc multi-viscéral pour l'étude d'une émulsion de fluorocarbure." Lyon 1, 2000. http://www.theses.fr/2000LYO1T124.
Bonvoisin, Frédéric. "Evaluation de la performance des blocs opératoires : du modèle aux indicateurs." PhD thesis, Université de Valenciennes et du Hainaut-Cambresis, 2011. http://tel.archives-ouvertes.fr/tel-00626150.
Cueff, Guillaume. "Développement d'un modèle thermomécanique du comportement sous agressions thermiques de matériaux cellulosiques : application à l'étude de résistance au feu de panneaux de bloc-porte en aggloméré de bois." Thesis, Bordeaux, 2014. http://www.theses.fr/2014BORD0360/document.
In the context of fire safety, industrial products used in the building construction market have to satisfy standard fire resistance tests. In particular, a temperature criterion must be satisfied on the unexposed side of the product, and attention should be given to the deformation of the product during fire. These tests are restrictive and costly for manufacturers, which can slow down their R&D programs. In this context, a research program was initiated by the company EFECTIS France in collaboration with the I2M laboratory of the University of Bordeaux, whose main objective is to develop a numerical thermomechanical model for simulating a fire resistance test (virtual furnace) on a fire door composed of wood and wood-based materials (particle and fibre boards). The thermomechanical model takes into account the variation of thermal and mechanical properties as a function of vaporization and pyrolysis reactions. The energy impacts of those reactions are also included in the model. The numerical model relies on experimental data to complete the material properties needed for its use. To achieve this, different experimental programs were carried out, in particular measurements using digital image correlation. Based on the simulated temperature field and an estimation of the global bending of the fire door, the model allows the fire performance of the product to be evaluated.
Makke, Ali. "Mechanical properties of homogenous polymers and block copolymers : a molecular dynamics simulation approach." Thesis, Lyon 1, 2011. http://www.theses.fr/2011LYO10067/document.
We use molecular dynamics simulation of a coarse-grained model to investigate the mechanical properties of homogeneous polymers and lamellar block copolymers. Polymer samples have been generated using the "radical-like polymerisation" method. These samples were submitted to uniaxial and triaxial tensile tests in order to study their mechanical responses. First we compare two tensile test methods: the "homogeneous deformation method" and the "boundary-driven deformation method". We find that the two methods lead to similar results at low strain rate. The change of the entanglement network in a polymer sample undergoing tensile deformation was investigated. We found that the sample exhibits an increase of its entanglement length in the uniaxial deformation test compared to the triaxial one. This finding was interpreted in terms of the pronounced chain disentanglement observed in the uniaxial deformation test due to the lateral relaxation of the sample. Cavity nucleation in amorphous polymers has also been studied. We found that cavities nucleate preferentially in zones that exhibit a low elastic bulk modulus. These zones can be identified from the initial undeformed state of the sample at low temperature (T~0K). The second part of the work focused on the simulation of the mechanical response of block copolymers. The influence of chain architecture on the mechanical properties was investigated: our findings reveal an important role of the bridging molecules (cilia chains and knotted loop chains) in the stress transmission between phases at high strain. The initiation of plasticity in copolymer samples was also studied. The role of buckling was found to be determinant in the mechanical response of the sample. The dependence of the buckling instability on the sample size and the deformation rate was investigated. We found that the fundamental (first) mode of buckling develops at relatively low strain rates, whereas at high strain rates the sample buckles in the second or a higher mode. A new model that takes the buckling kinetics into account was developed to describe this competition between the buckling modes.
Moretti, Isabelle. "Modélisation de l'extension intracontinentale : exemple du Golfe de Suez." Paris 11, 1987. http://www.theses.fr/1987PA112056.
Afdideh, Fardin. "Block-sparse models in multi-modality : application to the inverse model in EEG/MEG." Thesis, Université Grenoble Alpes (ComUE), 2018. http://www.theses.fr/2018GREAT074/document.
Three main challenges have been addressed in this thesis, in three chapters. The first challenge concerns the ineffectiveness of some classic methods in high-dimensional problems. This challenge is partially addressed through the idea of clustering the coherent parts of a dictionary based on the proposed characterisation, in order to create more incoherent atomic entities in the dictionary, which is proposed as a block structure identification framework. The more incoherent the atomic entities, the more the exact recovery conditions improve. In addition, we applied this clustering idea to real-world EEG/MEG leadfields to segment the brain source space, without using any information about the brain source activity or the EEG/MEG signals. The second challenge arises when classic recovery conditions cannot be established for the new kind of constraint, i.e., block-sparsity. Therefore, as the second research orientation, we developed a general framework for block-sparse exact recovery conditions, i.e., four theoretical and one algorithm-dependent condition, which ensure the uniqueness of the block-sparse solution of the corresponding weighted mixed-norm optimisation problem in an underdetermined system of linear equations. The generality of the framework is in terms of the properties of the underdetermined system of linear equations, the extracted dictionary characterisations, the optimisation problems, and ultimately the recovery conditions. Finally, the combination of different pieces of information about the same phenomenon is the subject of the third challenge, which is addressed in the last part of the dissertation with application to brain source space segmentation. More precisely, we showed that by combining the EEG and MEG leadfields and thus capturing the electromagnetic properties of the head, more refined brain regions appeared.
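To make the weighted mixed-norm optimisation mentioned in this abstract concrete, the following short sketch (our own illustration, not code from the thesis; the problem sizes, the weights and the proximal-gradient solver are assumptions) recovers a block-sparse vector from an underdetermined linear system by group soft-thresholding:

```python
# Illustrative sketch only (not code from the thesis): block-sparse recovery in an
# underdetermined system y = A x by minimising
#     0.5 * ||A x - y||_2^2 + lam * sum_g w_g * ||x_g||_2
# with proximal gradient descent; the prox of the weighted mixed norm is group
# soft-thresholding.
import numpy as np

rng = np.random.default_rng(0)
n, p, block = 40, 120, 10                       # measurements, unknowns, block size
blocks = [np.arange(i, i + block) for i in range(0, p, block)]

A = rng.standard_normal((n, p)) / np.sqrt(n)
x_true = np.zeros(p)
x_true[blocks[2]] = rng.standard_normal(block)  # a single active block
y = A @ x_true

lam = 0.1
w = np.ones(len(blocks))                        # per-block weights
step = 1.0 / np.linalg.norm(A, 2) ** 2          # 1 / Lipschitz constant of the gradient
x = np.zeros(p)
for _ in range(500):
    z = x - step * (A.T @ (A @ x - y))          # gradient step on the data-fit term
    for g, wg in zip(blocks, w):                # group soft-thresholding
        norm_g = np.linalg.norm(z[g])
        x[g] = max(0.0, 1.0 - step * lam * wg / norm_g) * z[g] if norm_g > 0 else 0.0

active = [i for i, g in enumerate(blocks) if np.linalg.norm(x[g]) > 1e-2]
print("blocks with non-negligible energy in the estimate:", active)
```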
Laclau, Charlotte. "Hard and fuzzy block clustering algorithms for high dimensional data." Thesis, Sorbonne Paris Cité, 2016. http://www.theses.fr/2016USPCB014.
With the increasing amount of data available, unsupervised learning has become an important tool used to discover underlying patterns without the need to label instances manually. Among the different approaches proposed to tackle this problem, clustering is arguably the most popular one. Clustering is usually based on the assumption that each group, also called a cluster, is distributed around a center defined in terms of all the features, while in some real-world applications dealing with high-dimensional data this assumption may be false. To this end, co-clustering algorithms were proposed to describe clusters by the subsets of features that are most relevant to them. The resulting latent structure of the data is composed of blocks usually called co-clusters. In the first two chapters, we describe two co-clustering methods that proceed by differentiating the relevance of features, calculated with respect to their capability of revealing the latent structure of the data, in both a probabilistic and a distance-based framework. The probabilistic approach uses the mixture model framework, where the irrelevant features are assumed to have a different probability distribution that is independent of the co-clustering structure. On the other hand, the distance-based (also called metric-based) approach relies on an adaptive metric where each variable is assigned a weight that defines its contribution to the resulting co-clustering. From the theoretical point of view, we show the global convergence of the proposed algorithms using Zangwill's convergence theorem. In the last two chapters, we consider a special case of co-clustering where, contrary to the original setting, each subset of instances is described by a unique subset of features, resulting in a diagonal structure of the initial data matrix. As for the first two contributions, we consider both probabilistic and metric-based approaches. The main idea of the proposed contributions is to impose two different kinds of constraints: (1) we fix the number of row clusters to be equal to the number of column clusters; (2) we seek a structure of the original data matrix that has the maximum values on its diagonal (for instance, for binary data, we look for diagonal blocks composed of ones with zeros outside the main diagonal). The proposed approaches enjoy the convergence guarantees derived from the results of the previous chapters. Finally, we present both hard and fuzzy versions of the proposed algorithms. We evaluate our contributions on a wide variety of synthetic and real-world benchmark binary and continuous data sets related to text mining applications and analyze the advantages and drawbacks of each approach. To conclude, we believe that this thesis explicitly covers a vast majority of the possible scenarios arising in hard and fuzzy co-clustering and can be seen as a generalization of some popular biclustering approaches.
Mohamed, Ahmed Abdeldayem. "Behavior, strength and flexural stiffness of circular concrete columns reinforced with FRP bars and spirals/hoops under eccentric loading." Thèse, Université de Sherbrooke, 2017. http://hdl.handle.net/11143/11406.
The deterioration of concrete structures reinforced with steel bars can be observed daily in regions with aggressive climates. Internal reinforcement with fibre-reinforced polymers (FRP) has demonstrated its feasibility in various structural elements in civil engineering. Current design guidelines for FRP-reinforced concrete structures in North America and Europe do not yet cover sections subjected to eccentric axial loads, owing to the lack of research and experience. This research enlarges the experimental database and establishes in-depth analyses and design recommendations for circular concrete columns fully reinforced with FRP (bars and spirals). Full-scale columns were tested under monotonic loading with different levels of eccentricity. The test variables included the eccentricity-to-diameter ratio (e/D); the type of reinforcement (GFRP and CFRP compared with steel); the compressive strength of the concrete; the longitudinal and transverse reinforcement ratios; and the configuration of the confining reinforcement. All specimens were 305 mm in diameter and 1500 mm high. The test results indicated that the specimens reinforced with glass FRP or carbon FRP reached their maximum strength without damage to the reinforcing bars. Of the two types of reinforcement, the CFRP specimens behaved very similarly to their steel counterparts and reached almost the same axial strengths. However, the specimens with GFRP reinforcement exhibited reduced stiffness and lower nominal axial forces than their steel or CFRP counterparts. The failure mode of the CFRP and GFRP specimens was dominated by concrete crushing at low eccentricity levels (e/D ratios of 8.2% and 16.4%). The results revealed that the GFRP bars developed high levels of strain and stress on the compression and tension faces and, consequently, the GFRP specimens could sustain a constant axial load beyond the ultimate strength for some time, up to the compressive failure of the core concrete, at higher eccentricity levels (e/D ratios of 8.2% and 16.4%), which helps delay degradation. At these levels, tension failure initiated in the GFRP specimens, resulting in large axial and lateral deformations and cracks on the tension face until compressive failure of the concrete. The failure of the CFRP specimens at higher eccentricity levels (e/D ratios of 8.2% and 16.4%) was characterised as concrete compression failure, which proceeded in a less brittle manner. This research also included several studies to analyse the test results, evaluate the effectiveness of the reinforcing bars, and provide recommendations for analysis and design. It was thus shown that the axial and flexural capacities of the tested FRP specimens could be reasonably predicted using a plane-section analysis with the equivalent rectangular stress block (ERSB) parameters given in ACI 440.1R-15 or CSA S806-12.
All predictions underestimated the actual strength, with conservative variability levels between 1.05 and 1.25 for the CFRP specimens and between 1.20 and 1.40 for the CFRP specimens. These levels were markedly reduced towards critical limits in the specimens with high-strength concrete. A thorough review was carried out of the ERSB parameters available in current steel and FRP design standards and guidelines. Modified ERSB expressions based on those provided in ACI 440.1R-15 and CSA S806-12 were developed. The results indicate a good correlation between predicted and measured strength values, with increased levels of conservatism. The contribution of the compressive strength of the FRP reinforcement was carefully examined and discussed. The minimum GFRP and CFRP reinforcement ratios needed to avoid reinforcement failure were examined extensively. Finally, the flexural stiffness (EI) of the tested specimens was determined analytically and compared with expressions available in the literature using the experimental and analytical moment-curvature (M-ψ) responses. Modified flexural stiffness (EI) expressions based on ACI 440.1R were developed and validated.
Brault, Vincent. "Estimation et sélection de modèle pour le modèle des blocs latents." Thesis, Paris 11, 2014. http://www.theses.fr/2014PA112238/document.
Classification aims at partitioning data sets into homogeneous subsets; the observations in a class are more similar to each other than to the observations of other classes. The problem is compounded when the statistician wants to obtain a cross-classification of the individuals and the variables. The latent block model uses one distribution for each crossing of an object class and a variable class, and the observations are assumed to be independent conditionally on the choice of these classes. However, factorizing the joint distribution of the labels is impossible, obstructing the calculation of the log-likelihood and the use of the EM algorithm. Several methods and criteria exist to find these partitions, some frequentist, some Bayesian, some stochastic... In this thesis, we first propose sufficient conditions for the identifiability of the model. In a second step, we study two algorithms proposed to circumvent the problem of the EM algorithm: the VEM algorithm (Govaert and Nadif (2008)) and the SEM-Gibbs algorithm (Keribin, Celeux and Govaert (2010)). In particular, we analyze the combination of both and highlight why the algorithms degenerate (a term used to say that they return empty classes). By choosing the priors wisely, we then propose a Bayesian adaptation to limit this phenomenon. In particular, we use a Gibbs sampler and propose a stopping criterion based on the Brooks-Gelman statistic (1998). We also propose an adaptation of the Largest Gaps algorithm (Channarond et al. (2012)). Following their proofs, we show that the label and parameter estimators obtained are consistent when the numbers of rows and columns tend to infinity. Furthermore, we propose a method to select the number of row and column classes, whose estimate is also consistent when the numbers of rows and columns are very large. To estimate the number of classes, we study the ICL criterion (Integrated Completed Likelihood), for which we propose an exact form. After studying its asymptotic approximation, we propose a BIC criterion (Bayesian Information Criterion) and conjecture that the two criteria select the same results and that these estimates are consistent; this conjecture is supported by theoretical and empirical results. Finally, we compare the different combinations and propose a methodology for co-clustering.
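As an illustration of the latent block model discussed in this abstract, the following sketch (our own toy example; the class proportions and connection probabilities are made-up values, and no estimation is performed) simulates a binary data matrix whose cells depend only on latent row and column classes:

```python
# Toy illustration of a binary latent block model (simulation only, no inference):
# rows and columns receive independent latent classes, and each cell is Bernoulli
# with a probability that depends only on the pair of classes.
import numpy as np

rng = np.random.default_rng(1)
n_rows, n_cols, K, L = 200, 150, 3, 2          # assumed dimensions and class numbers
rho = np.array([0.5, 0.3, 0.2])                # row-class proportions
tau = np.array([0.6, 0.4])                     # column-class proportions
pi = np.array([[0.05, 0.80],                   # pi[k, l] = P(X_ij = 1 | z_i = k, w_j = l)
               [0.70, 0.10],
               [0.30, 0.30]])

z = rng.choice(K, size=n_rows, p=rho)          # latent row labels
w = rng.choice(L, size=n_cols, p=tau)          # latent column labels
X = rng.binomial(1, pi[z][:, w])               # cells independent given (z, w)

# Reordering rows and columns by their latent labels makes the co-clusters visible
# as homogeneous blocks in the data matrix.
X_sorted = X[np.argsort(z)][:, np.argsort(w)]
print(X.shape, X_sorted[:5, :5])
```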
Zheng, Wenjie. "A distributed Frank-Wolfe framework for trace norm minimization via the bulk synchronous parallel model." Thesis, Sorbonne université, 2018. http://www.theses.fr/2018SORUS049/document.
Learning low-rank matrices is a problem of great importance in statistics, machine learning, computer vision, recommender systems, etc. Because of its NP-hard nature, a principled approach is to solve its tightest convex relaxation: trace norm minimization. Among the various algorithms capable of solving this optimization is the Frank-Wolfe method, which is particularly suitable for high-dimensional matrices. In preparation for the use of distributed infrastructures to further accelerate the computation, this study explores the possibility of executing the Frank-Wolfe algorithm in a star network with the Bulk Synchronous Parallel (BSP) model and investigates its efficiency both theoretically and empirically. On the theoretical side, this study revisits Frank-Wolfe's fundamental deterministic sublinear convergence rate and extends it to nondeterministic cases. In particular, it shows that with the linear subproblem appropriately solved, Frank-Wolfe can achieve a sublinear convergence rate both in expectation and with high probability. This contribution lays the theoretical foundation for using power iteration or Lanczos iteration to solve the linear subproblem in trace norm minimization. On the algorithmic side, within the BSP model, this study proposes and analyzes four strategies for the linear subproblem as well as methods for the line search. Moreover, exploiting Frank-Wolfe's rank-1 update property, it updates the gradient recursively, with either a dense or a low-rank representation, instead of repeatedly recalculating it from scratch. All of these designs are generic and apply to any distributed infrastructure compatible with the BSP model. On the empirical side, this study tests the proposed algorithmic designs on an Apache Spark cluster. According to the experimental results, for the linear subproblem, centralizing the gradient or averaging the singular vectors is sufficient in the low-dimensional case, whereas distributed power iteration, with as few as one or two iterations per epoch, excels in the high-dimensional case. The Python package developed for the experiments is modular, extensible and ready to deploy in an industrial context. This study has achieved its purpose as a proof of concept. Following the path it sets out, solvers can be implemented for various infrastructures, among them GPU clusters, to solve practical problems in specific contexts. Besides, its excellent performance on the ImageNet dataset makes it promising for deep learning.
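The following sketch illustrates the Frank-Wolfe iteration for trace-norm-constrained problems described above, on a small dense matrix-completion toy problem (our own example: the problem sizes, the oracle choice of the constraint radius and the plain power iteration are assumptions, and it is not the distributed BSP implementation of the thesis):

```python
# Toy illustration: Frank-Wolfe over the trace-norm ball {X : ||X||_* <= delta} for
# matrix completion.  Each linear subproblem is solved by the leading singular pair
# of the negative gradient (plain power iteration), which gives a rank-1 update.
import numpy as np

rng = np.random.default_rng(2)
m, n = 50, 40
M = rng.standard_normal((m, 2)) @ rng.standard_normal((2, n))  # hidden rank-2 matrix
mask = rng.random((m, n)) < 0.3                                 # observed entries
delta = np.linalg.norm(M, "nuc")                                # oracle radius, toy only

def top_singular_pair(G, iters=50):
    v = rng.standard_normal(G.shape[1])
    for _ in range(iters):                     # power iteration on G^T G
        v = G.T @ (G @ v)
        v /= np.linalg.norm(v)
    u = G @ v
    return u / np.linalg.norm(u), v

X = np.zeros((m, n))
for t in range(300):
    grad = (X - M) * mask                      # gradient of 0.5 * ||P_Omega(X - M)||_F^2
    u, v = top_singular_pair(-grad)
    S = delta * np.outer(u, v)                 # vertex of the trace-norm ball
    gamma = 2.0 / (t + 2.0)                    # standard Frank-Wolfe step size
    X = (1 - gamma) * X + gamma * S            # rank-1 update of the iterate

print("RMSE on unobserved entries:", np.sqrt(((X - M)[~mask] ** 2).mean()))
```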
Rodrigues, Laura. "Nanoparticules polymères ciblant le récepteur CXCR3 : élaboration et évaluation sur modèles de tumeur." Thesis, Bordeaux, 2018. http://www.theses.fr/2018BORD0104/document.
This thesis deals with the elaboration of polymeric nanoparticles functionalized with the ligand SCH546738 to target the CXCR3 receptor overexpressed in healthy or tumoral human cells. The synthesis of poly(trimethylene carbonate)-b-poly(ethylene glycol) (PTMC-b-PEG) block copolymers and of PTMC-b-PEG-SCH546738, then their self-assembly at different ratios in water, and finally the in vitro biological activity of these different nanoparticles were studied. A series of PTMC-b-PEG with different hydrophilic mass fractions f (between 34 and 6%) were obtained by ring-opening polymerization (ROP) of trimethylene carbonate (TMC) initiated by a PEG block (MW: 2000 g/mol). Self-assembly studies showed that the hydrophilic mass fraction was related to the morphology of the nano-objects (micelles and vesicles) and that the size and morphology of the nano-objects can be changed by the self-assembly protocol. PTMC-b-PEG-SCH546738 was obtained by convergent coupling between PEG-SCH546738 and the PTMC block. The co-self-assembly of functionalized and non-functionalized copolymers was done by nanoprecipitation controlled by a microfluidic system that allows monodisperse polymersomes with controlled size to be produced. The molar percentage of SCH546738 at the surface of the polymersomes was fixed at 5, 10 and 20%, and, together with the control nanoparticle, these samples were tested in vitro on HEK 293 and U87 cells overexpressing CXCR3-A. The influence of the ligand and its percentage on nanoparticle internalization and on the blocking of signaling pathways in cells was analyzed.
Arenas Sanguineti, Claudio. "Spécification et estimation du bloc financier d'un modèle macroéconomique trimestriel." Paris 1, 1986. http://www.theses.fr/1986PA010066.
Turmel, Dominique. "Analyse des chutes de bloc dans le domaine subaquatique." Thesis, Université Laval, 2008. http://www.theses.ulaval.ca/2008/25529/25529.pdf.
Full textTabouy, Timothée. "Impact de l’échantillonnage sur l’inférence de structures dans les réseaux : application aux réseaux d’échanges de graines et à l’écologie." Thesis, Université Paris-Saclay (ComUE), 2019. http://www.theses.fr/2019SACLS289/document.
In this thesis we are interested in studying the stochastic block model (SBM) in the presence of missing data. We propose a classification of missing data into two categories, Missing At Random and Not Missing At Random, for latent variable models, following the framework described by D. Rubin. In addition, we focus on describing several network sampling strategies and their distributions. The inference of SBMs with missing data is carried out through an adaptation of the EM algorithm: the EM algorithm with variational approximation. The identifiability of several of the SBM variants with missing data is demonstrated, as well as the consistency and asymptotic normality of the maximum likelihood estimators and of the variational approximation estimators in the case where each dyad (pair of nodes) is sampled independently and with equal probability. We also look at SBMs with covariates, their inference in the presence of missing data, and how to proceed when covariates are not available to conduct the inference. Finally, all our methods are implemented in an R package available on CRAN, together with complete documentation on its use.
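As a small illustration of the sampling designs discussed in this abstract, the sketch below (our own example with made-up parameters) draws a graph from a stochastic block model and then observes each dyad independently with a fixed probability, the simplest missing-at-random design:

```python
# Toy illustration (simulation only): a binary stochastic block model in which each
# dyad (pair of nodes) is observed independently with probability p_obs.
import numpy as np

rng = np.random.default_rng(3)
n, K = 100, 3
alpha = np.array([0.4, 0.35, 0.25])            # class proportions
Pi = np.array([[0.25, 0.03, 0.03],             # Pi[k, l] = P(edge | classes k, l)
               [0.03, 0.20, 0.03],
               [0.03, 0.03, 0.30]])

z = rng.choice(K, size=n, p=alpha)             # latent node memberships
A = rng.binomial(1, Pi[z][:, z])
A = np.triu(A, 1); A = A + A.T                 # undirected graph, no self-loops

p_obs = 0.7
obs = np.triu(rng.random((n, n)) < p_obs, 1)
obs = obs | obs.T                              # dyad-level MAR sampling mask
A_obs = np.where(obs, A, -1)                   # -1 marks unobserved dyads

print("fraction of observed dyads:", obs[np.triu_indices(n, 1)].mean())
```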
Wang, Xiaopei. "Multi-Way Block Models." University of Cincinnati / OhioLINK, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1342716695.
Tami, Myriam. "Approche EM pour modèles multi-blocs à facteurs à une équation structurelle." Thesis, Montpellier, 2016. http://www.theses.fr/2016MONTT303/document.
Structural equation models enable the modeling of interactions between observed variables and latent ones. The two leading estimation methods are partial least squares on components and covariance-structure analysis. In this work, we first describe the PLS and LISREL methods and then propose an estimation method using the EM algorithm in order to maximize the likelihood of a structural equation model with latent factors. Through a simulation study, we investigate how fast and accurate the method is, and through an application to real environmental data, we show how one can handily construct a model or evaluate its quality. Finally, in the context of oncology, we apply the EM approach to health-related quality-of-life data. We show that it simplifies the longitudinal analysis of quality of life and helps evaluate the clinical benefit of a treatment.
Kabajulizi, Julian. "The cost of bypassing MFN obligations through GSP schemes: EU-India GSP case and its implications for developing countries." Thesis, University of the Western Cape, 2005. http://etd.uwc.ac.za/index.php?module=etd&.
Full textMekkaoui, Imen. "Analyse numérique des équations de Bloch-Torrey." Thesis, Lyon, 2016. http://www.theses.fr/2016LYSEI120/document.
Diffusion magnetic resonance imaging (dMRI) is a non-invasive technique allowing access to the structural information of biological tissues through the study of the diffusion motion of water molecules in tissues. Its applications are numerous in neurology, especially for the diagnosis of certain brain abnormalities and for the study of human cerebral white matter. However, due to cardiac motion, the use of this technique to study the architecture of the in vivo human heart represents a great challenge. Cardiac motion has been identified as a major source of signal loss. Because of this sensitivity to motion, it is difficult to assess to what extent the diffusion characteristics obtained from diffusion MRI reflect the real properties of the cardiac tissue. In this context, modelling and numerical simulation of the diffusion MRI signal offer an alternative approach to address the problem. The objective of this thesis is to study numerically the influence of cardiac motion on diffusion images and to focus on the issue of attenuating the effect of cardiac motion on the diffusion MRI signal. The first chapter of this thesis is devoted to the introduction of the physical principle of nuclear magnetic resonance (NMR) and image reconstruction techniques in MRI. The second chapter presents the principle of diffusion MRI and summarizes the state of the art of the various models proposed in the literature to model the diffusion MRI signal. In the third chapter a modified model of the Bloch-Torrey equation in a domain that deforms over time is introduced and studied. This model represents a generalization of the Bloch-Torrey equation used to model the diffusion MRI signal in the case of static organs. In the fourth chapter, the influence of cardiac motion on the diffusion MRI signal is investigated numerically by using the modified Bloch-Torrey equation and an analytical motion model mimicking a realistic deformation of the heart. The numerical study reported here can quantify the effect of motion on the diffusion measurement depending on the type of diffusion encoding sequence. The results obtained allow us to classify the diffusion encoding sequences in terms of sensitivity to cardiac motion and to identify for each sequence a temporal window in the cardiac cycle in which the influence of motion is reduced. Finally, in the fifth chapter, a motion correction method is presented to minimize the effect of cardiac motion on the diffusion images. This method is based on a singular expansion of the modified Bloch-Torrey model in order to obtain an asymptotic ordinary-differential-equation model that gives a relationship between the true diffusion and the diffusion reconstructed in the presence of motion. This relationship is then used to solve the inverse problem of recovering and correcting the diffusion affected by cardiac motion.
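For reference, the classical static-domain Bloch-Torrey equation that this work generalises to a time-deforming domain can be written as follows (our transcription of the standard equation, not a formula quoted from the thesis):

\partial_t M(\mathbf{x},t) = -\, i\,\gamma\,\big(\mathbf{g}(t)\cdot\mathbf{x}\big)\, M(\mathbf{x},t) + \nabla\cdot\big(D(\mathbf{x})\,\nabla M(\mathbf{x},t)\big),

where M is the complex transverse magnetisation, γ the gyromagnetic ratio, g(t) the applied diffusion-encoding gradient and D the diffusion tensor; the thesis rewrites this balance on a domain that moves with the cardiac deformation.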
Rochdi, Youssef. "Identification de systèmes non linéaires blocs." PhD thesis, Université de Caen, 2006. http://tel.archives-ouvertes.fr/tel-00261896.
The last part of the thesis focuses on the identification of Wiener systems whose nonlinear element is not assumed to be invertible. To this end, we present two frequency-domain identification schemes and establish their consistency under the same conditions as before regarding the disturbances. The requirement of persistent excitation plays a central role in this thesis. To provide this property to the various identification schemes proposed, a family of impulse-type excitation signals is used. In this framework, a technical lemma is developed which specifies, for linear systems, the link between this family of signals and the persistent-excitation property. The adaptation of this lemma to the case of nonlinear systems is illustrated in the various identification schemes.
Zhang, Xusheng. "Mesoscopic models of block copolymer rheology." Thesis, McGill University, 2011. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=96823.
We develop a theoretical framework at the mesoscopic scale to study the viscoelastic response of block copolymers near the order-disorder transition. We use this theory to study orientation selection in lamellar phases of block copolymers subjected to oscillatory shear. We examine the hydrodynamic effects on the relaxation of the lamellar phases and include the anisotropic viscous stresses arising from the uniaxial nature of the phases. We also introduce viscoelastic network effects modelling chain entanglement, in an approach consistent with the symmetry of the phases. A numerical algorithm with a parallel implementation was developed to solve the governing equations. Simple cases involving diffusive relaxation of the order parameter were examined and used to verify the numerical code. We also address the question of spontaneous orientation selection from an initially disordered state under imposed oscillatory shear. In the absence of hydrodynamic interaction, we observe that the so-called parallel orientation is selected at low shear frequencies and amplitudes, whereas the perpendicular orientation is adopted at high frequencies and amplitudes. Hydrodynamic effects shift the transition region. We also examined the effect of network entanglement at finite frequencies. We find that network entanglement leads to faster alignment and that the anisotropic network stresses can significantly influence the orientation-selection process.
Mikail, Abud Filho Régis. "Littérature et religion. Le modèle hagiographique chez Flaubert, Bloy et Huysmans." Thesis, Paris 4, 2017. http://www.theses.fr/2017PA040019.
The crisis the Catholic Church went through during the 19th century spread into the literary universe. The relation between faith and aesthetics establishes itself in a different way from the Catholic novelists of the first half of the century: the exemplary transmission of faith through religious discourse mingles with the representation of this transmission itself through hagiographic literature. This literary and aesthetic renewal puts both fiction and hagiography into perspective. For instance, in the works of Flaubert, Bloy and Huysmans, a sanctity claiming to be both primitive and terrifying disrupts a Catholic art accused of sentimentalism. Moreover, narratives of hagiographical inspiration, intention or subversion question the literary representations of naturalism and decadentism. The rewriting of sanctity parallels the transformations which affect the novel, while fictional characters are more closely represented in the manner of saints. The character as a saint and the saint as a character lie somewhere in an indefinite land between the narrative of the novel and the hagiographic narrative. The faith of writers such as Bloy and Huysmans calls for reflections on discursive subversion, characteristic of literary discourse: can faith be subversive in spite of its intentions? Conversely, a novel structured on the medieval hagiographic models of the Legend of Saint Julien the Hospitaller reveals that a certain respect for the form does not necessarily imply professing one's faith.
Galéra, Cyril. "Construction de blocs géographiques cohérents dans GOCAD." Vandoeuvre-les-Nancy, INPL, 2002. http://www.theses.fr/2002INPLA05N.
Building a 3D geological model is a complicated task, carried out from poor data, and it leaves a large space to the geologist's interpretation. It can be checked with restoration methods, which consist in unfolding the model in order to see whether it seems credible before deformation. Nevertheless, the efficiency of this method in 3D is subject to caution. Another approach, presented in this thesis and implemented in the GOCAD geomodeller, consists in directly building new horizons based on the most reliable data and the supposed deformation, namely simple shear or flexural slip. Thanks to these tools, the geologist is thus able to complete the model and check its coherency. The developability, or unfoldability, of the horizons has also been studied in this work in order to implement new unfolding methods and artefact corrections.
Stirzaker, Kim E. (Kim Elizabeth). "Structure and Form in Two Late Works for Flute and Orchestra by Ernest Bloch (1880-1959): Suite Modale (1956) and Two Last Poems (Maybe. . .) (1958) -- a Lecture Recital, Together With Three Recitals of Selected Works of J.S. Bach, Jolivet, Mozart,and Others." Thesis, University of North Texas, 1992. https://digital.library.unt.edu/ark:/67531/metadc278196/.
Song, Li. "Piecewise models for long memory time series." Paris 11, 2010. http://www.theses.fr/2010PA112128.
Full textThere are many studies on stationary processes exhibiting long range dependence (LRD) and on piecewise models involving structural changes. But the literature on structural breaks in LRD models is relatively sparse because structural changes and LRD are easily confused. Some works consider the case where only some coefficients in a LRD model are allowed to change. Ln this thesis, we consider a non-stationary LRD parametric model, namely the piecewise fractional autoregressive integrated moving-average (FARlMA) model. It is a pure structural change model inwhich the nurnber and the locations of break points (BPs) as well as the ARMA orders and the corresponding coefficients are allowed to change between two regimes. Two methods are proposed to estimate the parameters of this model. The first one is to optimize a criterion based on the minimum description length (MDL) principle. We show that this criterion outperforms the Bayesian information criterion and another MDL based criterion proposed in the literature. Since the search space is huge, the practical optimization of our criterion is a complicated task and we design an automatic methodology based on a genetic algorithm. The second method is designed for very long time series, like Internet traffic data. Ln such cases, the minimisation of the criterion based on MDL is very difficult. We propose a method based on the differences between parameter estimations of different blocks of data to fit the piecewise FARIMA model. This method consists in a four-step procedure. Ln Step 1, we fit a stationary FARiMA model to the whole series. Local parameter estimates are obtained in Step 2. Ln Step 3, for all possible BP numbers, we select the intervals with a BP, we estimate the BP locations and we estimate the parameters of each stationary block. Lastly, Step 4 concerns the selection of the BP number using the sum of squared residuals of the different fitted piecewise models. The effectiveness of the two methods proposed in the thesis is shown by simulations and applications to real data are considered
Abebe, Opeyemi Temitope. "Regional trade agreements and its impact on the multilateral trading system: eroding the preferences of developing countries?" Thesis, University of the Western Cape, 2005. http://etd.uwc.ac.za/index.php?module=etd&.
Full textDelhomme, Fabien. "Etude du comportement sous impact d'une structure pare-blocs en béton armé." Chambéry, 2005. http://www.theses.fr/2005CHAMS004.
This thesis studies the behaviour of a new concept for a protection gallery against rock fall, called the Structurally Dissipating Rock-shed (SDR). The main innovation, compared to conventional solutions, is to dissipate the impact energy directly in the reinforced concrete slab or in fuse supports, and no longer in a cushion layer. The dynamic phenomena taking place during the impact of a block onto the slab are analyzed by means of experiments on a 1/3-scale SDR structure. The percussion loads applied to the slab during the contact phase with the block are assessed, as well as the various energy transfers and dissipations. The results allowed the operating and repair principles of the SDR to be validated and revealed that the slab is damaged by three main mechanisms: punching, bending and surface break-up of the impacted zone. The principal experimental values are reproduced by numerical simulations of the tests with a finite element tool. A simplified "masses-springs-damping" mechanical model is also developed with the aim of providing design methods for engineering offices. The prospects for this work are to establish design and construction recommendations for structurally dissipating rock-sheds.
Eichenauer, Florian. "Analysis for dissipative Maxwell-Bloch type models." Doctoral thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät, 2016. http://dx.doi.org/10.18452/17661.
This thesis deals with the mathematical modeling of semi-classical matter-light interaction. In the semi-classical picture, matter is described by a density matrix "rho", a quantum mechanical concept. Light, on the other hand, is described by a classical electromagnetic field "(E,H)". We give a short overview of the physical background, introduce the usual coupling mechanism and derive the classical Maxwell-Bloch equations, which have been studied intensively in the literature. Moreover, we introduce a mathematical framework in which we state a systematic approach to include dissipative effects in the Liouville-von Neumann equation. The striking advantage of our approach is the intrinsic existence of a Liapunov function for solutions to the resulting evolution equation. Next, we couple the resulting equation to the Maxwell equations and arrive at a new self-consistent dissipative Maxwell-Bloch type model for semi-classical matter-light interaction. The main focus of this work lies on the intensive mathematical study of this dissipative Maxwell-Bloch type model. Since our model lacks Lipschitz continuity, we create a regularized version of the model that is Lipschitz continuous. We mostly restrict our analysis to the Lipschitz continuous regularization. For regularized versions of the dissipative Maxwell-Bloch type model, we prove existence of solutions to the corresponding Cauchy problem. The core of the proof is based on results from compensated compactness due to P. Gérard and a Rellich type lemma. In parts, this proof closely follows the lines of an earlier work due to J.-L. Joly, G. Métivier and J. Rauch.
Robert, Valérie. "Classification croisée pour l'analyse de bases de données de grandes dimensions de pharmacovigilance." Thesis, Université Paris-Saclay (ComUE), 2017. http://www.theses.fr/2017SACLS111/document.
This thesis gathers methodological contributions to the statistical analysis of large datasets in pharmacovigilance. Pharmacovigilance datasets produce large, sparse matrices, and these two characteristics are the main statistical challenges in modelling them. The first part of the thesis is dedicated to the coclustering of the pharmacovigilance contingency table by means of the normalized Poisson latent block model. The objective is, on the one hand, to provide pharmacologists with interesting, reduced areas to explore more precisely; on the other hand, this coclustering remains useful background information for dealing with the individual database. Within this framework, a parameter estimation procedure for this model is detailed and objective model selection criteria are developed to choose the best-fitting model. The datasets are so large that we propose a procedure to explore the coclustering model space in a non-exhaustive but relevant way. Additionally, to assess the performance of the methods, a convenient coclustering index is developed to compare partitions with high numbers of clusters. These statistical tools are not specific to pharmacovigilance and can be used for any coclustering problem. The second part of the thesis is devoted to the statistical analysis of the large individual data, which are more numerous but also provide even more valuable information. The aim is to produce clusters of individuals according to their drug profiles, and subgroups of drugs and adverse effects with possible links, which overcomes the coprescription and masking phenomena, common contingency table issues in pharmacovigilance. Moreover, the interaction between several adverse effects is taken into account. For this purpose, we propose a new model, the multiple latent block model, which enables two binary tables to be coclustered by imposing the same row ranking. Assumptions inherent to the model are discussed and sufficient identifiability conditions for the model are presented. Then a parameter estimation algorithm is studied and objective model selection criteria are developed. Moreover, a numerical simulation model of the individual data is proposed to compare existing methods and study their limits. Finally, the proposed methodology for dealing with individual pharmacovigilance data is presented and applied to a sample of the French pharmacovigilance database between 2002 and 2010.
Healy, Timothy M. "Multi-block and overset-block domain decomposition techniques for cardiovascular flow simulation." Diss., Georgia Institute of Technology, 2001. http://hdl.handle.net/1853/15622.
Harnischmacher, Gerrit. "Block structured modeling for control." Düsseldorf: VDI-Verl., 2007. http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&doc_number=016244726&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA.
Keenan, David Wayne, 1955. "Block plan construction from a deltahedron-based adjacency graph." Thesis, The University of Arizona, 1986. http://hdl.handle.net/10150/292025.
Paltrinieri, Federico. "Modeling temporal networks with dynamic stochastic block models." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2019. http://amslaurea.unibo.it/18805/.
Full textKendall, Toby. "Theoretical models of trade blocs and integrated markets." Thesis, University of Warwick, 2000. http://wrap.warwick.ac.uk/4014/.
Norton, Kevin M. "Parameter optimization of seismic isolator models using recursive block-by-block nonlinear transient structural synthesis." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2002. http://library.nps.navy.mil/uhtbin/hyperion-image/02sep%5FNorton.pdf.
Full textDeshayes, Aurélia. "Modèles de croissance aléatoire et théorèmes de forme asymptotique : les processus de contact." Thesis, Université de Lorraine, 2014. http://www.theses.fr/2014LORR0168/document.
This thesis is a contribution to the mathematical study of interacting particle systems, which include random growth models representing a shape spreading over time in the cubic lattice. These processes are used to model crystal growth or the spread of an infection. In particular, Harris introduced the contact process in 1974 to represent such a spread. It is one of the simplest interacting particle systems exhibiting a critical phenomenon, and today its behaviour is well known in each phase. Many questions about its extensions remain open and motivated our work, especially the one on the asymptotic shape. After presenting the contact process and its extensions, we introduce a new one: the contact process with aging, where each particle has an age that influences its ability to give birth onto its neighbours. We build a coupling between our process and a supercritical oriented percolation, adapted from Bezuidenhout and Grimmett's construction, and we establish the 'at most linear' growth of our process. In the last part of this work, we prove an asymptotic shape theorem for general random growth models thanks to subadditive techniques, which can be complicated in the case of non-permanent models conditioned to survive. We conclude that the process with aging, the contact process in randomly evolving environment, the oriented percolation with hostile immigration and the bounded modified contact process satisfy asymptotic shape results.
Corneli, Marco. "Dynamic stochastic block models, clustering and segmentation in dynamic graphs." Thesis, Paris 1, 2017. http://www.theses.fr/2017PA01E012/document.
This thesis focuses on the statistical analysis of dynamic graphs, defined either in discrete or in continuous time. We introduce a new extension of the stochastic block model (SBM) for dynamic graphs. The proposed approach, called dSBM, adopts non-homogeneous Poisson processes to model the interaction times between pairs of nodes in dynamic graphs, either in discrete or continuous time. The intensity functions of the processes only depend on the node clusters, in a block modelling perspective. Moreover, all the intensity functions share some regularity properties on hidden time intervals that need to be estimated. A recent estimation algorithm for SBM, based on the greedy maximization of an exact criterion (exact ICL), is adopted for inference and model selection in dSBM. Moreover, an exact algorithm for change point detection in time series, the "pruned exact linear time" (PELT) method, is extended to deal with dynamic graph data modelled via dSBM. The approach we propose can be used for change point analysis in graph data. Finally, a further extension of dSBM is developed to analyse dynamic networks with textual edges (like social networks, for instance). In this context, the graph edges are associated with documents exchanged between the corresponding vertices. The textual content of the documents can provide additional information about the dynamic graph topological structure. The new model we propose is called the "dynamic stochastic topic block model" (dSTBM). Graphs are mathematical structures very suitable for modelling interactions between objects or actors of interest. Several real networks such as communication networks, financial transaction networks, mobile telephone networks and social networks (Facebook, LinkedIn, etc.) can be modelled via graphs. When observing a network, the time variable comes into play in two different ways: we can study the time dates at which the interactions occur and/or the interaction time spans. This thesis only focuses on the first time dimension and each interaction is assumed to be instantaneous, for simplicity. Hence, the network evolution is given by the interaction time dates only. In this framework, graphs can be used in two different ways to model networks. Discrete time […] Continuous time […]. In this thesis both these perspectives are adopted, alternatively. We consider new unsupervised methods to cluster the vertices of a graph into groups of homogeneous connection profiles. In this manuscript, the node groups are assumed to be time invariant to avoid possible identifiability issues. Moreover, the approaches that we propose aim to detect structural changes in the way the node clusters interact with each other. The building block of this thesis is the stochastic block model (SBM), a probabilistic approach initially used in social sciences. The standard SBM assumes that the nodes of a graph belong to hidden (disjoint) clusters and that the probability of observing an edge between two nodes only depends on their clusters. Since no further assumption is made on the connection probabilities, SBM is a very flexible model able to detect different network topologies (hubs, stars, communities, etc.).
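To give a concrete picture of the dSBM construction summarised above, the following sketch (our own illustration with invented intensities; it simulates data only and performs no inference) draws interaction times for each pair of nodes from Poisson processes whose piecewise-constant intensities depend only on the node clusters and on a hidden time segment:

```python
# Toy illustration of the dSBM data-generating mechanism: interaction times for each
# pair of nodes follow a Poisson process whose piecewise-constant intensity depends
# only on the two node clusters and on a hidden time segment.
import numpy as np

rng = np.random.default_rng(4)
n, K = 30, 2
breaks = [0.0, 4.0, 10.0]                      # one hidden change point at t = 4
lam = np.array([[[0.2, 1.5],                   # lam[segment, k, l]
                 [1.5, 0.1]],
                [[1.0, 0.2],
                 [0.2, 1.0]]])

z = rng.integers(0, K, size=n)                 # node clusters (time invariant)
events = {}                                    # (i, j) -> sorted interaction times
for i in range(n):
    for j in range(i + 1, n):
        times = []
        for s in range(len(breaks) - 1):
            a, b = breaks[s], breaks[s + 1]
            m = rng.poisson(lam[s, z[i], z[j]] * (b - a))   # events on this segment
            times.extend(rng.uniform(a, b, size=m))         # homogeneous within segment
        events[(i, j)] = np.sort(times)

print("total number of interactions:", sum(len(t) for t in events.values()))
```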
Cheng, Xiang. "TFTs circuit simulation models and analogue building block designs." Thesis, University of Cambridge, 2018. https://www.repository.cam.ac.uk/handle/1810/271853.
Full textHédouin, Renaud. "Diffusion MRI processing for multi-comportment characterization of brain pathology." Thesis, Rennes 1, 2017. http://www.theses.fr/2017REN1S035/document.
Diffusion weighted imaging (DWI) is a specific type of MRI acquisition based on the direction of diffusion of water molecules in the brain. Through several acquisitions, it allows brain microstructure, such as white matter, to be modelled at scales significantly smaller than the voxel resolution. To acquire a large number of images in clinical use, very fast acquisition techniques such as single-shot imaging are required; however, these acquisitions suffer from large local distortions. We propose a block-matching registration method based on the acquisition of images with opposite phase-encoding directions (PED). This technique, specially designed for echo-planar images (EPI) but potentially generic, robustly corrects images and provides a deformation field. This field is applicable to an entire DWI series from only one reversed b0, allowing distortion correction with a minimal acquisition-time cost. This registration algorithm has been validated both on a phantom data set and on in-vivo data and is available in our open-source medical image processing toolbox Anima. From these diffusion images, we are able to construct multi-compartment models (MCM) which can represent complex brain microstructure. To carry out studies and statistical analyses, these MCM must be registered, averaged and combined into atlases. We propose a general method to interpolate MCM, formulated as a simplification problem based on spectral clustering. This technique, which is adaptable to any MCM, has been validated on both synthetic and real data. Then, from a registered dataset, we performed voxel-level analyses based on statistics of the MCM parameters. Specifically designed tractography can also be performed to carry out analyses along tracks, based on individual compartments. All these tools are designed and used on real data and contribute to the search for biomarkers of brain diseases.
AKYOL, MUCAHIT. "DESIGN OF A SWITCH BLOCK MODULE FOR SECOND GENERATION MULTI-TECHNOLOGY FPGA." University of Cincinnati / OhioLINK, 2006. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1159558241.
Full textPoivet, Sylwia. "Adhésion instantanée de deux systèmes modèles : liquides simples et copolymères à blocs." Bordeaux 1, 2003. http://www.theses.fr/2003BOR12763.
Within an experimental approach whose scope is fundamental as well as applied, we study the separation mechanisms encountered when a material confined between two parallel plates is put under traction (probe-tack test). The study is conducted on two model systems: simple liquids and block copolymers. The simplicity of the first system allows us to provide a detailed interpretation of our observations. Indeed, two competing regimes are identified: a fingering regime and a cavitation regime. The shape of the measured force curve, specific to each regime, along with the conditions required for cavitation, are modelled using models for Newtonian fluids. This allows us to build a phase diagram capable of predicting the different regimes. The excellent fit of the experimental data with our theoretical model demonstrates that the cavitation mechanism, commonly thought to be characteristic of viscoelastic materials like adhesives, can also be encountered in viscous liquids. The second system, a candidate for use as an adhesive material even in wet environments, is essentially composed of amphiphilic diblock copolymers. We characterize the structure of these materials, their ability to absorb water and their adhesive properties on dry and wet substrates. Our study demonstrates the critical role of the polymer structure in the adhesion properties. Moreover, we show that, owing to their amphiphilic behaviour, these materials are particularly promising for the well-known problem of adhesion on wet surfaces.
King, John. "The effects of Institutional models on electoral participation and democracy in the former Soviet Bloc." Honors in the Major Thesis, University of Central Florida, 2007. http://digital.library.ucf.edu/cdm/ref/collection/ETH/id/1176.
Full textBachelors
Sciences
Political Science
Driss Khodja, Kouider. "Etude expérimentale et modélisation de la fonction diélectrique des milieux inhomogènes 2D et 3D." Rouen, 1989. http://www.theses.fr/1989ROUES018.
Santos, Carla Maria Lopes da Silva Afonso dos. "Error Orthogonal Models: Structure, Operations and Inference." Doctoral thesis, Universidade da Beira Interior, 2012. http://hdl.handle.net/10400.6/1742.
Full textNesta tese é desenvolvida a teoria dos modelos Error-orthogonal recorrendo à identidade entre estes modelos e os modelos com estrutura ortogonal de blocos comutativos. Desta forma, o tratamento apresentado irá assentar na estrutura algébrica dos modelos. No desenvolvimento considera-se: a estimação das componentes de variância; o cruzamento e aninhamento de modelos; a junção de modelos, na qual vectores das observações obtidos separadamente são analisados conjuntamente; aninhamento em escada, que requer muito menos observações do que os modelos correspondentes. Para alargar o tratamento apresentado consideram-se também Extensões L de modelos Error-orthogonal. Desta forma, poderemos considerar casos interessantes como o dos modelos com número diferente de repetições para os vários tratamentos. Por fim, inclui-se o caso normal. Com base no pressuposto da normalidade pretende-se obter estatísticas suficientes assim como condições para que estas sejam completas. É realizada inferência e consideram-se extensões L ortogonais.
Porter, Richard J. "Non-Gaussian and block based models for sparse signal recovery." Thesis, University of Bristol, 2016. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.702908.
Mercat, Christian. "Holomorphie discrète et modèle d'Ising." PhD thesis, Université Louis Pasteur - Strasbourg I, 1998. http://tel.archives-ouvertes.fr/tel-00001851.
Rossi, Louis Frank. "A spreading blob vortex method for viscous bounded flows." Diss., The University of Arizona, 1993. http://hdl.handle.net/10150/186562.
Lu, Nan. "La modélisation de l'indice CAC 40 avec le modèle basé agents." Thesis, Paris Est, 2018. http://www.theses.fr/2018PESC0004/document.
We develop an agent-based model to replicate two frequently observed anomalies in financial markets: the fat tails and the clustered volatility of the distribution of returns. Our goal is to show conclusively that these anomalies can be attributed to a mimetic formation of the expectations of market participants. We do not follow the recent developments in the field of agent-based computational economics (ACE) in finance; instead, we propose a very simple model which is estimated from the stylized facts of the French daily index CAC 40. The hypothesis of mimetic expectations can thus be tested: it is not rejected in our modeling.
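To illustrate the mechanism the abstract points to, the following toy sketch (our own example, not the calibrated model of the thesis; the number of agents, the imitation probability and the price-impact constants are arbitrary) lets each agent either copy the expectation of a randomly chosen agent or redraw it independently, and feeds the change in aggregate expectation into the returns:

```python
# Toy sketch of mimetic expectation formation: each agent holds an expectation
# (+1 bullish / -1 bearish) and, at every step, either copies a randomly chosen agent
# (probability p_imit) or redraws independently.  Returns are driven by the change in
# aggregate expectation, so switches between near-consensus and mixed regimes create
# bursts of volatility; the printed excess kurtosis gives a rough check of tail weight.
import numpy as np

rng = np.random.default_rng(5)
n_agents, n_steps, p_imit = 500, 5000, 0.99
state = rng.choice([-1, 1], size=n_agents)
returns = np.empty(n_steps)
prev_mean = state.mean()

for t in range(n_steps):
    idx = rng.integers(0, n_agents, size=n_agents)   # agent imitated by each agent
    imitate = rng.random(n_agents) < p_imit
    fresh = rng.choice([-1, 1], size=n_agents)
    state = np.where(imitate, state[idx], fresh)     # mimetic expectation formation
    m = state.mean()
    returns[t] = 0.05 * (m - prev_mean) + 0.001 * rng.standard_normal()
    prev_mean = m

kurt = ((returns - returns.mean()) ** 4).mean() / returns.var() ** 2 - 3
print("excess kurtosis of simulated returns:", round(kurt, 2))
```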
Van Bloemenstein, Chantell Berenice. "Petrographic characterization of sandstones in borehole E-BA1, Block 9, Bredasdorp Basin, Off-Shore South Africa." Thesis, University of the Western Cape, 2006. http://etd.uwc.ac.za/index.php?module=etd&action=viewtitle&id=gen8Srv25Nme4_5957_1189147269.
The reservoir quality (RQ) of well E-BA1 was characterized using thin sections and core samples in a petrographic study. Well E-BA1 is situated in the Bredasdorp Basin, which forms part of the Outeniqua Basin in the southern African offshore region. The Outeniqua Basin was formed by rifting resulting from the break-up of Gondwanaland. The Bredasdorp Basin is characterized by half-graben structures comprising Upper Jurassic, Lower Cretaceous and Cenozoic rift-to-drift strata. The research within this thesis indicates that well E-BA1 is of moderate to good quality, with a gas-condensate component.
Ortiz, Pablo Chaves. "Aliança do Pacífico: uma visão do bloco através do modelo gravitacional." Universidade do Vale do Rio dos Sinos, 2015. http://www.repositorio.jesuita.org.br/handle/UNISINOS/4871.
Full textMade available in DSpace on 2015-10-21T12:42:10Z (GMT). No. of bitstreams: 1 PABLO CHAVES ORTIZ_.pdf: 463828 bytes, checksum: e88392b8026ccd01016ffae992b7bb3b (MD5) Previous issue date: 2015-07-29
Nenhuma
From the 1990s onwards, there was a proliferation of preferential trade agreements (PTAs) around the world. Against this changing backdrop of world trade, Latin America was a key player in the creation of new agreements. However, due to the historical political and economic instability of the region, integration never truly materialised, mainly because of the protectionist stance of its countries. In this sense, the Pacific Alliance (Chile, Colombia, Peru and Mexico) comes with a proposal for a different kind of economic integration, aiming to unite its members' economies further while remaining open to trade negotiations with third countries. The aim of this study is to estimate the potential bilateral trade between the member countries of the Pacific Alliance (PA) through the gravity model of trade, using panel data with fixed effects for the year 2013 and a sample of 98 countries. The results showed that the estimated trade for 2013 was only 1% below the actual trade, equivalent to US$ 240.6 million. The analysis by pair of countries showed that the country benefiting most from the creation of the Pacific Alliance would be Mexico, which would considerably expand its imports and exports.
Ortiz, Pablo Chaves. "Aliança do Pacífico : uma visão do bloco através do modelo gravitacional." reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2015. http://hdl.handle.net/10183/171709.