Academic literature on the topic 'Estimateurs explicites'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Estimateurs explicites.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Estimateurs explicites"

1

Ramaré, Olivier. "From explicit estimates for primes to explicit estimates for the Möbius function." Acta Arithmetica 157, no. 4 (2013): 365–79. http://dx.doi.org/10.4064/aa157-4-4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Piau, Didier. "Quasi-renewal estimates." Journal of Applied Probability 37, no. 1 (March 2000): 269–75. http://dx.doi.org/10.1239/jap/1014842284.

Full text
Abstract:
We show that the solution of a quasi-renewal equation with an exponential distribution of the renewals converges at infinity and we compute explicitly the limit, hence generalizing the classical renewal theorem. We apply this result to a stochastic model of DNA replication introduced by Cowan and Chiu (1994).
APA, Harvard, Vancouver, ISO, and other styles
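For context, the classical renewal theorem that this paper generalizes can be stated as follows (textbook formulation, quoted for orientation, not taken from the paper itself):

```latex
% Renewal equation and the classical (key) renewal theorem:
% if F is a non-lattice inter-renewal distribution with mean \mu
% and z is directly Riemann integrable, then
\[
  Z(t) \;=\; z(t) \;+\; \int_0^t Z(t-s)\,\mathrm{d}F(s),
  \qquad
  \lim_{t\to\infty} Z(t) \;=\; \frac{1}{\mu}\int_0^\infty z(s)\,\mathrm{d}s .
\]
```

The quasi-renewal setting of the paper replaces this equation by a perturbed variant while keeping an explicit expression for the limit.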
3

Piau, Didier. "Quasi-renewal estimates." Journal of Applied Probability 37, no. 1 (March 2000): 269–75. http://dx.doi.org/10.1017/s0021900200015412.

Full text
Abstract:
We show that the solution of a quasi-renewal equation with an exponential distribution of the renewals converges at infinity and we compute explicitly the limit, hence generalizing the classical renewal theorem. We apply this result to a stochastic model of DNA replication introduced by Cowan and Chiu (1994).
APA, Harvard, Vancouver, ISO, and other styles
4

Sanghvi, Navyata, Shinnosuke Usami, Mohit Sharma, Joachim Groeger, and Kris Kitani. "Inverse Reinforcement Learning with Explicit Policy Estimates." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 11 (May 18, 2021): 9472–80. http://dx.doi.org/10.1609/aaai.v35i11.17141.

Full text
Abstract:
Various methods for solving the inverse reinforcement learning (IRL) problem have been developed independently in machine learning and economics. In particular, the method of Maximum Causal Entropy IRL is based on the perspective of entropy maximization, while related advances in the field of economics instead assume the existence of unobserved action shocks to explain expert behavior (Nested Fixed Point Algorithm, Conditional Choice Probability method, Nested Pseudo-Likelihood Algorithm). In this work, we make previously unknown connections between these related methods from both fields. We achieve this by showing that they all belong to a class of optimization problems, characterized by a common form of the objective, the associated policy and the objective gradient. We demonstrate key computational and algorithmic differences which arise between the methods due to an approximation of the optimal soft value function, and describe how this leads to more efficient algorithms. Using insights which emerge from our study of this class of optimization problems, we identify various problem scenarios and investigate each method's suitability for these problems.
APA, Harvard, Vancouver, ISO, and other styles
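The "optimal soft value function" the abstract refers to is the log-sum-exp analogue of the Bellman value function. A minimal sketch on a hypothetical toy MDP (the tensor `P`, rewards `r`, and all names below are illustrative assumptions, not the authors' code):

```python
import numpy as np

def soft_value_iteration(P, r, gamma=0.9, iters=500):
    """Soft (maximum-causal-entropy) value iteration on a toy MDP.

    P is an (A, S, S) transition tensor, r an (S,) reward vector.
    The soft value function replaces the hard max over actions with
    a log-sum-exp, V(s) = log sum_a exp(Q(s, a)), which is the
    quantity the methods surveyed in the paper approximate differently.
    """
    A, S, _ = P.shape
    V = np.zeros(S)
    for _ in range(iters):
        Q = r[None, :] + gamma * (P @ V)      # Q[a, s]
        V = np.log(np.exp(Q).sum(axis=0))     # soft maximum over actions
    pi = np.exp(Q - V[None, :])               # soft-greedy policy pi(a|s)
    return V, pi

# Hypothetical 3-state, 2-action MDP, for illustration only.
P = np.zeros((2, 3, 3))
P[0] = [[1, 0, 0], [1, 0, 0], [0, 1, 0]]      # action 0: move "left"
P[1] = [[0, 1, 0], [0, 0, 1], [0, 0, 1]]      # action 1: move "right"
r = np.array([0.0, 0.0, 1.0])                 # reward only in state 2
V, pi = soft_value_iteration(P, r)
```

Each column of `pi` is a proper distribution over actions, and states closer to the rewarded state get higher soft value.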
5

Romero, José Luis. "Explicit Localization Estimates for Spline-Type Spaces." Sampling Theory in Signal and Image Processing 8, no. 3 (September 2009): 249–59. http://dx.doi.org/10.1007/bf03549518.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Gallesco, C., S. Gallo, and D. Y. Takahashi. "Explicit estimates in the Bramson–Kalikow model." Nonlinearity 27, no. 9 (August 14, 2014): 2281–96. http://dx.doi.org/10.1088/0951-7715/27/9/2281.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Cheng, Yuanyou F., and Sidney W. Graham. "Explicit Estimates for the Riemann Zeta Function." Rocky Mountain Journal of Mathematics 34, no. 4 (December 2004): 1261–80. http://dx.doi.org/10.1216/rmjm/1181069799.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Dusart, Pierre. "Explicit estimates of some functions over primes." Ramanujan Journal 45, no. 1 (October 27, 2016): 227–51. http://dx.doi.org/10.1007/s11139-016-9839-4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Juricevic, Robert. "Explicit estimates of solutions of some Diophantine equations." Functiones et Approximatio Commentarii Mathematici 38, no. 2 (September 2008): 171–94. http://dx.doi.org/10.7169/facm/1229696538.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Consiglieri, Luisa. "Explicit Estimates for Solutions of Mixed Elliptic Problems." International Journal of Partial Differential Equations 2014 (March 31, 2014): 1–16. http://dx.doi.org/10.1155/2014/845760.

Full text
Abstract:
We deal with the existence of quantitative estimates for solutions of mixed problems to an elliptic second-order equation in divergence form with discontinuous coefficients. Our concern is to estimate the solutions with explicit constants, for domains in ℝn (n≥2) of class C0,1. The existence of L∞ and W1,q estimates is assured for q=2 and any q<n/(n-1) (depending on the data), whenever the coefficient is only measurable and bounded. The proof method of the quantitative L∞ estimates is based on the De Giorgi technique developed by Stampacchia. By using the potential theory, we derive W1,p estimates for different ranges of the exponent p depending on the fact that the coefficient is either Dini-continuous or only measurable and bounded. In this process, we establish new existences of Green functions on such domains. The last but not least concern is to unify (whenever possible) the proofs of the estimates to the extreme Dirichlet and Neumann cases of the mixed problem.
APA, Harvard, Vancouver, ISO, and other styles
More sources

Dissertations / Theses on the topic "Estimateurs explicites"

1

Strambi, Marco. "Effective estimates for coverings of curves over number fields." Thesis, Bordeaux 1, 2009. http://www.theses.fr/2009BOR13895/document.

Full text
Abstract:
The purpose of this thesis is to obtain totally explicit versions of two fundamental results about coverings of algebraic curves: the Riemann Existence Theorem and the Chevalley-Weil Theorem. The motivation behind our work on the Riemann Existence Theorem lies in the field of effective Diophantine analysis, where the covering technique is widely used: it happens quite often that only the degree of the covering and the ramification points are known, and to work with the covering curve one needs an effective description of it. The Chevalley-Weil theorem is also indispensable in Diophantine analysis because it reduces a Diophantine problem on a variety V to one on the covering variety W, which can often be simpler to deal with. In the thesis we obtain a version of the Chevalley-Weil theorem in dimension 1, explicit in all parameters and considerably sharper than the previous versions.
APA, Harvard, Vancouver, ISO, and other styles
2

Burg, Antoine. "Multivariate extensions for mortality modelling." Electronic Thesis or Diss., Université Paris sciences et lettres, 2025. http://www.theses.fr/2025UPSLD002.

Full text
Abstract:
Over the past two centuries, life expectancy around the globe has increased considerably. While the long-term trend is fairly regular, the improvement in longevity can be broken down in the short term into several phases, most of which can be linked to medical progress and the reduction of specific causes of mortality. The year 2020 marks a turning point due to the scale of the Covid-19 pandemic and its consequences. Its direct and indirect effects on the economy and healthcare systems are also felt through the other major causes of death. To understand and anticipate mortality-related risks, it is becoming increasingly necessary for reinsurance actors to reason and model in terms of causes of death. This type of modeling, however, poses specific challenges: by its very nature it involves multivariate models, whose complexity exceeds that of conventional actuarial tools. In this thesis we propose several avenues for extending mortality modeling to a multivariate framework, presented in the form of research articles. The first study deals with technical aspects of multivariate distributions within generalized linear models. When the explanatory variables are categorical, we propose new closed-form estimators for the multinomial, negative multinomial, and Dirichlet distributions, which notably enable considerable savings in computation time. These estimators are used in the second study to propose a new method for estimating the parameters of mortality models. This method extends the existing framework for all-cause mortality and allows all mortality-modeling problems, in particular by cause of death, to be treated in a single step. The third axis concerns mortality projections. We study neural networks specifically adapted to time series and show, through concrete use cases an actuary may face, that these models are flexible and robust enough to offer a credible alternative to conventional models.
APA, Harvard, Vancouver, ISO, and other styles
3

Choi, Kwok-kwong Stephen (蔡國光). "Some explicit estimates on linear Diophantine equations in three prime variables." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1990. http://hub.hku.hk/bib/B3120966X.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Choi, Kwok-kwong Stephen. "Some explicit estimates on linear Diophantine equations in three prime variables." [Hong Kong]: University of Hong Kong, 1990. http://sunzi.lib.hku.hk/hkuto/record.jsp?B12907236.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Gokpi, Kossivi. "Modélisation et Simulation des Ecoulements Compressibles par la Méthode des Eléments Finis Galerkin Discontinus." Thesis, Pau, 2013. http://www.theses.fr/2013PAUU3005/document.

Full text
Abstract:
The aim of this thesis is to apply the Discontinuous Galerkin Finite Element Method (DGFEM) to the discretization of the compressible Navier-Stokes equations. Several aspects are considered. The first is to show the optimal convergence order of the DGFEM method when high-order interpolation polynomials are used. The second is to design shock-capturing methods, such as slope limiters and artificial viscosity, to suppress the numerical oscillations that occur in transonic and supersonic flows when schemes of degree p>0 are used. The third is to design a posteriori error estimators and mesh-adaptation procedures that increase the accuracy of the solution and the speed of convergence, yielding considerable savings in computation time. Finally, we show the accuracy and robustness of the DG method at very low Mach numbers. When simulating compressible flows at very low Mach numbers, at the limit of incompressibility, the solution typically suffers in convergence and accuracy. The usual remedy is preconditioning, which modifies the Euler equations; here the equations are not modified. We demonstrate the accuracy and robustness of the proposed DG method with a second-order implicit time scheme and suitable boundary conditions.
APA, Harvard, Vancouver, ISO, and other styles
6

Potts, Joanne M. "Estimating abundance of rare, small mammals : a case study of the Key Largo woodrat (Neotoma floridana smalli)." Thesis, University of St Andrews, 2011. http://hdl.handle.net/10023/2068.

Full text
Abstract:
Estimates of animal abundance or density are fundamental quantities in ecology and conservation, but for many species such as rare, small mammals, obtaining robust estimates is problematic. In this thesis, I combine elements of two standard abundance estimation methods, capture-recapture and distance sampling, to develop a method called trapping point transects (TPT). In TPT, a "detection function", g(r) (i.e. the probability of capturing an animal, given it is r m from a trap when the trap is set) is estimated using a subset of animals whose locations are known prior to traps being set. Generalised linear models are used to estimate the detection function, and the model can be extended to include random effects to allow for heterogeneity in capture probabilities. Standard point transect methods are modified to estimate abundance. Two abundance estimators are available. The first estimator is based on the reciprocal of the expected probability of detecting an animal, ^P, where the expectation is over r; whereas the second estimator is the expectation of the reciprocal of ^P. Performance of the TPT method under various sampling efforts and underlying true detection probabilities of individuals in the population was investigated in a simulation study. When underlying probability of detection was high (g(0) = 0:88) and between-individual variation was small, survey effort could be surprisingly low (c. 510 trap nights) to yield low bias (c. 4%) in the two estimators; but under certain situations, the second estimator can be extremely biased. Uncertainty and relative bias in population estimates increased with decreasing detectability and increasing between-individual variation. Abundance of the Key Largo woodrat (Neotoma floridana smalli), an endangered rodent with a restricted geographic range, was estimated using TPT. 
The TPT method compared well to other viable methods (capture-recapture and spatially explicit capture-recapture) in terms of both field practicality and cost. The TPT method may be generally useful for estimating animal abundance in trapping studies, and variants of the TPT method are presented.
APA, Harvard, Vancouver, ISO, and other styles
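The abstract's two abundance estimators, the reciprocal of the expected detection probability and the expectation of the reciprocal, differ exactly where between-individual heterogeneity enters. A minimal numerical sketch (hypothetical detection probabilities, not the study's data) shows why the second can be biased upward:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-individual detection probabilities P (what one gets
# after integrating a detection function g(r) over locations r);
# the Beta spread models between-individual heterogeneity.
P = rng.beta(5.0, 2.0, size=10_000)

est1 = 1.0 / P.mean()     # estimator 1: reciprocal of the expected P
est2 = (1.0 / P).mean()   # estimator 2: expectation of the reciprocal

# Jensen's inequality gives E[1/P] >= 1/E[P], so estimator 2 is always
# at least as large, and the gap grows with heterogeneity in P.
print(est1, est2)
```

With these Beta(5, 2) probabilities the gap is modest, but heavier-tailed heterogeneity (small detection probabilities for some individuals) inflates estimator 2 dramatically, consistent with the extreme bias the abstract reports.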
7

Dudek, Adrian. "Explicit Estimates in the Theory of Prime Numbers." Phd thesis, 2016. http://hdl.handle.net/1885/110018.

Full text
Abstract:
It is the purpose of this thesis to enunciate and prove a collection of explicit results in the theory of prime numbers. First, the problem of primes in short intervals is considered. We furnish an explicit result on the existence of primes between consecutive cubes. To prove this, we first derive an explicit version of the Riemann--von Mangoldt explicit formula. We then assume the Riemann hypothesis and improve on the known conditional explicit estimates for primes in short intervals. Using recent results on primes in arithmetic progressions, we prove two new results in additive number theory. First, we prove that every integer greater than two can be written as the sum of a prime and a square-free number. We then work similarly to prove that every integer greater than ten and not congruent to one modulo four can be written as the sum of the square of a prime and a square-free number. Finally, we provide new explicit results on an arcane inequality of Ramanujan. We solve the inequality completely on the assumption of the Riemann hypothesis.
APA, Harvard, Vancouver, ISO, and other styles
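For context, the classical Riemann–von Mangoldt explicit formula, of which the thesis derives a fully quantified version, reads (standard statement, quoted here for orientation):

```latex
% For x > 1 not a prime power, with the sum over nontrivial zeros \rho
% of \zeta(s) taken symmetrically (\lim_{T\to\infty} \sum_{|\Im\rho|<T}):
\[
  \psi(x) \;=\; x \;-\; \sum_{\rho} \frac{x^{\rho}}{\rho}
  \;-\; \log 2\pi \;-\; \tfrac{1}{2}\log\bigl(1 - x^{-2}\bigr),
  \qquad
  \psi(x) \;=\; \sum_{p^k \le x} \log p .
\]
```

Bounding the zero sum explicitly is what turns this identity into results on primes in short intervals, such as primes between consecutive cubes.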
8

Turner, Matthew D. "Explicit Lp-norm estimates of infinitely divisible random vectors in Hilbert spaces with applications." 2011. http://trace.tennessee.edu/utk_graddiss/1035.

Full text
Abstract:
I give explicit estimates of the Lp-norm of a mean-zero infinitely divisible random vector taking values in a Hilbert space in terms of a certain mixture of the L2- and Lp-norms of the Lévy measure. Using decoupling inequalities, the stochastic integral driven by an infinitely divisible random measure is defined. As a first application utilizing the Lp-norm estimates, computations of Itô isomorphisms for different types of stochastic integrals are given. As a second application, I consider the discrete-time signal-observation model in the presence of an alpha-stable noise environment. A formulation is given to compute the optimal linear estimate of the system state.
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Estimateurs explicites"

1

Epstein, Charles L., and Rafe Mazzeo. Hölder Estimates for the 1-dimensional Model Problems. Princeton University Press, 2017. http://dx.doi.org/10.23943/princeton/9780691157122.003.0006.

Full text
Abstract:
This chapter establishes Hölder space estimates for the 1-dimensional model problems. It gives a detailed treatment of the 1-dimensional case, in part because all of the higher dimensional estimates are reduced to estimates on heat kernels for the 1-dimensional model problems. It also presents the proof of parabolic Schauder estimates for the generalized Kimura diffusion operator using the explicit formula for the heat kernel, along with standard tools of analysis. Finally, it considers kernel estimates for degenerate model problems, explains how Hölder estimates are obtained for the 1-dimensional model problems, and describes the properties of the resolvent operator.
APA, Harvard, Vancouver, ISO, and other styles
2

Maron, Martine. Is “no net loss of biodiversity” a good idea? Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780198808978.003.0022.

Full text
Abstract:
This chapter explores biodiversity offsetting as a tool used to achieve “no net loss” of biodiversity. Unfortunately, no-net-loss offsetting can be—and often is—unintentionally designed in a way that inevitably results in ongoing biodiversity decline. Credit for offset sites is given in proportion to the assumed loss that would happen at those sites if not protected, and this requires clear baselines and good estimates of the risk of loss. This crediting calculation also creates a perverse incentive to overstate—or even genuinely increase—the threat to biodiversity at potential offset sites, in order to generate more offset “credit” that can then be exchanged for damaging actions elsewhere. The phrase “no net loss,” when used without an explicit frame of reference and quantified counterfactual scenario, is meaningless, and potentially misleading. Conservation scientists have a core role in interpreting, communicating, and improving the robustness of offset policy.
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Estimateurs explicites"

1

Moral, Gregorio. "EAD Estimates for Facilities with Explicit Limits." In The Basel II Risk Parameters, 201–46. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-16114-8_11.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

de Monvel, Anne Boutet, and Lech Zielinski. "Explicit Error Estimates for Eigenvalues of Some Unbounded Jacobi Matrices." In Spectral Theory, Mathematical System Theory, Evolution Equations, Differential and Difference Equations, 189–217. Basel: Springer Basel, 2012. http://dx.doi.org/10.1007/978-3-0348-0297-0_11.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Kurasov, Pavel. "Higher Eigenvalues and Topological Perturbations." In Operator Theory: Advances and Applications, 317–30. Berlin, Heidelberg: Springer Berlin Heidelberg, 2023. http://dx.doi.org/10.1007/978-3-662-67872-5_13.

Full text
Abstract:
Some fundamental estimates for higher eigenvalues of standard Laplacians have already been derived in Sect. 4.6. The goal of this chapter is twofold: on the one hand considering the standard Laplacian we derive explicit fundamental estimates for higher eigenvalues and describe the behaviour of such eigenvalues under topological perturbations. Here techniques developed in the previous chapter are used. On the other hand, considering Schrödinger operators with most general vertex conditions we analyse the behaviour of the spectrum under topological perturbations and show that intuition gained during our studies of standard Laplacians cannot always be applied: the eigenvalues may depend on topological perturbations in a completely opposite way.
APA, Harvard, Vancouver, ISO, and other styles
4

Zitikis, Ričardas. "A Berry–Esséen Bound for Multivariate L-Estimates with Explicit Dependence on Dimension." In Stability Problems for Stochastic Models, 197–211. Berlin, Heidelberg: Springer Berlin Heidelberg, 1993. http://dx.doi.org/10.1007/bfb0084498.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Erev, Ido. "The Effect of Explicit Probability Estimates on Violations of Subjective Expected Utility Theory in the Allais Paradox." In Decision Making Under Risk and Uncertainty, 117–24. Dordrecht: Springer Netherlands, 1992. http://dx.doi.org/10.1007/978-94-011-2838-4_14.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Molnar, Christoph, Timo Freiesleben, Gunnar König, Julia Herbinger, Tim Reisinger, Giuseppe Casalicchio, Marvin N. Wright, and Bernd Bischl. "Relating the Partial Dependence Plot and Permutation Feature Importance to the Data Generating Process." In Communications in Computer and Information Science, 456–79. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-44064-9_24.

Full text
Abstract:
Scientists and practitioners increasingly rely on machine learning to model data and draw conclusions. Compared to statistical modeling approaches, machine learning makes fewer explicit assumptions about data structures, such as linearity. Consequently, the parameters of machine learning models usually cannot be easily related to the data generating process. To learn about the modeled relationships, partial dependence (PD) plots and permutation feature importance (PFI) are often used as interpretation methods. However, PD and PFI lack a theory that relates them to the data generating process. We formalize PD and PFI as statistical estimators of ground truth estimands rooted in the data generating process. We show that PD and PFI estimates deviate from this ground truth not only due to statistical biases, but also due to learner variance and Monte Carlo approximation errors. To account for these uncertainties in PD and PFI estimation, we propose the learner-PD and the learner-PFI based on model refits and propose corrected variance and confidence interval estimators.
APA, Harvard, Vancouver, ISO, and other styles
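As a concrete referent for the PD estimator the chapter formalizes, here is a minimal Monte Carlo sketch (toy model function and all names are illustrative assumptions, not the authors' code); the learner-PD variant would additionally average this curve over model refits to capture learner variance:

```python
import numpy as np

def partial_dependence(model, X, feature, grid):
    """Monte Carlo PD estimator:
    PD(v) = (1/n) * sum_i model(x_i with the feature set to v)."""
    pd_vals = []
    for v in grid:
        Xv = X.copy()
        Xv[:, feature] = v          # intervene on one feature
        pd_vals.append(model(Xv).mean())
    return np.array(pd_vals)

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 2))
# Stand-in for a fitted learner; with a real model, refitting on
# resampled data and recomputing PD gives the learner-PD spread.
model = lambda Z: Z[:, 0] ** 2 + Z[:, 1]

grid = np.array([-1.0, 0.0, 1.0])
pd_curve = partial_dependence(model, X, 0, grid)
# Here PD(v) = v**2 + E[X[:, 1]], so pd_curve is approximately [1, 0, 1].
```

The deviation of `pd_curve` from the exact `[1, 0, 1]` is the Monte Carlo approximation error the chapter distinguishes from statistical bias and learner variance.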
7

Tomaselli, Venera, and Giulio Giacomo Cantone. "Multipoint vs slider: a protocol for experiments." In Proceedings e report, 91–96. Florence: Firenze University Press, 2021. http://dx.doi.org/10.36253/978-88-5518-304-8.19.

Full text
Abstract:
Since the broad diffusion of computer-assisted survey tools (i.e. web surveys), a lively debate about innovative scales of measure has arisen among social scientists and practitioners. The implications are relevant for applied statistics and evaluation research, since traditional scales collect ordinal observations while data from sliders can be interpreted as continuous. The literature, however, reports excessive task-completion times for sliders in web surveys. This experimental protocol is aimed at testing hypotheses on the accuracy and dispersion of estimates from anonymous participants who are recruited online and randomly assigned to tasks involving the recognition of shades of colour. The treatment variable is the scale: a traditional 0-10 multipoint vs a 0-100 slider. Shades have a unique parametrisation (true value), and participants have to guess the true value through the scale. These tasks are designed to recreate situations of uncertainty among participants while minimizing the subjective component of a perceptual assessment and maximizing information about scale-driven differences and biases. We propose to test statistical differences in the treatment variable in terms of (i) mean absolute error from the true value and (ii) time of completion of the task. To correct biases due to the variance in the number of completed tasks among participants, data about participants can be collected through both pre-task acceptance of web cookies and post-task explicit questions.
APA, Harvard, Vancouver, ISO, and other styles
8

Matveev, M. G., E. A. Sirota, and E. A. Kopytina. "Analysis of the Quality of Estimates in the Problem of Parametric Identification of Distributed Dynamic Processes in the Case of an Explicit and Implicit Difference Scheme." In Information Systems and Design, 40–50. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-32092-7_4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Li, Tatsien, and Yi Zhou. "Sharpness of Lower Bound Estimates on the Life-Span of Classical Solutions to the Cauchy Problem—The Case that the Nonlinear Term $$F=F(Du, D_xDu)$$ on the Right-Hand Side Does Not Depend on u Explicitly." In Series in Contemporary Mathematics, 303–17. Berlin, Heidelberg: Springer Berlin Heidelberg, 2017. http://dx.doi.org/10.1007/978-3-662-55725-9_13.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Li, Tatsien, and Yi Zhou. "Sharpness of Lower Bound Estimates on the Life-Span of Classical Solutions to the Cauchy Problem—The Case that the Nonlinear Term $$F=F(u,Du, D_xDu)$$ on the Right-Hand Side Depends on u Explicitly." In Series in Contemporary Mathematics, 319–61. Berlin, Heidelberg: Springer Berlin Heidelberg, 2017. http://dx.doi.org/10.1007/978-3-662-55725-9_14.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Estimateurs explicites"

1

Rego, Francisco, and Daniel Silvestre. "Explicit Computation of Guaranteed State Estimates using Constrained Convex Generators." In 2024 IEEE 63rd Conference on Decision and Control (CDC), 1400–1405. IEEE, 2024. https://doi.org/10.1109/cdc56724.2024.10885976.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Greene, Jonathan W. "FPGA Mux Usage and Routability Estimates without Explicit Routing." In FPGA '23: The 2023 ACM/SIGDA International Symposium on Field Programmable Gate Arrays. New York, NY, USA: ACM, 2023. http://dx.doi.org/10.1145/3543622.3573045.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Bellegarda, Jerome R. "Time-varying system identification via explicit filtering of the parameter estimates." In San Diego, '91, San Diego, CA, edited by Franklin T. Luk. SPIE, 1991. http://dx.doi.org/10.1117/12.49821.

4

McLauchlan, Philip, and John Mayhew. "Needles: A Stereo Algorithm for Texture." In Image Understanding and Machine Vision. Washington, D.C.: Optica Publishing Group, 1989. http://dx.doi.org/10.1364/iumv.1989.tud1.

Abstract:
This paper describes Needles, an edge-based stereo algorithm designed to exploit the smoothness of many textured surfaces. The correspondence problem is not addressed explicitly; rather, a simple two-stage process extracts surface position and orientation directly. First, local disparity histograms over a large range are constructed; maxima in the histograms correspond to possible surface depths. A Hough transform is then used to fit a plane to the ambiguous disparity points close to the histogram maxima, confirming and sharpening the disparity estimates obtained from the histograms. Local surface disparity and orientation are calculated from the best planar fit after all histogram maxima above a threshold have been tried. This extends an algorithm described in (Pollard 1985), which uses a Hough transform to find local surface orientation without explicit matching: there, pairs of possible matches vote for the disparity gradient between them, and when all pairs have voted, the winning disparity gradient (and hence surface orientation) has the highest Hough accumulator value.
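As a rough illustration of the two-stage process this abstract outlines, the sketch below (hypothetical Python, not from the paper) histograms candidate disparities and then fits a plane to the points near the dominant peak; ordinary least squares stands in for the paper's Hough-transform plane fit:

```python
import numpy as np

def needles_sketch(points, disparities, bin_width=1.0, window=5.0):
    """Toy sketch of the two-stage process the abstract describes:
    (1) histogram candidate disparities, (2) fit a plane to the points
    near the dominant histogram peak. Least squares stands in for the
    Hough transform used in the paper."""
    # Stage 1: local disparity histogram; maxima ~ candidate surface depths
    bins = np.arange(disparities.min(), disparities.max() + bin_width, bin_width)
    hist, edges = np.histogram(disparities, bins=bins)
    peak = edges[np.argmax(hist)]  # dominant depth hypothesis

    # Stage 2: fit d = a*x + b*y + c to points near the peak, refining
    # both the depth (c) and the surface orientation (a, b)
    near = np.abs(disparities - peak) < window
    x, y = points[near, 0], points[near, 1]
    A = np.column_stack([x, y, np.ones(x.size)])
    (a, b, c), *_ = np.linalg.lstsq(A, disparities[near], rcond=None)
    return peak, (a, b, c)
```

With noise-free planar disparities the fit recovers the plane exactly; the window and bin width are tuning parameters the paper leaves to the data.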
5

Fickert, Maximilian, Tianyi Gu, and Wheeler Ruml. "Bounded-cost Search Using Estimates of Uncertainty." In Thirtieth International Joint Conference on Artificial Intelligence {IJCAI-21}. California: International Joint Conferences on Artificial Intelligence Organization, 2021. http://dx.doi.org/10.24963/ijcai.2021/231.

Abstract:
Many planning problems are too hard to solve optimally. In bounded-cost search, one attempts to find, as quickly as possible, a plan that costs no more than a user-provided absolute cost bound. Several algorithms have been previously proposed for this setting, including Potential Search (PTS) and Bounded-cost Explicit Estimation Search (BEES). BEES attempts to improve on PTS by predicting whether nodes will lead to plans within the cost bound or not. This paper introduces a relatively simple algorithm, Expected Effort Search (XES), which uses not just point estimates but belief distributions in order to estimate the probability that a node will lead to a plan within the bound. XES's expansion order minimizes expected search time in a simplified formal model. Experimental results on standard planning and search benchmarks show that it consistently exhibits strong performance, outperforming both PTS and BEES. We also derive improved variants of BEES that can exploit belief distributions. These new methods advance the recent trend of taking advantage of uncertainty estimates in deterministic single-agent search.
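The core idea, ranking nodes by expected search time, can be mocked up in a few lines. This is a hypothetical simplification (a Gaussian belief over a node's solution cost; the paper uses richer belief distributions), not the authors' implementation:

```python
import math

def xes_priority(mu_f, sigma_f, cost_bound, expected_expansions):
    """Toy version of XES's expansion order: rank nodes by
    expected search time = expected expansions / P(solution within bound),
    assuming the belief over the node's solution cost is N(mu_f, sigma_f^2)."""
    if sigma_f <= 0:
        p = 1.0 if mu_f <= cost_bound else 0.0
    else:
        # P(f <= C) under a normal belief, via the error function
        p = 0.5 * (1.0 + math.erf((cost_bound - mu_f) / (sigma_f * math.sqrt(2))))
    return float("inf") if p == 0 else expected_expansions / p
```

Nodes unlikely to stay within the bound get large priority values and are deferred, which is the behavior the abstract credits for XES outperforming PTS and BEES.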
6

Dewees, David J., and Robert H. Dodds. "Comparison of Flaw Driving Force Estimates in Simulated Weld Residual Stress Fields." In ASME 2013 Pressure Vessels and Piping Conference. American Society of Mechanical Engineers, 2013. http://dx.doi.org/10.1115/pvp2013-97410.

Abstract:
Previous work has focused on the methods and results for calculating flaw driving force in simulated three-dimensional (3D) weld residual stress (WRS) fields using contour (J) integral techniques. This paper extends that work to look at explicit modeling of the crack tip opening displacement (CTOD) in these same WRS fields, and for the same range of semi-elliptical flaws. Comparison is made between the predicted trends of driving force with crack size for the calculated driving force (J-integral) versus the “measured” value (CTOD). Implications for fracture assessments are given, and recommendations for future work are made.
7

Baskaran, Shyamsunder, and R. P. Millane. "Bayesian Image Reconstruction in X-ray Fiber Diffraction." In Signal Recovery and Synthesis. Washington, D.C.: Optica Publishing Group, 1998. http://dx.doi.org/10.1364/srs.1998.swa.3.

Abstract:
The structure completion problem in x-ray fiber diffraction analysis, a crystallographic method for studying polymer structures, involves reconstructing an incomplete image from a known part and experimental data in the form of the squared amplitudes of the Fourier coefficients. Formulating this as a Bayesian estimation problem allows explicit expressions for MMSE and MAP estimates to be obtained. Calculations using simulated fiber diffraction data show that the MMSE estimate outperforms current methods that correspond to certain MAP estimates.
8

Heinstein, Martin W., Frank J. Mello, and Clark R. Dohrmann. "A Nodal-Based Stable Time Step Predictor for Transient Dynamics With Explicit Time Integration." In ASME 1996 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 1996. http://dx.doi.org/10.1115/imece1996-0577.

Abstract:
A procedure for estimating the maximum stable time step for explicit finite element codes using central time differencing is presented. The method is based on an estimate of maximum nodal stiffness and lumped nodal mass. The estimate of maximum nodal stiffness makes use of the finite element assembly operator and the maximum eigenvalue estimates of the finite elements being considered (in this case Flanagan's mean quadrature hexahedron). The nodal-based stable time step calculated by this method is shown to be greater than or equal to an element-based estimate alone.
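For context, a common nodal estimate of this flavor bounds the maximum eigenvalue of the assembled system with Gershgorin row sums over lumped nodal masses. The sketch below is that generic bound, our illustration rather than the paper's exact procedure:

```python
import numpy as np

def nodal_time_step(K, m):
    """Conservative stable time step for undamped central-difference
    integration, via a Gershgorin bound on lam_max(M^-1 K).
    K: assembled stiffness matrix (n x n); m: lumped nodal masses (n,)."""
    nodal_stiffness = np.abs(K).sum(axis=1)   # row sums bound each node's stiffness
    lam_max = (nodal_stiffness / m).max()     # Gershgorin: lam_max(M^-1 K) <= this
    return 2.0 / np.sqrt(lam_max)             # dt_crit = 2 / omega_max
```

Because the Gershgorin bound over-estimates the maximum eigenvalue, the resulting step never exceeds the exact critical step; on a two-node spring chain with unit masses the bound is tight.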
9

Brooker, Daniel C., Geoffrey K. Cole, and Jason D. McConochie. "The Influence of Hindcast Modeling Uncertainty on the Prediction of High Return Period Wave Conditions." In ASME 2004 23rd International Conference on Offshore Mechanics and Arctic Engineering. ASMEDC, 2004. http://dx.doi.org/10.1115/omae2004-51161.

Abstract:
Extreme value analysis for the prediction of long return period met-ocean conditions is often based upon hindcast studies of wind and wave conditions. The random errors associated with hindcast modeling are not usually incorporated when fitting an extreme value distribution to hindcast data. In this paper, a modified probability distribution function is derived so that modeling uncertainties can be explicitly included in extreme value analysis. Maximum likelihood estimation is then used to incorporate hindcast uncertainty into return value estimates and confidence intervals. The method presented here is compared against simulation techniques for accounting for hindcast errors. The influence of random errors within modeled datasets on predicted 100 year return wave estimates is discussed.
10

Karadeniz, H. "A Fast Calculation Procedure for Fatigue Reliability Estimates of Offshore Structures." In ASME 2003 22nd International Conference on Offshore Mechanics and Arctic Engineering. ASMEDC, 2003. http://dx.doi.org/10.1115/omae2003-37110.

Abstract:
This paper presents formulations and a procedure for fast and efficient computation of fatigue reliability estimates of offshore structures, which eliminates repetitive execution of the spectral analysis procedure so that it is performed only once for all reliability iterations. This is achieved by suitable uncertainty modelling and a spectral formulation of the stress process. For this purpose, a new uncertainty variable is defined to represent all uncertainties in the stress spectrum, except those in the damping and inertia force coefficients and the thicknesses of marine growth and structural members, which are represented by their own uncertainty variables. Apart from uncertainties in the stress spectrum, a detailed modelling of the fatigue-related uncertainties is presented. Uncertainties in SCF, the damage model (S-N line), the analytical modelling of the probability distribution of the non-narrow-banded stress process, the long-term probability distribution of sea states, and the reference damage at which failure occurs are all considered in the group of fatigue-related uncertainties. The formulation of the stress spectrum and stress spectral moments is presented explicitly in the idealized uncertainty space. Then, the failure function of the reliability analysis is expressed in terms of uncertainty variables, independently of the spectral analysis. The advanced FORM reliability method is used to calculate the reliability index and to identify important uncertainty origins. The procedure presented in the paper is demonstrated on an example jacket-type structure and the results are compared with previously calculated results using more sophisticated uncertainty modelling of the stress spectrum.

Reports on the topic "Estimateurs explicites"

1

Walsh, Timothy Francis, Garth M. Reese, and Ulrich L. Hetmaniuk. Explicit a posteriori error estimates for eigenvalue analysis of heterogeneous elastic structures. Office of Scientific and Technical Information (OSTI), July 2005. http://dx.doi.org/10.2172/923176.

2

Gibbs, Holly, Sahoko Yui, and Richard Plevin. New Estimates of Soil and Biomass Carbon Stocks for Global Economic Models. GTAP Technical Paper, March 2014. http://dx.doi.org/10.21642/gtap.tp33.

Abstract:
We synthesized a range of geographically explicit forest, grassland and cropland biomass and soil carbon input data sources and used geographic information systems (GIS) software to calculate new estimates of soil and biomass carbon stocks for use with global economic models, particularly for the Global Trade Analysis Project (GTAP). Our results quantify the average amount of carbon stored in soil and biomass in each of the 246 countries, stratified by agro-ecological zones (available in the accompanying spreadsheet). We also provide the data aggregated to the 134 regions defined for the GTAP 8.1 database, both in spreadsheet form and in GTAP's native binary file format. Finally, we provide an add-on to the FlexAgg2 program to further aggregate the 134 regions as desired. Our analysis makes substantial refinements to the estimates of carbon stocks used for modeling carbon emissions from indirect land use change. The spatial detail of our analysis is a major advantage over previous databases because it provides estimates tailored to the regions of interest and better accounts for the variation of carbon stocks across the landscape and between wetland and non-wetland regions.
3

Villoria, Nelson B., and Jing Liu. Using continental grids to improve our understanding of global land supply responses: Implications for policy-driven land use changes in the Americas. GTAP Working Paper, June 2015. http://dx.doi.org/10.21642/gtap.wp81.

Abstract:
Global economic models with explicit treatment of global land markets are crucial to understanding the consequences of different policy choices on global food and environmental security. However, these models rely on parameters for which there is little econometric evidence. A fundamental parameter in these models is the land supply elasticity. We provide a novel set of land supply elasticities estimated using gridded data for the American continent, and we use them in exploring previous work on the indirect land-use effects of US ethanol policy. Our estimates provide a basis for better-informed simulations of global land-use transitions under different economic and policy scenarios. JEL Codes: Q24, C21, C68
4

Castro, Carlos, and Karen Garcia. Default Risk in Agricultural Lending: The Effects of Commodity Price Volatility and Climate. Inter-American Development Bank, September 2014. http://dx.doi.org/10.18235/0006991.

Abstract:
This paper proposes and estimates a default risk model for agricultural lenders that explicitly accounts for two risks that are endemic to agricultural activities: commodity price volatility and climate. The results indicate that both factors are relevant in explaining the occurrence of default in the portfolio of a rural bank. In addition, the paper illustrates how to integrate the default risk model into standard techniques of portfolio credit risk modeling. The portfolio credit risk model provides a quantitative tool to estimate the loss distribution and the economic capital for a rural bank. The estimated parameters of the default risk model, along with scenarios for the evolution of the risk factors, are used to construct stress tests on the portfolio of a rural bank. These stress tests indicate that climate factors have a larger effect on economic capital than commodity price volatility.
5

Granados, Camilo, and Daniel Parra-Amado. Output Gap Measurement after COVID for Colombia: Lessons from a Permanent-Transitory Approach. Banco de la República, January 2025. https://doi.org/10.32468/be.1295.

Abstract:
We estimate the output gap for the Colombian economy explicitly accounting for the COVID-19 period. Our estimates reveal a significant 20% decline in the output gap but with a faster recovery compared to previous crises. Our empirical strategy follows a two-stage Bayesian vector autoregressive (BSVAR) model where i) a scaling factor in the reduced form of the VAR is used to model extreme data, such as those observed around the COVID-19 period, and ii) permanent and transitory shocks are structurally identified. As a result, we obtain that a single structural shock explains potential GDP, while the remaining shocks within the model are transitory in nature and thus can be used to estimate the output gap. We elaborate on the relative strengths of our method for drawing policy lessons and show that the improved approximation accuracy of our method allows for inflation forecasting gains through the use of Phillips curves, as well as for rule-based policy diagnostics that align more closely with the observed behavior of the Central Bank.
6

Villoria, Nelson B. Estimation of Missing Intra-African Trade. GTAP Research Memoranda, December 2008. http://dx.doi.org/10.21642/gtap.rm12.

Abstract:
Missing trade is defined as the exports and imports that may have taken place between two potential trading partners, but which are unknown to the researcher because neither partner reported them to the United Nations' COMTRADE, the official global repository of trade statistics. In a comprehensive sample of African countries, over 40% of the potential trade flows fit this definition. For a continent whose trade integration remains an important avenue for development, this lack of information hinders the analysis of policy mechanisms, such as the Economic Partnership Agreements with the EU, that influence intra-regional trade patterns. This paper estimates the likely magnitude of the missing trade by modeling the manufacturing trade data in the GTAP Data Base using a gravity approach. The gravity approach employed here relates bilateral trade to country size, distance, and other trade costs while explicitly considering that high fixed costs can totally inhibit trade. This last feature provides an adequate framework to explain the numerous zero-valued flows that characterize intra-African trade. The predicted missing exports are valued at approximately 300 million USD. The incidence of missing trade is highest in the lowest-income countries of Central and West Africa.
7

Ianchovichina, Elena. GTAP-DD: A Model for Analyzing Trade Reforms in the Presence of Duty Drawbacks. GTAP Technical Paper, March 2004. http://dx.doi.org/10.21642/gtap.tp21.

Abstract:
Duty drawback schemes, which typically involve a combination of duty rebates and exemptions, are a feature of many countries' trade regimes. They are used in highly protected, developing economies as a means of providing exporters with imported inputs at world prices, and thus increasing their competitiveness, while maintaining the protection on the rest of the economy. In China duty exemptions have been central to the process of trade reform and have led to a tremendous increase in processed exports utilizing imported materials. Despite the widespread use and importance of duty drawbacks, these "new trade liberalization" instruments have been given relatively little attention in empirical multilateral trade liberalization studies. This paper presents an empirical multi-region trade model, GTAP-DD, an extension of GTAP in which the effects of policy reform are differentiated based on the trade orientation of the firms. Both GTAP and GTAP-DD are used to analyze the impact of China's WTO accession, which involves liberalization in China from 1997 to post-accession tariffs, among a number of other liberalization measures. The analysis shows that failure to account for duty exemptions in the case of China's recent WTO accession will overstate the increase in: (a) China's trade flows by 40 percent, (b) China's welfare by 15 percent, and (c) exports of selected sectors by as much as 90 percent. The magnitude of the bias depends on the level of pre-intervention tariffs and the size of tariff cuts: the larger the initial distortions and tariff reductions, the larger the bias when duty drawbacks are ignored. The bias in GTAP's estimates of China's real GDP, trade flows and welfare changes due to WTO accession more than triples when China's pre-intervention tariffs are raised from their 1997 levels to the much higher 1995 levels. These results suggest that trade liberalization studies focusing on economies in which protection is high, import concessions play an important role, and planned tariff cuts are deep must treat duty drawbacks explicitly in order to avoid serious errors in their estimates of sectoral, trade flow, and welfare changes.
8

Hummels, David. Toward a Geography of Trade Costs. GTAP Working Paper, January 2003. http://dx.doi.org/10.21642/gtap.wp17.

Abstract:
What are the barriers that separate nations? While recent work provides intriguing clues, we have remarkably little concrete evidence as to the nature, size, and shape of barriers. This paper offers direct and indirect evidence on trade barriers, moving us toward a comprehensive geography of trade costs. There are three main contributions. One, we provide detailed data on freight rates for a number of importers. Rates vary substantially over exporters, and aggregate expenditures on freight are at the low end of the observed range. This suggests import choices are made so as to minimize transportation costs. Two, we estimate the technological relationship between freight rates and distance and use this to interpret the trade barrier equivalents of common trade barrier proxies taken from the literature. The calculation reveals implausibly large barriers. Three, we use a multi-sector model of trade to isolate channels through which trade barriers affect trade volumes. The model motivates an estimation technique that delivers direct estimates of substitution elasticities. This allows a complete characterization of the trade costs implied by trade flows and a partition of those costs into three components: explicitly measured costs (tariffs and freight), costs associated with common proxy variables, and costs that are implied but unmeasured. Acknowledgments: Thanks for the gracious provision of data go to Jon Haveman, Rob Feenstra, Azita Amjadi and the ALADI secretariat. Thanks for helpful suggestions on previous drafts go to seminar participants at the Universities of Chicago, Michigan, and Texas, Boston University, NBER and the 4th Annual EIIT Conference at Purdue University. Finally, Julia Grebelsky and Dawn Conner provided outstanding research assistance. This research was funded by a grant from the University of Chicago's Graduate School of Business.
9

Hertel, Thomas, Jevgenijs Steinbuks, and Uris Lantz Baldos. Competition for Land in the Global Bioeconomy. GTAP Working Paper, September 2012. http://dx.doi.org/10.21642/gtap.wp68.

Abstract:
The global land use implications of biofuel expansion have received considerable attention in the literature over the past decade. Model-based estimates of the emissions from cropland expansion have been used to assess the environmental impacts of biofuel policies. And integrated assessment models have estimated the potential for biofuels to contribute to greenhouse gas abatement over the coming century. All of these studies feature, explicitly or implicitly, competition between biofuel feedstocks and other land uses. However, the economic mechanisms governing this competition, as well as the contribution of biofuels to global land use change, have not received the close scrutiny that they deserve. The purpose of this paper is to offer a deeper look at these factors. We begin with a comparative static analysis which assesses the impact of exogenously specified forecasts of biofuel expansion over the 2006-2035 period. Global land use change is decomposed according to the three key margins of economic response: extensive supply, intensive supply, and demand. Under the International Energy Agency's "New Policies" scenario, biofuels account for nearly one-fifth of global land use change over the 2006-2035 period. The paper also offers a comparative dynamic analysis which determines the optimal path for first and second generation biofuels over the course of the entire 21st century. In the absence of GHG regulation, the welfare-maximizing path for global land use allocates 170 Mha to biofuel feedstocks by 2100, with the associated biofuels accounting for about 30% of global liquid fuel consumption. This area expansion is somewhat diminished by expected climate change impacts on agriculture, while it is significantly increased by a moderately aggressive GHG emissions target and by advances in conversion efficiency of second generation biofuels.
10

Albert, Jose Ramon, Connie Bayudan-Dacuycuy, Imelda Angeles-Agdeppa, Jan Carlo Punongbayan, Deanne Lorraine Cabalfin, Anna Rita Vargas, Charmaine Duante, Eldridge Ferrer, and Chona Fernandez-Patalen. Measuring Poverty within Filipino Households: Examining of Resource Sharing and Economies of Scale. Philippine Institute for Development Studies, December 2024. https://doi.org/10.62986/dp2024.37.

Abstract:
The Philippine government's long-term vision, AmBisyon Natin 2040, aims for a prosperous, predominantly middle-class society where no one is poor. The Philippine Development Plan 2023-2028 emphasizes strategies to develop and protect individual and family capabilities by reducing vulnerabilities and strengthening social protection. While official poverty statistics indicate progress in poverty reduction at the aggregate level, with poverty incidence declining to 16.4 percent in the first semester of 2023, standard measurement approaches may mask significant inequalities in resource allocation within households. This study examines household resource sharing in the Philippines using a collective household model to construct poverty indices that complement official poverty statistics. Using data from the Family Income and Expenditures Survey, we estimate Engel curves for different demographic groups based on assignable good expenditures, including clothing, cereals, and protein-rich foods. We also analyze individual-level food consumption data from the National Nutrition Survey to estimate food poverty using caloric intake. Our findings reveal substantial gender and age-based disparities in resource allocation, with particularly concerning implications for women and children in vulnerable household types. Child poverty rates under our methodology are up to twice as high as suggested by standard measures, while women consistently show higher poverty rates than men, especially in rural areas. Analysis across basic sectors reveals varying patterns of intra-household inequality, with farmers and fisherfolk showing particularly complex disparities between clothing-based and food-based poverty measures. 
While official statistics show poverty rates of 30.0% for farmers and 30.6% for fisherfolk, our adjusted estimates suggest significant variations in poverty rates depending on the choice of assignable good, indicating that standard approaches may misunderstand both the extent and nature of poverty among vulnerable groups. These results suggest the need for more nuanced, sector-sensitive approaches to both poverty measurement and social protection policies that explicitly consider intra-household inequality patterns across different basic sectors.