Dissertations / Theses on the topic 'Conjoined data'
Consult the top 21 dissertations / theses for your research on the topic 'Conjoined data.'
Pierrot, Henri Jan. "Artificial intelligence architectures for classifying conjoined data." Swinburne University of Technology, 2007. http://adt.lib.swin.edu.au./public/adt-VSWT20070426.102059.
Pierrot, Henri Jan. "Artificial intelligence architectures for classifying conjoined data." Australasian Digital Thesis Program, 2007. http://adt.lib.swin.edu.au/public/adt-VSWT20070426.102059/index.html.
Submitted in partial fulfilment of the requirements for the degree of Master of Science (IT), [Information and Communication Technology], Swinburne University of Technology - 2007. Typescript. Includes bibliographical references.
Yuan, Yuan. "Bayesian Conjoint Analyses with Multi-Category Consumer Panel Data." University of Cincinnati / OhioLINK, 2021. http://rave.ohiolink.edu/etdc/view?acc_num=ucin162766827512258.
Wong, Shing-tat. "Disaggregate analyses of stated preference data for capturing parking choice behavior." E-thesis via HKUTO, 2006. http://sunzi.lib.hku.hk/hkuto/record/B36393678.
Natter, Martin, and Markus Feurstein. "Correcting for CBC model bias. A hybrid scanner data - conjoint model." SFB Adaptive Information Systems and Modelling in Economics and Management Science, WU Vienna University of Economics and Business, 2001. http://epub.wu.ac.at/880/1/document.pdf.
Series: Report Series SFB "Adaptive Information Systems and Modelling in Economics and Management Science"
Wong, Shing-tat, and 黃承達. "Disaggregate analyses of stated preference data for capturing parking choice behavior." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2006. http://hub.hku.hk/bib/B36393678.
Goudia, Dalila. "Tatouage conjoint a la compression d'images fixes dans JPEG2000." Thesis, Montpellier 2, 2011. http://www.theses.fr/2011MON20198.
Technological advances in telecommunications and multimedia over the last two decades have given rise to novel image processing services such as copyright protection, data enrichment and information hiding. There is a strong need for low-complexity applications that can perform several image processing services within a single system. In this context, the design of joint systems has attracted researchers in recent years. Data hiding techniques embed an invisible message within a multimedia content by modifying the media data, in such a way that the hidden data is not perceptible to an observer. Digital watermarking is one type of data hiding; the watermark should be resistant to a variety of manipulations called attacks. The purpose of image compression is to represent images with less data in order to save storage costs or transmission time. Compression is generally unavoidable for transmission or storage and is considered one of the most destructive attacks on hidden data. JPEG2000 is the latest ISO/ITU-T standard for still image compression. In this thesis, joint compression and data hiding is investigated in the JPEG2000 framework. Instead of treating data hiding and compression separately, it is beneficial to design the data hiding and compression system jointly; above all, compression is then no longer an attack on the hidden data. The main constraints to consider are the trade-offs between payload, compression bitrate, the distortion induced by inserting the hidden data or watermark and, in the watermarking context, the robustness of the watermarked images. We have proposed several joint JPEG2000 compression and data hiding schemes, two of which are watermarking systems. All the embedding strategies proposed in this work are based on Trellis Coded Quantization (TCQ); we exploit the channel coding properties of TCQ to reliably embed data during the quantization stage of the JPEG2000 Part 2 codec.
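As a much-simplified illustration of quantization-based embedding of this kind: the sketch below is plain quantization index modulation (QIM), not the thesis's TCQ scheme, which selects quantizer subsets along a trellis path with a Viterbi search. All coefficient values and the step size are invented.

```python
import numpy as np

def qim_embed(coeffs, bits, step=4.0):
    """Hide one bit per coefficient by quantizing to an even multiple of
    `step` for bit 0 and an odd multiple for bit 1."""
    out = []
    for c, b in zip(coeffs, bits):
        q = round(c / step)
        if (q % 2) != b:              # shift to the nearest allowed index
            q += 1 if c / step > q else -1
        out.append(q * step)
    return np.array(out)

def qim_extract(coeffs, step=4.0):
    """Recover the hidden bits from the parity of the quantization index."""
    return [int(round(c / step)) % 2 for c in coeffs]

# toy run on four made-up transform coefficients
coeffs = np.array([3.2, -7.9, 12.4, 0.6])
bits = [1, 0, 1, 0]
marked = qim_embed(coeffs, bits)
recovered = qim_extract(marked)
```

The parity trick is the essence of quantizer-subset embedding; TCQ adds channel-coding structure on top so the decoder can recover the bits reliably after compression noise.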
Kim, Hyowon. "Improving Inferences about Preferences in Choice Modeling." The Ohio State University, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=osu1587524882296023.
Fournier, Marie-Cecile. "Pronostic dynamique de l'évolution de l'état de santé de patients atteints d'une maladie chronique." Thesis, Nantes, 2016. http://www.theses.fr/2016NANT1004/document.
For many chronic diseases, the monitoring of patients can be improved by a better understanding of disease progression and the ability to predict the occurrence of major events. Health status evolution can be measured by repeated measurements of a longitudinal marker, such as serum creatinine in renal transplantation. This thesis work in epidemiology and biostatistics applied to renal transplantation focuses on joint models for longitudinal and time-to-event data. These models have various benefits but their use is still uncommon in practice. In a first part, we use this methodology to identify the specific role of risk factors on serum creatinine evolution and/or graft failure risk. We give a rich epidemiological overview and highlight some features which deserve additional attention, as they seem associated with graft failure risk without prior modification of the longitudinal marker, the serum creatinine. In a second part, we focus on dynamic predictions, which can be estimated from a joint model. They are called dynamic because they are updated at each new measurement of the longitudinal marker. The clinical usefulness of this type of prediction has to be evaluated and should be based on good accuracy in terms of discrimination and calibration. To assess prognostic capacities, the Brier score and the ROC curve have already been developed; to complement them and address some of their limitations, we propose an R²-type indicator.
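For readers unfamiliar with the accuracy measures mentioned, the raw (uncensored) Brier score is simply the mean squared difference between predicted event probabilities and observed binary outcomes. This toy sketch ignores the censoring corrections a real survival setting requires, and all numbers are invented.

```python
import numpy as np

def brier_score(predicted_probs, outcomes):
    """Mean squared error between predicted event probabilities
    and binary observed outcomes (1 = event occurred)."""
    p = np.asarray(predicted_probs, dtype=float)
    y = np.asarray(outcomes, dtype=float)
    return float(np.mean((p - y) ** 2))

# hypothetical predictions for five patients at a fixed horizon
score = brier_score([0.1, 0.8, 0.3, 0.9, 0.2], [0, 1, 0, 1, 0])
```

Lower is better: a perfect predictor scores 0, and a maximally wrong one scores 1.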
Ferrer, Loic. "Modélisation et prédiction conjointe de différents risques de progression de cancer à partir des mesures répétées de biomarqueurs." Thesis, Bordeaux, 2017. http://www.theses.fr/2017BORD0875/document.
In longitudinal studies in cancer, a major problem is the description of a patient's disease evolution, or the prediction of his future state, based on repeated measurements of a biological marker. Joint modelling makes it possible to meet these objectives, but it has mainly been developed for the simultaneous study of a Gaussian longitudinal marker and a single event time. In order to characterize the transitions between successive events that a patient may experience, we extend the classical methodology by introducing a joint model for a Gaussian longitudinal process and a non-homogeneous Markovian multi-state process. The model assumes that individual transition times are independent conditionally on the included covariates; we also propose a score test to assess this assumption. These developments are applied to two cohorts of men with localized prostate cancer treated with radiotherapy. The model quantifies the impact of prostate-specific antigen dynamics, and of other prognostic factors measured at the end of treatment, on each transition intensity between predefined clinical states. This thesis then provides statistical tools and guidelines for the computation of individual dynamic predictions of clinical events in the context of competing risks. Finally, a last piece of work leads to a reflection on the joint modelling of longitudinal ordinal data and survival data with an innovative inference technique. To conclude, this work introduces statistical methods adapted to various types of longitudinal and event history data, which meet the needs of clinicians. Methodological recommendations and software tools accompany each development, for practical use by the clinical and statistical communities.
Dantan, Etienne. "Modèles conjoints pour données longitudinales et données de survie incomplètes appliqués à l'étude du vieillissement cognitif." Thesis, Bordeaux 2, 2009. http://www.theses.fr/2009BOR21658/document.
In cognitive ageing studies, older people are highly selected by a risk of death associated with poor cognitive performance. Modelling the natural history of cognitive decline is difficult in the presence of incomplete longitudinal and survival data. Moreover, the unobserved acceleration of cognitive decline that begins before dementia diagnosis is difficult to evaluate. Cognitive decline is highly heterogeneous: there are various patterns associated with different risks of a survival event. The objective is to study joint models for incomplete longitudinal and survival data in order to describe cognitive evolution in older people. Latent variable approaches are used to take non-observed mechanisms, such as heterogeneity and decline acceleration, into account. First, we compare two approaches to handling missing data in longitudinal data analysis. Second, we propose a joint model with a latent state to model cognitive evolution and its pre-dementia acceleration, dementia risk and death risk.
Karimi, Maryam. "Modélisation conjointe de trajectoire socioprofessionnelle individuelle et de la survie globale ou spécifique." Thesis, Université Paris-Saclay (ComUE), 2016. http://www.theses.fr/2016SACLS120/document.
Being in a low socioeconomic position is associated with increased mortality risk from various causes of death. Previous studies have already shown the importance of considering different dimensions of socioeconomic trajectories across the life course. Analysis of professional trajectories constitutes a crucial step towards better understanding the association between socioeconomic position and mortality. The main challenge in measuring this association is to decompose the respective share of these factors in explaining the survival level of individuals. The complexity lies in the bidirectional causality underlying the observed associations: are mortality differentials due to differences in initial health conditions that jointly influence employment status and mortality, or does the professional trajectory directly influence health conditions and hence mortality? Standard methods do not consider the interdependence of changes in occupational status and the bidirectional causal effect underlying the observed association, which leads to substantial bias in estimating the causal link between professional trajectory and mortality. It is therefore necessary to propose statistical methods that consider repeated measurements (careers) and survival variables simultaneously. This study was motivated by the Cosmop-DADS database, a sample of the French salaried population. The first aim of this dissertation was to consider whole professional trajectories and an accurate occupational classification, instead of the limited number of life-course stages and the simple occupational classification considered previously. For this purpose, we defined time-dependent variables to capture different life-course dimensions, namely the critical period, accumulation and social mobility models, and we highlighted the association between professional trajectories and cause-specific mortality using these variables in a Cox proportional hazards model. The second aim was to incorporate the employment episodes in a longitudinal sub-model within the joint model framework, to reduce the bias resulting from the inclusion of internal time-dependent covariates in the Cox model. We proposed a joint model for longitudinal nominal outcomes and competing risks data in a likelihood-based approach. In addition, we proposed an approach mimicking meta-analysis to address the computational problems joint models face on large datasets: extracting independent stratified samples from the large dataset, applying the joint model to each sample and then combining the results. With the same objective, namely fitting joint models on large-scale data, we propose a procedure based on the Poisson regression model, which consists of finding representative trajectories by means of clustering methods and then applying the joint model to these representative trajectories.
Krasnowski, Piotr. "Codage conjoint source-chiffrement-canal pour les canaux de communication vocaux sécurisés en temps réel." Thesis, Université Côte d'Azur, 2021. http://www.theses.fr/2021COAZ4029.
The growing risk of privacy violation and espionage associated with the rapid spread of mobile communications has renewed interest in the original concept of sending encrypted voice as an audio signal over arbitrary voice channels. The usual methods for encrypted data transmission over analog telephony have turned out to be inadequate for modern vocal links (cellular networks, VoIP) equipped with voice compression, voice activity detection, and adaptive noise suppression algorithms. The limited available bandwidth, nonlinear channel distortion, and signal fading motivate the investigation of a dedicated, joint approach to speech encoding and encryption adapted to modern noisy voice channels. This thesis aims to develop, analyze, and validate secure and efficient schemes for real-time speech encryption and transmission via modern voice channels. In addition to speech encryption, this study covers the security and operational aspects of the whole voice communication system, as this is relevant from an industrial perspective. The thesis introduces a joint speech encryption scheme with lossy encoding, which randomly scrambles the vocal parameters of a speech representation (loudness, pitch, timbre) and outputs an encrypted pseudo-voice signal robust against channel noise. The enciphering technique is based on random translations and random rotations using lattices and spherical codes on flat tori. Against transmission errors, the scheme decrypts the vocal parameters approximately and reconstructs a perceptually analogous speech signal with the help of a trained neural voice synthesizer. The experimental setup was validated by sending encrypted pseudo-voice over a real voice channel, and the decrypted speech was tested through subjective quality assessment by a group of about 40 participants. Furthermore, the thesis describes a new technique for sending data over voice channels that relies on short harmonic waveforms representing quaternary codewords. This technique achieves a variable bitrate of up to 6.4 kbps and has been successfully tested over various real voice channels. Finally, the work considers a dedicated cryptographic key exchange protocol over voice channels, authenticated by signatures and vocal verification. The protocol's security has been verified in a symbolic model using Tamarin Prover. The study concludes that secure voice communication over real digital voice channels is technically viable when the voice channels used for communication are stable and introduce distortion in a predictable manner.
Bouhou, Boutayeb. "Recherche conjointe d'ondes gravitationnelles et de neutrino cosmiques de haute énergie avec les détecteurs VIRGO-LIGO et ANTARES." PhD thesis, Université Pierre et Marie Curie - Paris VI, 2012. http://tel.archives-ouvertes.fr/tel-00819985.
Gamatié, Abdoulaye. "Design and Analysis for Multi-Clock and Data-Intensive Applications on Multiprocessor Systems-on-Chip." Habilitation à diriger des recherches, Université des Sciences et Technologie de Lille - Lille I, 2012. http://tel.archives-ouvertes.fr/tel-00756967.
Full textMesanovic, Diana, Dijana Rubil, and Beatrice Rylander. "A Conjoint based study on meat preferences. The effect of Country-of-Origin, Price, Quality and Expiration date on the consumer decision making process." Thesis, Jönköping University, JIBS, EMM (Entrepreneurship, Marketing, Management), 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-11589.
This study examines the importance of Country-of-Origin, price, quality and expiration date in the consumer decision-making process for fresh meat. Country-of-Origin has been investigated before, but earlier research focused on manipulating a single cue. With the recent scandals in the fresh meat industry, in which animals were abused and expiration dates were changed, it is interesting to investigate how important consumers find these four attributes. In order to answer the research questions and fulfil the purpose, the authors use a mix of data collection methods: qualitative data gathered through interviews, and quantitative data gathered through a pilot study and an experiment. The data are analysed using SPSS 17.0 and its conjoint analysis procedure. Country-of-Origin was found to be the attribute consumers weight most heavily when purchasing fresh meat, closely followed by expiration date. Consumers did find price and quality to be of importance, but not as important as Country-of-Origin and expiration date. That Country-of-Origin was the most significant attribute in consumers' decision-making indicates that consumers are ethnocentric in their behaviour, i.e. they consider their own country and culture to be above others, which leads to the purchase of Swedish meat. It was also found that the purchasing process for fresh meat is highly complex, especially given the negative attention the fresh meat industry has attracted.
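The part-worth estimation at the heart of a ratings-based conjoint analysis like the one described can be sketched as an ordinary least-squares fit of attribute-level utilities to profile ratings. The attributes, levels and ratings below are invented for illustration and are not the study's data.

```python
import numpy as np

# Hypothetical 2x2 profiles (country_of_origin, price), dummy-coded
# against a baseline level; ratings are made-up respondent scores.
profiles = [("domestic", "low"), ("domestic", "high"),
            ("imported", "low"), ("imported", "high")]
ratings = np.array([9.0, 7.0, 4.0, 2.0])

# Design matrix: intercept, origin=domestic, price=low
X = np.array([[1.0,
               1.0 if origin == "domestic" else 0.0,
               1.0 if price == "low" else 0.0]
              for origin, price in profiles])

# Least-squares fit of the part-worth utilities
coef, *_ = np.linalg.lstsq(X, ratings, rcond=None)
intercept, pw_origin, pw_price = coef

# Relative importance: each attribute's part-worth range over the total range
ranges = {"origin": abs(pw_origin), "price": abs(pw_price)}
importance = {k: v / sum(ranges.values()) for k, v in ranges.items()}
```

The relative-importance ratio is what lets a study rank attributes such as Country-of-Origin above price in the purchase decision.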
Gaultier, Lucile. "Couplage des observations spatiales dynamiques et biologiques pour la restitution des circulations océaniques : une approche conjointe par assimilation de données altimétriques et de traceurs." PhD thesis, Université de Grenoble, 2013. http://tel.archives-ouvertes.fr/tel-01067698.
Full textLima, Mariana Sarmanho de Oliveira. "A aplicabilidade do gás natural do ponto de vista mercadológico, econômico e ambiental: um estudo para os Estados do Amazonas e de São Paulo." Universidade de São Paulo, 2011. http://www.teses.usp.br/teses/disponiveis/18/18140/tde-27062011-141122/.
Industrial activity is one of the main sources of negative impacts on the environment, given that the use of raw materials from natural resources is indispensable to any production process. To mitigate this problem, it is important to broaden the participation of cleaner energy inputs in the Brazilian energy sector, in order to promote growth based on the proposals of sustainable development. One alternative is natural gas (NG), which became important after the energy crisis of 2000/2001. Compared with some of its energy substitutes, NG raises great expectations in consumer industries, since it can reduce costs, mitigate pollution and secure the level of production without the risk of supply interruption from hydroelectric plants during dry periods. Against this background, this work analyzes the applicability of natural gas from a marketing, economic and environmental point of view. It identifies the attributes that influence the adoption of natural gas as an alternative energy source in the major industrial sectors of the states of São Paulo (SP) and Amazonas (AM), and examines the relative productive efficiency of a set of equipment (boilers and heaters) in the industrial sector, in order to compare the performance of equipment using natural gas with that of equipment using other energy sources. To achieve these objectives, the method combined the critical incident technique (CIT) and conjoint analysis (CA). The critical incident technique provided the attributes considered important for the adoption of this energy source in the industrial sector, and the conjoint analysis determined the utility and relative importance of these attributes in consumers' choices. Additionally, data envelopment analysis (DEA) was used to assess the relative productive efficiency of the equipment studied; the DEA analysis was based on the equipment's cost-effectiveness ratio, with the benefits considered being essentially economic and environmental. This study obtained important results that can help NG user and non-user companies measure more objectively the benefits of applying this energy source in production processes, and enable the government to establish appropriate strategies to encourage gas use after its foreseen production expansion, on account of the recent discoveries of reserves in the pre-salt layers and the operation of the new Urucu-Coari-Manaus gas pipeline.
Majidi, Mohammad Hassan. "Bayesian estimation of discrete signals with local dependencies." Thesis, Supélec, 2014. http://www.theses.fr/2014SUPL0014/document.
The aim of this thesis is to study the problem of data detection in wireless communication systems, for both perfect and imperfect channel state information at the receiver. As is well known, the complexity of MLSE, exponential in the channel memory and in the symbol alphabet cardinality, quickly becomes unmanageable and forces a resort to sub-optimal approaches. We therefore first propose a new iterative equalizer for the case where the channel is unknown at the transmitter and perfectly known at the receiver. This receiver is based on a continuation approach: it approximates the original optimization cost function by a sequence of more tractable functions, thereby reducing the receiver's computational complexity. Second, for data detection over a linear dynamic channel that is unknown at the receiver, the receiver must perform joint equalization and channel estimation. To this end, we formulate a combined state-space representation of the communication system, in which the Kalman filter is the best estimator of the channel parameters. The aim in this part is to motivate rigorously the introduction of the Kalman filter in the estimation of Markov sequences through Gaussian dynamic channels; in doing so, we interpret and clarify the approximations underlying the heuristic approaches. Finally, in the more general case of a nonlinear dynamic channel, the Kalman filter is no longer the best estimator. Here, we use a switching state-space model (SSSM), a nonlinear state-space model that combines a hidden Markov model (HMM) with a linear state-space model (LSSM). For channel estimation and data detection, the expectation-maximization (EM) procedure is the natural approach; in this way, the extended Kalman filter (EKF) and particle filters are avoided.
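The role of the Kalman filter in such a state-space formulation can be sketched with a toy scalar example. The AR(1) channel model, the parameter values and the pilot-based setup below are illustrative assumptions, not the thesis's actual system.

```python
import numpy as np

def kalman_channel_estimate(symbols, observations, a=0.99, q=0.01, r=0.1):
    """Track a scalar AR(1) channel gain h[k] = a*h[k-1] + w[k]
    from observations y[k] = h[k]*s[k] + v[k] (known pilot symbols s)."""
    h_est, p = 0.0, 1.0              # state estimate and its variance
    estimates = []
    for s, y in zip(symbols, observations):
        # predict step: propagate the state through the AR(1) dynamics
        h_pred = a * h_est
        p_pred = a * a * p + q
        # update step: correct with the new observation y = h*s + noise
        innov = y - h_pred * s
        s_var = s * s * p_pred + r
        k_gain = p_pred * s / s_var
        h_est = h_pred + k_gain * innov
        p = (1.0 - k_gain * s) * p_pred
        estimates.append(h_est)
    return np.array(estimates)

# toy run: a constant channel gain observed through noisy BPSK pilots
rng = np.random.default_rng(0)
true_h = 1.0
symbols = rng.choice([-1.0, 1.0], size=200)
obs = true_h * symbols + 0.1 * rng.standard_normal(200)
est = kalman_channel_estimate(symbols, obs)
```

With the channel estimate in hand, a receiver can equalize the received symbols; the thesis's joint approach interleaves this estimation with data detection rather than assuming known pilots throughout.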
Lin, Tse-Ju, and 林則如. "Data-driven Handwriting Synthesis with Conjoined Manner." Thesis, 2014. http://ndltd.ncl.edu.tw/handle/70128227713746511044.
National Taiwan University
Graduate Institute of Computer Science and Information Engineering
Academic year 102 (2013)
A person's handwriting varies within a typical range, and the shapes of handwritten characters also show complex interaction with their nearby neighbours. This makes the automatic synthesis of handwritten characters and paragraphs very challenging. In this paper, we propose a method for synthesizing handwritten text according to a writer's handwriting style. The synthesis algorithm is composed of two phases. First, we create shape models for different characters based on one writer's data. Then, we compute the cursive probability to decide whether each pair of neighbouring characters should be conjoined. By jointly modelling the handwriting style and the conjoining property through a novel trajectory optimization, final handwritten words can be synthesized from a set of collected samples. Furthermore, the paragraph layouts are automatically generated and adjusted according to the writer's style obtained from the same dataset. We demonstrate that our method can successfully synthesize an entire paragraph that imitates a writer's handwriting using his or her collected handwriting samples.
Pirot, Dorian. "Reconstruction des structures magnéto-convectives solaires sous une région active, par l’utilisation conjointe d’un modèle de convection anélastique et d’une méthode d’assimilation de données." Thèse, 2012. http://hdl.handle.net/1866/8662.
We use a data assimilation technique, together with an anelastic convection model, in order to reconstruct the convective patterns below a solar active region. Our results yield information about the emergence of the magnetic field through the convective zone and the mechanisms of active region formation. The solar data we use are taken from the MDI instrument on board the SOHO space observatory on 14 July 2000, for the so-called "Bastille Day event", a solar flare followed by a coronal mass ejection. The assimilated data (magnetograms, temperature maps and vertical velocity maps) cover an area of 175 Mm × 175 Mm at the photospheric level. The data assimilation technique we use, "Nudging Back and Forth" (NBF), is a Newtonian relaxation technique similar to the "quasi-linear inverse 3D" method. Such a technique does not require computation of the adjoint equations, and this simplicity is a numerical advantage. Our study shows, with a simple test case, the applicability of this method to a convection model treated in the anelastic approximation. We show the efficiency of the NBF technique and detail its potential for solar data assimilation. In addition, to ensure the mathematical uniqueness of the obtained solution, a regularization is imposed over the whole simulation domain; this is a new approach. Finally, we show that the interest of such a technique is not limited to the reconstruction of convective patterns: it also allows optimal interpolation of photospheric magnetograms and prediction.