Academic literature on the topic 'Predictive exposure models'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Predictive exposure models.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Predictive exposure models"

1

Sheh Rahman, Shaesta Khan, Noraziah Adzhar, and Nazri Ahmad Zamani. "Comparative Analysis of Machine Learning Models to Predict Common Vulnerabilities and Exposure." Malaysian Journal of Fundamental and Applied Sciences 20, no. 6 (December 16, 2024): 1410–19. https://doi.org/10.11113/mjfas.v20n6.3822.

Abstract:
Predicting Common Vulnerabilities and Exposures (CVE) is a challenging task due to the increasing complexity of cyberattacks and the vast amount of threat data available. Effective prediction models are crucial for enabling cybersecurity teams to respond quickly and prevent potential exploits. This study provides a comparative analysis of machine learning techniques for CVE prediction to enhance proactive vulnerability management and strengthen cybersecurity practices. A supervised machine learning model, Gaussian Naive Bayes, and unsupervised clustering models, K-means and DBSCAN, were employed for the predictive modelling. The performance of these models was compared using metrics such as accuracy, precision, recall, and F1-score. Among them, Gaussian Naive Bayes achieved an accuracy of 99.79% and outperformed the clustering-based models in determining the class labels of the data it was trained and tested on. The outcome of this study provides a proof of concept to Cybersecurity Malaysia, offering insights into the CVE model.
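As a rough, self-contained sketch of the kind of comparison this abstract describes, the snippet below trains a Gaussian Naive Bayes classifier and a K-means clustering baseline (DBSCAN is omitted for brevity) and scores both with accuracy, precision, recall, and F1; the synthetic data and all variable names are placeholders, not the study's CVE dataset.

```python
# Illustrative sketch only: supervised Gaussian Naive Bayes vs. clustering used
# as a crude classifier, scored with the metrics named in the abstract.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.cluster import KMeans
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

X, y = make_classification(n_samples=2000, n_features=20, n_informative=8,
                           n_classes=2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Supervised model
nb = GaussianNB().fit(X_train, y_train)
nb_pred = nb.predict(X_test)

# Clustering turned into a classifier: map each cluster to its majority class
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X_train)
cluster_to_class = {c: np.bincount(y_train[km.labels_ == c]).argmax() for c in range(2)}
km_pred = np.array([cluster_to_class[c] for c in km.predict(X_test)])

for name, pred in [("GaussianNB", nb_pred), ("KMeans", km_pred)]:
    print(name,
          "acc=%.3f" % accuracy_score(y_test, pred),
          "prec=%.3f" % precision_score(y_test, pred),
          "rec=%.3f" % recall_score(y_test, pred),
          "f1=%.3f" % f1_score(y_test, pred))
```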
2

Soo, Jhy-Charm, Perng-Jy Tsai, Shih-Chuan Lee, Shih-Yi Lu, Cheng-Ping Chang, Yuh-When Liou, and Tung-Sheng Shih. "Establishing aerosol exposure predictive models based on vibration measurements." Journal of Hazardous Materials 178, no. 1-3 (June 2010): 306–11. http://dx.doi.org/10.1016/j.jhazmat.2010.01.079.

3

Zhang, Ying, Cheng Zhao, Yu Lei, Qilin Li, Hui Jin, and Qianjin Lu. "Development of a predictive model for systemic lupus erythematosus incidence risk based on environmental exposure factors." Lupus Science & Medicine 11, no. 2 (November 2024): e001311. http://dx.doi.org/10.1136/lupus-2024-001311.

Abstract:
Objective: Systemic lupus erythematosus (SLE) is an autoimmune disease characterised by a loss of immune tolerance, affecting multiple organs and significantly impairing patients’ health and quality of life. While hereditary elements are essential in the onset of SLE, external environmental influences are also significant. Currently, there are few predictive models for SLE that take into account the impact of occupational and living environmental exposures. Therefore, we collected basic information, occupational background and living environmental exposure data from patients with SLE to construct a predictive model that facilitates easier intervention. Methods: We conducted a study comparing 316 individuals diagnosed with SLE and 851 healthy volunteers in a case–control design, collecting their basic information, occupational exposure history and environmental exposure data. Subjects were randomly allocated into training and validation groups using a 70/30 split. Using three feature-selection methods, we constructed four predictive models with multivariate logistic regression. Model performance and clinical utility were evaluated via receiver operating characteristic, calibration and decision curves. Leave-one-out cross-validation further validated the models. The best model was used to create a dynamic nomogram, visually representing the predicted relative risk of SLE onset. Results: The ForestMDG model demonstrated strong predictive ability, with an area under the curve of 0.903 (95% CI 0.880 to 0.925) in the training set and 0.851 (95% CI 0.809 to 0.894) in the validation set, as indicated by model performance evaluation. Calibration and decision curves demonstrated accurate results along with practical clinical value. Leave-one-out cross-validation confirmed that the ForestMDG model had the best accuracy (0.8338). Finally, we developed a dynamic nomogram for practical use, which is accessible via the following link: https://yingzhang99321.shinyapps.io/dynnomapp/. Conclusion: We created a user-friendly dynamic nomogram for predicting the relative risk of SLE onset based on occupational and living environmental exposures. Trial registration number: ChiCTR2000038187.
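The following is a minimal, hedged illustration of the workflow summarised above — multivariate logistic regression on exposure covariates, a 70/30 train/validation split, and AUC as the headline metric. The feature names and synthetic outcome are invented for the example and do not reproduce the study's variables or its ForestMDG feature selection.

```python
# Sketch only: logistic regression risk model with a 70/30 split and AUC scoring.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 1167  # 316 cases + 851 controls, as in the study
df = pd.DataFrame({
    "occupational_dust": rng.integers(0, 2, n),   # hypothetical exposure covariates
    "damp_housing": rng.integers(0, 2, n),
    "sun_exposure_hours": rng.normal(2.0, 1.0, n),
    "smoking": rng.integers(0, 2, n),
})
# Synthetic outcome loosely tied to the covariates, for illustration only
logit = -1.5 + 0.8 * df["occupational_dust"] + 0.6 * df["damp_housing"] + 0.4 * df["smoking"]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X_train, X_val, y_train, y_val = train_test_split(df, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("validation AUC:", roc_auc_score(y_val, model.predict_proba(X_val)[:, 1]))
```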
4

Aronoff-Spencer, Eliah, Sepideh Mazrouee, Rishi Graham, Mark S. Handcock, Kevin Nguyen, Camille Nebeker, Mohsen Malekinejad, and Christopher A. Longhurst. "Exposure notification system activity as a leading indicator for SARS-COV-2 caseload forecasting." PLOS ONE 18, no. 8 (August 18, 2023): e0287368. http://dx.doi.org/10.1371/journal.pone.0287368.

Abstract:
Purpose: Digital methods to augment traditional contact tracing approaches were developed and deployed globally during the COVID-19 pandemic. These “Exposure Notification (EN)” systems present new opportunities to support public health interventions. To date, there have been attempts to model the impact of such systems, yet no reports have explored the value of real-time system data for predictive epidemiological modeling. Methods: We investigated the potential to short-term forecast COVID-19 caseloads using data from California’s implementation of the Google Apple Exposure Notification (GAEN) platform, branded as CA Notify. CA Notify is a digital public health intervention leveraging residents’ smartphones for anonymous EN. We extended a published statistical model that uses prior case counts to investigate the possibility of predicting short-term future case counts and then added EN activity to test for improved forecast performance. Additional predictive value was assessed by comparing the pandemic forecasting models with and without EN activity to the actual reported caseloads from 1–7 days in the future. Results: Observation of the time series presents noticeable evidence for temporal association of system activity and caseloads. Incorporating earlier ENs in our model improved prediction of the caseload counts. Using Bayesian inference, we found nonzero influence of EN terms with probability one. Furthermore, we found a reduction in both the mean absolute percentage error and the mean squared prediction error, the latter of at least 5% and up to 32% when using ENs over the model without. Conclusions: This preliminary investigation suggests smartphone-based ENs can significantly improve the accuracy of short-term forecasting. These predictive models can be readily deployed as local early warning systems to triage resources and interventions.
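A hedged sketch of the forecasting comparison reported above: a simple lagged regression of future case counts on prior cases, fitted with and without an exposure-notification (EN) activity term and compared by MAPE. The simulated series and the 5-day EN lag are assumptions for illustration; the actual study used a Bayesian extension of a published model on CA Notify data.

```python
# Sketch: does adding a lagged EN-activity feature improve 1-day-ahead forecasts?
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
days = 200
en = rng.poisson(50, days).astype(float)                      # EN activity per day
cases = 100 + 0.8 * np.roll(en, 5) + rng.normal(0, 5, days)   # cases lag EN by ~5 days

def build(cases_only):
    X, y = [], []
    for t in range(7, days - 1):
        feats = [cases[t - 1], cases[t - 7]]
        if not cases_only:
            feats.append(en[t - 5])
        X.append(feats)
        y.append(cases[t + 1])          # 1-day-ahead target
    return np.array(X), np.array(y)

for label, cases_only in [("cases only", True), ("cases + EN", False)]:
    X, y = build(cases_only)
    split = int(0.8 * len(X))
    model = LinearRegression().fit(X[:split], y[:split])
    pred = model.predict(X[split:])
    mape = np.mean(np.abs((y[split:] - pred) / y[split:])) * 100
    print(f"{label}: MAPE = {mape:.1f}%")
```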
5

Hosein, Roland, Paul Corey, Frances Silverman, Anthony Ayiomamitis, R. Bruce Urch, and Neil Alexis. "Predictive Models Based on Personal, Indoor and Outdoor Air Pollution Exposure." Indoor Air 1, no. 4 (December 1991): 457–64. http://dx.doi.org/10.1111/j.1600-0668.1991.00010.x.

6

Wei, Chih-Chiang, and Wei-Jen Kao. "Establishing a Real-Time Prediction System for Fine Particulate Matter Concentration Using Machine-Learning Models." Atmosphere 14, no. 12 (December 13, 2023): 1817. http://dx.doi.org/10.3390/atmos14121817.

Abstract:
With the rapid urbanization and industrialization in Taiwan, pollutants generated from industrial processes, coal combustion, and vehicle emissions have led to severe air pollution issues. This study focuses on predicting the fine particulate matter (PM2.5) concentration. This enables individuals to be aware of their immediate surroundings in advance, reducing their exposure to high concentrations of fine particulate matter. The research area includes Keelung City and Xizhi District in New Taipei City, located in northern Taiwan. This study establishes five fine prediction models based on machine-learning algorithms, namely, the deep neural network (DNN), M5’ decision tree algorithm (M5P), M5’ rules decision tree algorithm (M5Rules), alternating model tree (AMT), and multiple linear regression (MLR). Based on the predictive results from these five models, the study evaluates the optimal model for forecast horizons and proposes a real-time PM2.5 concentration prediction system by integrating various models. The results demonstrate that the prediction errors vary across different models at different forecast horizons, with no single model consistently outperforming the others. Therefore, the establishment of a hybrid prediction system proves to be more accurate in predicting future PM2.5 concentration compared to a single model. To assess the practicality of the system, the study process involved simulating data, with a particular focus on the winter season when high PM2.5 concentrations are prevalent. The predictive system generated excellent results, even though errors increased in long-term predictions. The system can promptly adjust its predictions over time, effectively forecasting the PM2.5 concentration for the next 12 h.
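As an illustrative sketch of the hybrid idea described above, the snippet below trains two simple regressors per forecast horizon and keeps whichever has the lowest validation error at each horizon. The synthetic PM2.5 series and the two candidate models are stand-ins, not the paper's DNN/M5P/M5Rules/AMT/MLR setup.

```python
# Sketch: per-horizon model selection for a hybrid PM2.5 forecasting system.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(2)
hours = 1000
pm25 = 20 + 10 * np.sin(np.arange(hours) / 24 * 2 * np.pi) + rng.normal(0, 3, hours)

def make_xy(horizon, lags=6):
    # features: the last `lags` hourly values; target: concentration `horizon` hours ahead
    X, y = [], []
    for t in range(lags - 1, hours - horizon):
        X.append(pm25[t - lags + 1:t + 1])
        y.append(pm25[t + horizon])
    return np.array(X), np.array(y)

for horizon in (1, 6, 12):
    X, y = make_xy(horizon)
    split = int(0.8 * len(X))
    candidates = {
        "MLR": LinearRegression(),
        "small NN": MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
    }
    errors = {}
    for name, model in candidates.items():
        model.fit(X[:split], y[:split])
        errors[name] = mean_absolute_error(y[split:], model.predict(X[split:]))
    best = min(errors, key=errors.get)
    print(f"t+{horizon}h -> best model: {best} (MAE {errors[best]:.2f})")
```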
7

Gomah, Mohamed Elgharib, Guichen Li, Naseer Muhammad Khan, Changlun Sun, Jiahui Xu, Ahmed A. Omar, Baha G. Mousa, Marzouk Mohamed Aly Abdelhamid, and Mohamed M. Zaki. "Prediction of Strength Parameters of Thermally Treated Egyptian Granodiorite Using Multivariate Statistics and Machine Learning Techniques." Mathematics 10, no. 23 (November 30, 2022): 4523. http://dx.doi.org/10.3390/math10234523.

Abstract:
The mechanical properties of rocks, such as uniaxial compressive strength and elastic modulus of intact rock, must be determined before any engineering project by employing lab or in situ tests. However, there are some circumstances where it is impossible to prepare the necessary specimens after exposure to high temperatures. Therefore, the propensity to estimate the destructive parameters of thermally heated rocks based on non-destructive factors is a helpful research field. Egyptian granodiorite samples were heated to temperatures of up to 800 °C before being treated to two different cooling methods: via the oven (slow-cooling) and using water (rapid cooling). The cooling condition, temperature, mass, porosity, absorption, dry density (D), and P-waves were used as input parameters in the predictive models for the UCS and E of thermally treated Egyptian granodiorite. Multi-linear regression (MLR), random forest (RF), k-nearest neighbor (KNN), and artificial neural networks (ANNs) were used to create predictive models. The performance of each prediction model was also evaluated using the (R2), (RMSE), (MAPE), and (VAF). The findings revealed that cooling methods and mass as input parameters to predict UCS and E have a minor impact on prediction models. In contrast, the other parameters had a good relationship with UCS and E. Due to severe damage to granodiorite samples, many input and output parameters were impossible to measure after 600 °C. The prediction models were thus developed up to this threshold temperature. Furthermore, the comparative analysis of predictive models demonstrated that the ANN pattern for predicting the UCS and E is the most accurate model, with R2 of 0.99, MAPE of 0.25%, VAF of 97.22%, and RMSE of 2.04.
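The four performance measures quoted above (R2, RMSE, MAPE, VAF) are standard; a small helper showing how they are commonly computed is given below, with invented measured/predicted values standing in for UCS or E data.

```python
# Common regression performance measures used in the abstract above.
import numpy as np

def r2(y_true, y_pred):
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1 - ss_res / ss_tot

def rmse(y_true, y_pred):
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

def mape(y_true, y_pred):
    return np.mean(np.abs((y_true - y_pred) / y_true)) * 100   # percent

def vaf(y_true, y_pred):
    # variance accounted for, in percent
    return (1 - np.var(y_true - y_pred) / np.var(y_true)) * 100

y_true = np.array([80.0, 95.0, 110.0, 60.0, 45.0])   # e.g. measured UCS, MPa (invented)
y_pred = np.array([78.0, 97.0, 108.0, 63.0, 47.0])   # model output (invented)
print(r2(y_true, y_pred), rmse(y_true, y_pred), mape(y_true, y_pred), vaf(y_true, y_pred))
```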
8

Symanski, E., L. L. Kupper, I. Hertz-Picciotto, and S. M. Rappaport. "Comprehensive evaluation of long-term trends in occupational exposure: Part 2. Predictive models for declining exposures." Occupational and Environmental Medicine 55, no. 5 (May 1, 1998): 310–16. http://dx.doi.org/10.1136/oem.55.5.310.

9

Moon, H., and M. Cong. "Predictive models of cytotoxicity as mediated by exposure to chemicals or drugs." SAR and QSAR in Environmental Research 27, no. 6 (June 2, 2016): 455–68. http://dx.doi.org/10.1080/1062936x.2016.1208272.

10

Fu, Siheng. "Comparative Analysis of Expected Goals Models: Evaluating Predictive Accuracy and Feature Importance in European Soccer." Applied and Computational Engineering 117, no. 1 (December 19, 2024): 1–10. https://doi.org/10.54254/2755-2721/2024.18300.

Abstract:
Expected Goals (xG) is a widely used metric in soccer analytics that estimates the probability of a shot resulting in a goal based on various characteristics of the shot. This study compares the predictive accuracy and feature importance of two prominent xG models: Opta and Understat. Using data from the top five European leagues from the 2017-2018 to the 2023-2024 seasons, we evaluate the predictive accuracy of each model using L1 and L2 loss metrics. Our findings indicate that Understat outperforms Opta in terms of lower prediction errors in the Bundesliga, Premier League, and Serie A, while Opta yields more stable predictions in La Liga and Ligue 1. We further analyze the factors influencing xG predictions through feature importance techniques using Random Forest and XGBoost models, complemented by SHAP (SHapley Additive exPlanations) analysis. Results reveal that goal exposure angle, shooting angle, and shot distance are key features in predicting goal probability, with differences in how categorical variables are weighted between the models. The study concludes with a discussion of the strengths, limitations, and league-specific applications of both models, highlighting the need for standardized data collection practices and expanded contextual features to enhance xG model utility and accuracy.
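A minimal sketch of the loss comparison used above: mean L1 and L2 losses of two shot-level xG estimates against observed goal outcomes. The arrays are invented and do not come from Opta or Understat.

```python
# Compare two xG providers by mean absolute (L1) and mean squared (L2) loss.
import numpy as np

goals = np.array([0, 1, 0, 0, 1, 0, 0, 0, 1, 0])                              # shot outcomes
xg_a  = np.array([0.05, 0.6, 0.1, 0.2, 0.4, 0.03, 0.08, 0.3, 0.7, 0.15])      # provider A (invented)
xg_b  = np.array([0.10, 0.5, 0.2, 0.1, 0.5, 0.05, 0.10, 0.2, 0.6, 0.20])      # provider B (invented)

for name, xg in [("model A", xg_a), ("model B", xg_b)]:
    l1 = np.mean(np.abs(goals - xg))
    l2 = np.mean((goals - xg) ** 2)
    print(f"{name}: L1 = {l1:.3f}, L2 = {l2:.3f}")
```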

Dissertations / Theses on the topic "Predictive exposure models"

1

Bresson, Morgane. "Quelles stratégies de prévention primaire peuvent-elles être envisagées pour prévenir les risques liés aux pesticides, en France." Electronic Thesis or Diss., Normandie, 2024. http://www.theses.fr/2024NORMC423.

Abstract:
Pesticide exposure increases the risk of some long-term diseases among farmers. Prevention currently relies on the application of “good agricultural practices”, which are poorly defined and far from farmers’ usual practices. Our aim was to contribute to prevention in France by adopting a dual systemic and individual approach, aimed at improving consideration of farmers’ actual exposures and proposing appropriate solutions. The first part of this thesis studied the conservative character of occupational exposure prediction models by comparing exposures measured in various usual working contexts with those calculated by the models. Regulatory models underestimate the exposure of agricultural operators, particularly in fruit growing, green spaces and field crops, by overestimating the effectiveness of personal protective equipment and neglecting some exposure determinants. For re-entry/harvest workers, exposure after several days is also underestimated. In the second part, following a diagnosis of farmers’ preventive practices, a multi-component intervention was developed, based in particular on psychosocial theories and designed to influence behaviour, as an alternative to standard Certiphyto training. Farmers do not always adopt preventive practices despite their knowledge of the risks, due to perceived barriers, social norms and self-efficacy. An intervention has been designed, including practical demonstrations, a peer trainer and processes of commitment and social norm change. Its effectiveness will be assessed by objective (urinary exposures) and self-reported (behaviours, psychosocial perceptions) criteria. This thesis proposes to integrate farmers’ actual exposures more closely into prevention, both in regulatory processes and in training, to encourage the adoption of protective practices. We need to continue our efforts to reach other highly exposed but poorly trained workers, and adopt a multidisciplinary and comprehensive approach to reducing pesticide risks.
2

Nethery, Elizabeth Michel Kennedy. "From measures to models : predicting exposure to air pollution among pregnant women." Thesis, University of British Columbia, 2007. http://hdl.handle.net/2429/32150.

Abstract:
Introduction: Exposure assessment is a key challenge in environmental epidemiology. When modeling exposures for populations, one should consider (1) the applicability of the exposure model to the health effect of interest (i.e. chronic, acute), (2) the applicability of the model to the population of interest, (3) the extent to which modeled exposures account for individual factors and (4) the sources of variability within the model. Epidemiological studies of traffic-related air pollution and birth outcomes have used a variety of exposure models to estimate exposures for pregnant women. These models are rarely evaluated, let alone specifically for pregnant women. Methods: Measured and modeled personal exposures to air pollutants (nitric oxide: NO, nitrogen dioxide: NO₂, filter absorbance and fine particles: PM₂.₅) were obtained for 62 pregnant women from 2005-2006 in Vancouver, Canada. Exposures were measured for 48 hours, 1-3 times over the pregnancy. Mobility was assessed using Global Positioning System monitoring and self-reported activity logs; individual factors (dwelling characteristics, socio-economic factors) were assessed using questionnaires. Results: Modeled home concentrations using a traffic-based land-use regression model were moderately predictive of personal samples for NO only (Pearson's r=0.49). Models for NO including home and work locations explained more between subject variance than using home only (4% home only, 20% with home and work). Modeled exposures using ambient monitoring stations were predictive of personal samples for NO (Pearson's r=0.54), absorbance (r=0.29) and PM₂.₅ (r=0.12) mainly due to temporal correlations (within subject variance: NO = 37%, absorbance = 11%, PM₂.₅ = 9%). Home gas stove was an important determinant of personal exposure for all pollutants. There was a significant (1 hour/day/trimester) increase in time spent at home with increased trimester of pregnancy. Conclusions: In this evaluation, based upon repeated 48-hour exposure measurements, models currently used in air pollution studies were moderately reflective of personal exposures, depending on the specific pollutant and model. Land-use regression shows promise for capturing spatial variability, especially when including mobility (work or school locations) in exposures, whereas monitor-based models are better for capturing temporal variability. Future models should include mobility, where possible, and consider the implications of increasing time at home over pregnancy in assessing exposures for pregnant women.
Faculty of Medicine, School of Population and Public Health (SPPH), Graduate.
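As a small illustration of the model-evaluation step described in the abstract above, the snippet below computes the Pearson correlation between land-use-regression (LUR) modelled home concentrations and measured personal samples; the NO values are invented.

```python
# Sketch: agreement between modelled and measured personal exposure (Pearson r).
import numpy as np
from scipy.stats import pearsonr

personal_no = np.array([18.2, 25.4, 12.1, 30.8, 22.5, 15.9, 27.3, 19.6])  # ppb, measured (invented)
lur_home_no = np.array([15.0, 22.1, 14.3, 26.7, 20.2, 13.8, 23.5, 21.0])  # ppb, modelled (invented)

r, p = pearsonr(lur_home_no, personal_no)
print(f"Pearson r = {r:.2f} (p = {p:.3f})")
```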
3

Tang, Chia-Hsi. "Development of Satellite-Based Emission Inventories and Indoor Exposure Prediction Models for PM2.5." Thesis, Harvard University, 2016. http://nrs.harvard.edu/urn-3:HUL.InstRepos:32644532.

Abstract:
Epidemiological studies have documented significant relationships between outdoor particle exposure and adverse health effects, whilst indoor residential PM2.5 exposure is found to be the most influential on total PM2.5 exposure as people spend more than 80% of their time indoors. Accurate exposure assessments for ambient and indoor environments are therefore of equal importance. In this dissertation, we aim to develop methodologies that enhance our ability to quantify fine particulate matter (PM2.5) exposure from both macro and micro perspectives. With advanced remote sensing technologies becoming more prevalent and less expensive, there is great potential in employing satellite data to analyze and illustrate ambient air quality in real-time over large geographic areas. In Chapter 1, we introduced a top-down approach to construct a PM2.5 emission inventory through the integration of mass balance and satellite-retrieved daily Aerosol Optical Depth (AOD) at 1km x 1km resolution. The satellite-based inventory provides spatially- and temporally-resolved emission estimates as opposed to the conventional source-oriented inventory that has time lag issues with limited spatial variability to the extent of its data source (usually a ground monitoring network). Subsequently, in Chapter 2, we quantified the temporal and spatial trends of PM2.5 emission in the North East U.S. using the satellite-based emission inventory. Satellite-based emission trends are in agreement with those of the source-oriented inventories released by the US EPA, showing major reductions achieved in urban areas as well as along important traffic corridors. The technique of this part of the study can be applied to nation-wide or global remote sensing data for estimating PM2.5 emissions and hence improving the quantification of fine particle effects on climate, air quality and human health. While big data may be the game changer for resolving ambient air quality problems, we still face the challenge of data scarcity in microenvironments. In Chapter 3, prediction models that utilize few available samples to assess indoor PM2.5 exposure were developed. We estimated the infiltration rate of ambient particles penetrating indoors using sulfur as the tracer element at 95 residences in the Greater Boston Area. A mixed effects model was employed in order to predict infiltration for individual residences. We then estimate indoor levels of PM2.5 and its black carbon (BC) content using outdoor measurements and the estimated infiltration factor. We cross-validated the aforementioned models to evaluate their predictive power specifically on dates without indoor information. Cross-validation results of the infiltration model (R2=0.89) indicate that mixed effects captured infiltration rates for individual households adequately. We also found strong predictability when the sulfur infiltration surrogate and outdoor measurements of PM2.5 and BC were used in predicting indoor exposure levels (R2= 0.79 [PM2.5], 0.76 [BC]). Altogether, the methodologies introduced in this dissertation may serve as frameworks to (1) quantify and illustrate ambient emission of PM2.5 or other pollutants from a macro perspective and (2) determine the relationships between outdoor and indoor air quality and predict indoor air pollution, which is critical information for developing solutions to micro-level air quality problems.
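A hedged sketch of the Chapter 3 infiltration idea summarised above: with sulfur as a tracer of ambient particles, the indoor/outdoor sulfur regression slope approximates the infiltration factor, which is then applied to outdoor PM2.5. A single-home ordinary regression stands in for the dissertation's mixed effects model, and all values are invented.

```python
# Sketch: estimate an infiltration factor from indoor/outdoor sulfur, then use it
# to predict the ambient-derived share of indoor PM2.5.
import numpy as np
from sklearn.linear_model import LinearRegression

outdoor_s = np.array([0.8, 1.2, 0.6, 1.5, 1.0, 0.9, 1.3])   # sulfur outdoors (invented units)
indoor_s  = np.array([0.5, 0.8, 0.4, 0.9, 0.7, 0.6, 0.8])   # sulfur indoors (invented)

fit = LinearRegression(fit_intercept=False).fit(outdoor_s.reshape(-1, 1), indoor_s)
f_inf = fit.coef_[0]                        # infiltration factor for this residence
print(f"estimated Finf = {f_inf:.2f}")

outdoor_pm25 = 12.0                         # measured outdoor PM2.5 (invented)
indoor_ambient_pm25 = f_inf * outdoor_pm25  # predicted indoor PM2.5 of outdoor origin
print(f"predicted indoor PM2.5 of outdoor origin = {indoor_ambient_pm25:.1f}")
```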
4

Zhu, Zheng. "A Unified Exposure Prediction Approach for Multivariate Spatial Data: From Predictions to Health Analysis." University of Cincinnati / OhioLINK, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=ucin155437434818942.

5

Vuong, Kylie. "Transforming melanoma prevention: The development, validation and efficacy of model-generated risk predictions in Australian primary care." Thesis, The University of Sydney, 2017. http://hdl.handle.net/2123/17876.

Abstract:
Personalised model-generated risk predictions that incorporate several risk factors may motivate people to increase sun protection. The aim of the thesis is to evaluate and build the quality of the evidence for melanoma risk prediction models, contribute to knowledge of melanoma risk factors for inclusion in risk prediction models, and evaluate their effectiveness as preventive tools in clinical practice. Chapter 1 presents an overview of the epidemiology and role of melanoma risk prediction models in prevention. Chapter 2 presents the results of a systematic review of melanoma risk prediction models. The systematic review identified 28 melanoma risk prediction models. However, there was limited reporting of model development and performance measures, and few studies were externally validated or prospectively evaluated in clinical settings. Chapter 3 evaluates occupational sun exposure and melanoma risk to improve understanding of whether this risk factor should be considered for inclusion in risk prediction models by use of two population-based case-control studies. There was no association between occupational sun exposure and melanoma risk overall or according to anatomical site. Chapters 4 and 5 present the development and validation of two melanoma risk prediction models, one using self-assessed risk factors and the other using clinically assessed risk factors. Chapter 6 presents the results of a pragmatic randomised controlled trial, in which 272 Australian general practice patients were randomly allocated to receive (1) real-time personalised model-generated risk predictions based on self-assessed risk factors and tailored prevention advice, or (2) generic prevention advice. There were no statistically significant differences between intervention and control patients in sun protection practices (p=0.13). However, average-risk patients in the intervention group appeared to show greater sun protection at 6 weeks (mean difference=0.23, on a scale of 1 to 5; 95% confidence interval: 0.01 to 0.45; p=0.04). This thesis adds high-quality evidence relevant to the prevention of melanoma, from the development and validation of model-generated risk predictions to their implementation and efficacy in clinical practice, and is likely to have an impact on preventative care in Australia and internationally.
6

Abbassi, Maggie Magdi. "Drug Milk to Serum Ratio Prediction and Ontogeny of CYP3A Clearance Pathway as a Model of Drug Exposure in the Developing Rat." UKnowledge, 2007. http://uknowledge.uky.edu/gradschool_diss/532.

Abstract:
Transfer of drugs into milk and the clearance of drugs in neonates are critical determinants of the exposure of infants to drugs in breast milk. Models predicting both parameters have been proposed. The objective of this dissertation is to test two models predicting milk to serum ratio and an ontogeny clearance model predicting clearance in the neonate. Predicted milk to serum ratio (M/S) values were generated according to the Atkinson and Begg model. The model did not adequately predict M/S when comparing the predicted values to observed values in the literature. The Fleishaker model was also tested. The model was able to predict whether the drugs appeared in milk by passive diffusion only or whether active transport processes were involved. This model, together with appropriate animal models, is useful in understanding the mechanism of drug transfer into milk. An ontogeny model that predicts clearance was proposed earlier by our laboratory. In order to test the model prediction and assumptions of constant microsomal protein and constant Km for an enzyme-substrate system with age, the male rat was used as an animal model. The ontogeny of Cyp3a1, Cyp3a2, Mdr1a and Mdr1b mRNA was examined in the male rat liver and intestine. The ontogeny pattern of Cyp3a2 mRNA, protein and in vitro Cyp3a activity were found to be similar in male rat liver. The microsomal protein content was found to vary with age in the liver. Km was found to be constant with age for the midazolam 4-hydroxylation by male rat liver microsomes. Scaling factors that extrapolate adult clearance to infant clearance were calculated from in vitro data. The model did not predict the in vivo oral clearance of midazolam for day 7 and 21 age groups from the 112 day age group (adult). The assumption that intestinal availability in the rat pups and adults was equal to unity might not be true resulting in overprediction of rat pup clearance when compared to the adult. Intestinal first pass effect for midazolam in adult rats might be significant. More experiments are needed to further test the model adequacy in clearance prediction.
7

Bourdon, Julie A. "Use of Systems Biology in Deciphering Mode of Action and Predicting Potentially Adverse Health Outcomes of Nanoparticle Exposure, Using Carbon Black as a Model." Thèse, Université d'Ottawa / University of Ottawa, 2012. http://hdl.handle.net/10393/23105.

Abstract:
Nanoparticles (particles less than 100 nm in at least one dimension) exhibit chemical properties that differ from their bulk counterparts. Furthermore, they exhibit increased potential for systemic toxicities due to their deposition deep within pulmonary tissue upon inhalation. Thus, standard regulatory assays alone may not always be appropriate for evaluation of their full spectrum of toxicity. Systems biology (e.g., the study of molecular processes to describe a system as a whole) has emerged as a powerful platform proposed to provide insight in potential hazard, mode of action and human disease relevance. This work makes use of systems biology to characterize carbon black nanoparticle-induced toxicities in pulmonary and extra-pulmonary tissues (i.e., liver and heart) in mice over dose and time. This includes investigations of gene expression profiles, microRNA expression profiles, tissue-specific phenotypes and plasma proteins. The data are discussed in the context of potential use in human health risk assessment. In general, the work provides an example of how toxicogenomics can be used to support human health risk assessment.
8

Narinesingh, Pramenath. "A sinuous gravel-bedded river with frequent bedrock exposures: the statistics of its planform compared with a freely meandering river and the suitability of a process-based hydraulic model predicting its erosion." Access to citation, abstract and download form provided by ProQuest Information and Learning Company; downloadable PDF file, 182 p, 2010. http://proquest.umi.com/pqdweb?did=1993328581&sid=7&Fmt=2&clientId=8331&RQT=309&VName=PQD.

9

Chiu, Hsien-Jane, and 邱獻章. "Predictive models on weight gain among schizophrenic patients with an exposure to anti-psychotics in Taiwan." Thesis, 2004. http://ndltd.ncl.edu.tw/handle/7h77kb.

Abstract:
Doctoral dissertation, National Yang-Ming University, Institute of Public Health, ROC academic year 92 (2003–2004).
Schizophrenia, with a lifetime prevalence of 0.3-0.5% in Taiwan, usually deteriorates patients' cognitive function over the chronic natural history of the disease. Schizophrenic patients not only have an increased risk of morbidity and mortality from different physical illnesses and accidents, but also are less likely to receive general medical care. Furthermore, most patients receive long-term treatment with conventional and atypical antipsychotics, and up to 50% of them have a significant weight gain problem. Weight gain increases health risks, impairs quality of life, and leads to noncompliance and even relapse. The prevention and management of health risk factors resulting from schizophrenia itself or from antipsychotic treatment are essential in caring for schizophrenic patients. However, the degree of weight gain may depend on individual vulnerability, personal behaviors, and environmental factors. The aim of this study is to establish a predictive model of body weight gain in antipsychotic-treated schizophrenic patients. This dissertation attempts to show that clinical outcomes can be predicted by algorithms that cover an acceptable proportion of the variance of interest. Weight gain due to antipsychotic exposure was chosen as the main clinical outcome, in two different forms: dichotomous and continuous. Two hundred chronic schizophrenic patients with at least 6 months of hospitalization were enrolled from Yu-Li Veterans Hospital (YLVH) and Tao-Yuan Psychiatric Center (TYPC). The dichotomous outcome for weight gain was predicted by a logistic regression model established from 67 schizophrenic patients recruited from YLVH. The reliability of this prediction algorithm is supported by good sensitivity (90%) and specificity (83%). Two hundred thirty schizophrenic patients from TYPC were used to establish the linear regression model and to test its accuracy of weight gain prediction, which reached 92% compared with the observed values (within a 5% confidence interval). For the convenience of users, neuro-fuzzy techniques were applied to simplify the whole prediction procedure for clinicians without a thorough background in biostatistics. The prediction rate improved from 80% to 98% after appropriate learning and training. Across these three different approaches, clinical outcome prediction by algorithms for decision-making is shown to be effective and affords an evidence-based approach to medical practice.
10

Chiu, Fen-Fen, and 邱芬芬. "Exposure assessment and predictive models for respirable dust and crystalline free silica among foundry-industry workers." Thesis, 2015. http://ndltd.ncl.edu.tw/handle/g96wd2.


Books on the topic "Predictive exposure models"

1

National Exposure Research Laboratory (U.S.). Ecosystems Research Division, ed. State-of-the-science report on predictive models and modeling approaches for characterizing and evaluating exposure to nanomaterials. Washington, DC: National Exposure Research Laboratory, Office of Research and Development, Ecosystems Research Division, 2010.

2

Radhakrishnan, V. Application of an energy-based life prediction model to bithermal and thermomechanical fatigue. [Washington, DC]: National Aeronautics and Space Administration, 1994.

3

Kalluri, Sreeramesh, Gary R. Halford, and United States National Aeronautics and Space Administration, eds. Application of an energy-based life prediction model to bithermal and thermomechanical fatigue. [Washington, DC]: National Aeronautics and Space Administration, 1994.

4

Huang, Ruili, and Menghang Xia, eds. Tox21 Challenge to Build Predictive Models of Nuclear Receptor and Stress Response Pathways as Mediated by Exposure to Environmental Toxicants and Drugs. Frontiers Media SA, 2017. http://dx.doi.org/10.3389/978-2-88945-197-5.

5

Low Choy, Samantha, Justine Murray, Allan James, and Kerrie Mengersen. Combining monitoring data and computer model output in assessing environmental exposure. Edited by Anthony O'Hagan and Mike West. Oxford University Press, 2018. http://dx.doi.org/10.1093/oxfordhb/9780198703174.013.18.

Abstract:
This article discusses an approach that combines monitoring data and computer model outputs for environmental exposure assessment. It describes the application of Bayesian data fusion methods using spatial Gaussian process models in studies of weekly wet deposition data for 2001 from 120 sites monitored by the US National Atmospheric Deposition Program (NADP) in the eastern United States. The article first provides an overview of environmental computer models, with a focus on the CMAQ (Community Multi-Scale Air Quality) Eta forecast model, before considering some algorithmic and pseudo-statistical approaches in weather prediction. It then reviews current state of the art fusion methods for environmental data analysis and introduces a non-dynamic downscaling approach. The static version of the dynamic spatial model is used to analyse the NADP weekly wet deposition data.
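As a very rough illustration of the static (non-dynamic) downscaling idea mentioned above, the snippet regresses monitor observations on co-located model output and uses the fit to calibrate output elsewhere; the spatial Gaussian process treatment of residuals used in the chapter is omitted, and all numbers are invented.

```python
# Sketch: calibrate computer-model output against monitoring data (static downscaling idea).
import numpy as np
from sklearn.linear_model import LinearRegression

cmaq_at_monitors = np.array([4.2, 6.8, 3.1, 8.5, 5.6, 7.2])   # model output at monitor sites (invented)
monitor_obs      = np.array([3.5, 6.1, 2.8, 7.4, 5.0, 6.3])   # observed wet deposition (invented)

fit = LinearRegression().fit(cmaq_at_monitors.reshape(-1, 1), monitor_obs)
cmaq_elsewhere = np.array([5.0, 9.1])                          # grid cells without monitors
print("calibrated estimates:", fit.predict(cmaq_elsewhere.reshape(-1, 1)))
```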
6

Validation of aircraft noise models at lower levels of exposure. Hampton, Va: National Aeronautics and Space Administration, Langley Research Center, 1996.

7

Yu, Xianhong. Using different models to analyze the effects of measurement precision of ozone exposure on prediction of acute pulmonary function. 1992.


Book chapters on the topic "Predictive exposure models"

1

Matoba, Yoshihide, and Mark P. van Veen. "Predictive Residential Models." In Occupational and Residential Exposure Assessment for Pesticides, 209–42. Chichester, UK: John Wiley & Sons, Ltd, 2005. http://dx.doi.org/10.1002/0470012218.ch6.

2

Pignocchino, Gianmarco, Alessandro Pezzoli, and Angelo Besana. "Satellite Data and Epidemic Cartography: A Study of the Relationship Between the Concentration of NO2 and the COVID-19 Epidemic." In Communications in Computer and Information Science, 55–67. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-94426-1_5.

Abstract:
Satellite data are widely used to study the spatial component of epidemics: to monitor their evolution, to create epidemiological risk maps and predictive models. The improvement of data quality, not only in technical terms but also of scientific relevance and robustness, represents in this context one of the most important aspects for health information technology that can make further significant and useful progress in monitoring and managing epidemics. In this regard, this paper intends to address an issue that is not always adequately considered in the use of satellite data for the creation of maps and spatial models of epidemics, namely the preliminary verification of the level of spatial correlation between remote sensing environmental variables and epidemics. Specifically, we intend to evaluate the contribution of exposure to the pollutant nitrogen dioxide (NO2) on the spatial spread of the virus and the severity of the current COVID infection.
3

Moss, Gary P., Darren R. Gullick, and Simon C. Wilkinson. "Finite-Dose Models of Transient Exposures and Volatile Formulation Components." In Predictive Methods in Percutaneous Absorption, 141–58. Berlin, Heidelberg: Springer Berlin Heidelberg, 2015. http://dx.doi.org/10.1007/978-3-662-47371-9_8.

4

Brown, V. K. "Animal Models of Responses Resulting from Short-term Exposures." In The Future of Predictive Safety Evaluation, 47–55. Dordrecht: Springer Netherlands, 1987. http://dx.doi.org/10.1007/978-94-009-3201-2_3.

5

Magri, Antoni, Douglas A. Haith, A. Martin Petrovic, Laosheng Wu, and Robert L. Green. "Development and Testing of a Comprehensive Model of Pesticide Losses from Turf." In Turf Grass: Pesticide Exposure Assessment and Predictive Modeling Tools, 183–96. Washington DC: American Chemical Society, 2009. http://dx.doi.org/10.1021/bk-2009-1028.ch013.

6

Madhavan, Selvakumar, and S. Geetha. "Predicting Particulate Air Pollution Using Line Source Models." In Urban Air Quality Monitoring, Modelling and Human Exposure Assessment, 137–53. Singapore: Springer Singapore, 2020. http://dx.doi.org/10.1007/978-981-15-5511-4_10.

7

Hamey, Paul Y. "A Comparison of the Pesticide Handlers Exposure Database (PHED) and the European Predictive Operator Exposure Model (EUROPOEM) Database." In Methods of Pesticide Exposure Assessment, 103–9. Boston, MA: Springer US, 1995. http://dx.doi.org/10.1007/978-1-4899-0973-2_11.

8

Gonzalez-Martinez, Sergio, María Fernanda Cabrera-Umpiérrez, Manuel Ottaviano, Vladimir Urošević, Nikola Vojičić, Stefan Spasojević, and Ognjen Milićević. "Novel Interactive BRAINTEASER Tools for Amyotrophic Lateral Sclerosis (ALS) and Multiple Sclerosis (MS) Management." In Lecture Notes in Computer Science, 302–10. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-09593-1_26.

Abstract:
The demonstrated working tools, in their initial version, constitute the foundation of the novel ALS and MS management and monitoring, leveraging an extended IoT sensing and emerging instruments infrastructure, and a basis for the integration of more advanced and effective AI models (in development) for disease progression prediction, patient stratification and ambient exposure assessment.
9

Trapp, St, R. Brüggemann, and B. Münzer. "Exposure Analysis of the Phosphate Substitutes NTA and EDTA by Use of the Surface Water Model EXWAT." In Water Pollution: Modelling, Measuring and Prediction, 195–209. Dordrecht: Springer Netherlands, 1991. http://dx.doi.org/10.1007/978-94-011-3694-5_14.

10

Urošević, Vladimir, Nikola Vojičić, Aleksandar Jovanović, Borko Kostić, Sergio Gonzalez-Martinez, María Fernanda Cabrera-Umpiérrez, Manuel Ottaviano, Luca Cossu, Andrea Facchinetti, and Giacomo Cappon. "BRAINTEASER Architecture for Integration of AI Models and Interactive Tools for Amyotrophic Lateral Sclerosis (ALS) and Multiple Sclerosis (MS) Progression Prediction and Management." In Digital Health Transformation, Smart Ageing, and Managing Disability, 16–25. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-43950-6_2.

Abstract:
The presented platform architecture and its deployed implementation in real-life clinical and home care settings at four Amyotrophic Lateral Sclerosis (ALS) and Multiple Sclerosis (MS) study sites integrate the novel working tools for improved disease management with the initial releases of the AI models for disease monitoring. The described robust, industry-standard, scalable platform is intended to be a referent example of the integration approach, based on loosely coupled APIs and industry open-standard, human-readable and language-independent interface specifications, and of its successful baseline implementation for further upcoming releases of additional and more advanced AI models and supporting pipelines (such as for ALS and MS progression prediction, patient stratification, and ambient exposure modelling) in the following development.

Conference papers on the topic "Predictive exposure models"

1

Engel, Ryan, and Gilchan Park. "Evaluating Large Language Models for Predicting Protein Behavior under Radiation Exposure and Disease Conditions." In Proceedings of the 23rd Workshop on Biomedical Natural Language Processing, 427–39. Stroudsburg, PA, USA: Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.bionlp-1.34.

2

Greenwood, Eric. "Helicopter Flight Procedures for Community Noise Reduction." In Vertical Flight Society 73rd Annual Forum & Technology Display, 1–14. The Vertical Flight Society, 2017. http://dx.doi.org/10.4050/f-0073-2017-12278.

Abstract:
A computationally efficient, semiempirical noise model suitable for maneuvering flight noise prediction is used to evaluate the community noise impact of practical variations on several helicopter flight procedures typical of normal operations. Turns, "quick-stops," approaches, climbs, and combinations of these maneuvers are assessed. Relatively small variations in flight procedures are shown to cause significant changes to Sound Exposure Levels over a wide area. Guidelines are developed for helicopter pilots intended to provide effective strategies for reducing the negative effects of helicopter noise on the community. Finally, direct optimization of flight trajectories is conducted to identify low noise optimal flight procedures and quantify the magnitude of community noise reductions that can be obtained through tailored helicopter flight procedures. Physically realizable optimal turns and approaches are identified that achieve global noise reductions of as much as 10 dBA Sound Exposure Level.
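For context on the Sound Exposure Level metric used above, the snippet below shows the standard SEL computation from an A-weighted pressure time history (integrated squared pressure ratio normalised to a 1-second reference); the synthetic flyover signal is an assumption, and this is not the paper's semiempirical noise model.

```python
# Sketch: Sound Exposure Level (SEL) from a pressure time history.
import numpy as np

fs = 1000.0                                    # samples per second
t = np.arange(0, 20, 1 / fs)                   # a 20 s event
p_ref = 20e-6                                  # Pa, reference pressure
# synthetic signal standing in for an A-weighted pressure time history
p = 0.2 * np.exp(-((t - 10) / 3) ** 2) * np.sin(2 * np.pi * 120 * t)

e = np.sum((p / p_ref) ** 2) / fs              # integral of (p/p0)^2 dt, rectangle rule
sel = 10 * np.log10(e / 1.0)                   # dB, referenced to a 1 s duration
print(f"SEL = {sel:.1f} dB")
```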
3

Greenwood, Eric. "Estimating Helicopter Noise Abatement Information with Machine Learning." In Vertical Flight Society 74th Annual Forum & Technology Display, 1–14. The Vertical Flight Society, 2018. http://dx.doi.org/10.4050/f-0074-2018-12666.

Abstract:
Machine learning techniques are applied to the NASA Langley Research Center's expansive database of helicopter noise measurements containing over 1500 steady flight conditions for ten different helicopters. These techniques are then used to develop models capable of predicting the operating conditions under which significant Blade-Vortex Interaction noise will be generated for any conventional helicopter. A measure for quantifying the overall ground noise exposure of a particular helicopter operating condition is developed. This measure is then used to classify the measured flight conditions as noisy or not-noisy. These data are then parameterized on a non-dimensional basis that defines the main rotor operating condition and are then scaled to remove bias. Several machine learning methods are then applied to these data. The developed models show good accuracy in identifying the noisy operating region for helicopters not included in the training data set. Noisy regions are accurately identified for a variety of different helicopters. One of these models is applied to estimate changes in the noisy operating region as vehicle drag and ambient atmospheric conditions are varied.
4

Garg, Priya, and Deepti Aggarwal. "Application of Swarm-Based Feature Selection and Extreme Learning Machines in Lung Cancer Risk Prediction." In Intelligent Computing and Technologies Conference. AIJR Publisher, 2021. http://dx.doi.org/10.21467/proceedings.115.1.

Abstract:
Lung cancer risk prediction models help in identifying high-risk individuals for early CT screening tests. These predictive models can play a pivotal role in healthcare by decreasing lung cancer's mortality rate and saving many lives. Although many predictive models have been developed that use various features, no specific guidelines have been provided regarding the crucial features in lung cancer risk prediction. This study proposes novel risk prediction models using bio-inspired swarm-based techniques for feature selection and extreme learning machines for classification. The proposed models are applied on a public dataset consisting of 1000 patient records and 23 variables, including sociodemographic factors, smoking status, and lung cancer clinical symptoms. The models, validated using 10-fold cross-validation, achieve an AUC score in the range of 0.985 to 0.989, accuracy in the range of 0.986 to 0.99 and F-Measure in range of 0.98 to 0.985. The study also identifies smoking habits, exposure to air pollution, occupational hazards and some clinical symptoms as the most commonly selected lung cancer risk prediction features. The study concludes that the developed lung cancer risk prediction models can be successfully applied for early screening, diagnosis and treatment of high-risk individuals.
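A minimal sketch of the extreme learning machine (ELM) classifier that the paper pairs with swarm-based feature selection; the selection step is omitted here, the data are synthetic (23 features, echoing the 23 variables described), and the hidden-layer size is arbitrary.

```python
# Sketch of an ELM: random hidden weights, least-squares output weights, AUC scoring.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=1000, n_features=23, n_informative=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

rng = np.random.default_rng(0)
n_hidden = 100
W = rng.normal(size=(X.shape[1], n_hidden))      # random input weights (not trained)
b = rng.normal(size=n_hidden)                    # random biases

def hidden(X):
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))    # sigmoid activations

H = hidden(X_train)
beta = np.linalg.pinv(H) @ y_train               # least-squares output weights
scores = hidden(X_test) @ beta
print("AUC:", roc_auc_score(y_test, scores))
```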
5

Gernand, Jeremy M. "Limitations on the Reliability of In Vitro Predictive Toxicity Models to Predict Pulmonary Toxicity in Rodents." In ASME 2016 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2016. http://dx.doi.org/10.1115/imece2016-67151.

Abstract:
Given the rapidly proliferating varieties of nanomaterials and ongoing concerns that these novel materials may pose emerging occupational and environmental risks, combined with the possibility that each variety might pose a different unique risk due to the unique combination of material properties, researchers and regulators have been searching for methods to identify hazards and prioritize materials for further testing. While several screening tests and toxic risk models have been proposed, most have relied on cellular-level in vitro data. This foundation enables answers to be developed quickly for any material, but it is yet unclear how this information may translate to more realistic exposure scenarios in people or other more complex animals. A quantitative evaluation of these models or at least the inputs variables to these models in the context of rodent or human health outcomes is necessary before their classifications may be believed for the purposes of risk prioritization. This paper presents the results of a machine learning enabled meta-analysis of animal studies attempting to use significant descriptors from in vitro nanomaterial risk models to predict the relative toxicity of nanomaterials following pulmonary exposures in rodents. A series of highly non-linear random forest models (each made up of an ensemble of 1,000 regression tree models) were created to assess the maximum possible information value of the in vitro risk models and related methods of describing nanomaterial variants and their toxicity in rat and mouse experiments. The variety of chemical descriptors or quantitative chemical property measurements such as bond strength, surface charge, and dissolution potential, while important in describing observed differences with in vitro experiments, proved to provide little indication of the relative magnitude of inflammation in rodents (explained variance amounted to less than 32%). Important factors in predicting rodent pulmonary inflammation such as primary particle size and chemical type demonstrate that there are critical differences between these two toxicity assays that cannot be captured by a series of in vitro tests alone. Predictive models relying primarily on these descriptors alone explained more than 62% of the variance of the short term in vivo toxicity results. This means that existing proposed nanomaterial toxicity screening methods are inadequate as they currently stand, and either the community must be content with the slower and more expensive animal testing to evaluate nanomaterial risks, or further conceptual development of improved alternative in vitro screening methodologies is necessary before manufacturers and regulators can rely on them to promote safer use of nanotechnology.
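As a schematic stand-in for the meta-analysis modelling described above, the snippet fits a random forest (an ensemble of regression trees) to synthetic particle descriptors and scores explained variance by cross-validation; the descriptor names and response are invented, not the study's rodent data.

```python
# Sketch: random forest regression scored by cross-validated explained variance (R2).
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n = 300
X = pd.DataFrame({
    "primary_particle_nm": rng.uniform(10, 100, n),   # hypothetical descriptors
    "surface_area_m2_g": rng.uniform(20, 300, n),
    "dose_ug": rng.uniform(1, 200, n),
    "surface_charge_mv": rng.normal(0, 30, n),
})
# synthetic inflammation response dominated by dose and primary particle size
y = 0.05 * X["dose_ug"] - 0.02 * X["primary_particle_nm"] + rng.normal(0, 1, n)

rf = RandomForestRegressor(n_estimators=1000, random_state=0)   # ensemble of 1,000 trees
r2_scores = cross_val_score(rf, X, y, cv=5, scoring="r2")
print("explained variance (R2) per fold:", np.round(r2_scores, 2))
```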
6

Lall, Pradeep, and Madhu Kasturi. "Sequential High Temperature and Hygrothermal Exposure on the Evolution of Interfacial Fracture Toughness of TIM-Copper and EMC Interfaces." In ASME 2024 International Technical Conference and Exhibition on Packaging and Integration of Electronic and Photonic Microsystems. American Society of Mechanical Engineers, 2024. http://dx.doi.org/10.1115/ipack2024-141853.

Abstract:
In automotive electronics, where high-performance semiconductors are increasingly utilized for advanced driver assistance systems, effective thermal management is crucial to optimize performance and integration density. One significant aspect is managing thermal interface materials (TIM) to enhance heat dissipation from chips to packages and to heat sinks. Exposure to automotive conditions, characterized by wide temperature ranges and humidity conditions, can lead to early fractures at TIM-to-Copper interfaces, escalating thermal resistance and device thermal runaway. The ongoing downsizing and integration of semiconductor devices have resulted in an increase in heat generated per unit volume of the chip. Insufficient understanding of the evolution of TIM fracture toughness during extended environmental exposure in automotive applications hampers the creation of predictive models for estimating time to failure. This paper investigates the impact of high-temperature storage and hygrothermal exposure on TIM-Copper and EMC-Substrate interfaces, along with the evolving fracture toughness, to develop reliable damage models for assessing performance degradation and predicting interface survivability. The bi-material samples are first stored at high temperatures between 100°C and 150°C for 60 days and then stored in a humidity chamber for moisture absorption at 85°C/85% RH; this is called sequential exposure. The bi-material samples are subjected to monotonic and fatigue loading to measure mode-1 and mode-2 critical stress intensity factors and Paris' power law exponents. The thermal diffusivity, activation energy, and acceleration factors are also investigated for hygrothermal exposure. The % weight loss/gain of the TIMs for all exposure conditions is presented.
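For reference, Paris' power-law exponents mentioned above are typically extracted by fitting log(da/dN) = log(C) + m·log(ΔK) to crack-growth data; the sketch below does this with invented data points, not the paper's measurements.

```python
# Sketch: fit Paris' law da/dN = C * (dK)^m on log-log axes.
import numpy as np

delta_k = np.array([0.4, 0.6, 0.9, 1.3, 1.8])        # stress intensity range (invented)
da_dn   = np.array([2e-9, 9e-9, 5e-8, 2.2e-7, 8e-7]) # crack growth rate per cycle (invented)

m, log_c = np.polyfit(np.log10(delta_k), np.log10(da_dn), 1)   # slope = exponent m
print(f"Paris exponent m = {m:.2f}, C = {10**log_c:.2e}")
```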
7

Chen, Zhong, Kinwah Wong, Wei Li, David A. Stephenson, and Steven Y. Liang. "Cutting Fluid Aerosol Generation due to Spin-Off in Turning Operation: Analysis for Environmentally Conscious Machining." In ASME 1999 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 1999. http://dx.doi.org/10.1115/imece1999-0684.

Abstract:
This paper presents an analytical model for the prediction of shop floor aerosol concentration and size distribution due to the spin-off of cutting fluid from a rotational workpiece in a turning operation. Based on atomization theory and principles of fluid motion, the model analyzes the generation of cutting fluid aerosol associated with the rotary disk and liquid sheet formations on the workpiece surface. In coupling with fluid flow rate analysis and the Rosin and Rammler distribution model, the airborne particulate concentration and size distribution are expressed in terms of fluid properties, fluid application conditions, and machining process parameters. Experiments were performed with the use of a light scattering particle measurement device to calibrate and verify the analytical models. Under various fluid flow rates and workpiece rotational speeds, experimental data have shown reasonable agreement with model predictions. The predictive models developed in this paper can be used as a basis for human exposure and health hazard analysis. They can also facilitate the control and optimization of the use of cutting fluids in achieving a balanced consideration of process productivity and environmental consciousness.
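The Rosin and Rammler distribution referenced above gives the mass fraction of droplets larger than a diameter d as R(d) = exp[-(d/d_e)^n]; the short sketch below evaluates it for assumed parameter values, which are illustrative only.

```python
# Sketch: Rosin-Rammler (Weibull-type) droplet size distribution.
import numpy as np

d_char = 8.0        # characteristic droplet diameter, micrometres (assumed)
n_spread = 2.0      # spread parameter (assumed)

def mass_fraction_above(d):
    # fraction of aerosol mass carried by droplets larger than diameter d
    return np.exp(-(d / d_char) ** n_spread)

for d in (2.5, 5.0, 10.0):   # example diameters of interest, micrometres
    print(f"fraction of aerosol mass in droplets > {d} um: {mass_fraction_above(d):.2f}")
```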
APA, Harvard, Vancouver, ISO, and other styles
8

Ramchandran, Vignesh, and Jeremy M. Gernand. "Examining Pulmonary Toxicity of Engineered Nanoparticles Using Clustering for Safe Exposure Limits." In ASME 2018 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2018. http://dx.doi.org/10.1115/imece2018-87431.

Full text
Abstract:
Experimental toxicology studies for the purpose of setting occupational exposure limits for aerosols have drawbacks, including excessive time and cost, which could be overcome or mitigated by the development of computational approaches. A quantitative, analytical relationship between the characteristics of emerging nanomaterials and their toxicity is desired to better assist in the subsequent mitigation of toxicity by design. Quantitative structure-activity relationships (QSARs) and meta-analyses are popular methods used to develop predictive toxicity models. A meta-analysis investigating the dose-response and recovery relationships of a variety of engineered nanoparticles was performed using a clustering-based approach. The primary objective of the clustering is to categorize groups of similarly behaving nanoparticles, leading to the identification of physicochemical differences between the clusters and an evaluation of their contributions to toxicity. Studies are grouped by the similarity of their dose-response and recovery relationships, and the algorithm uses hierarchical clustering to classify the different nanoparticles, with the Akaike information criterion (AIC) as the performance metric to ensure the clusters are not overfit. The clustering analysis of two types of engineered nanoparticles, carbon nanotubes (CNTs) and metal oxide nanoparticles (MONPs), across five response variables revealed at least four toxicologically distinct groups among the nanoparticles on the basis of dose-response similarity. Analysis of the cluster attributes reveals that they also differ in length, diameter, and impurity content. The analysis was further extended to derive no-observed-adverse-effect levels (NOAELs) for the clusters. The NOAELs for the "long and thin" variety of CNTs were found to be the lowest, indicating that those CNTs showed the earliest signs of adverse effects.
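As a rough illustration of the clustering workflow described above, the sketch below groups synthetic dose-response curves with hierarchical clustering and compares cluster counts with an AIC-style penalty. The data, feature choices, and scoring details are my assumptions for illustration, not the authors' implementation.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)

# Synthetic dose-response curves: rows are nanoparticle studies, columns are doses.
doses = np.linspace(0, 10, 8)
def hill(dose, top, ec50): return top * dose / (ec50 + dose)
curves = np.vstack([
    hill(doses, top, ec50) + rng.normal(0, 0.05, doses.size)
    for top, ec50 in [(1.0, 1.0)] * 10 + [(0.4, 5.0)] * 10 + [(0.9, 8.0)] * 10
])

# Hierarchical (Ward) clustering on the similarity of the response profiles.
Z = linkage(curves, method="ward")

def aic_for_k(k):
    """AIC-style score: within-cluster residual error penalized by parameter count."""
    labels = fcluster(Z, t=k, criterion="maxclust")
    resid = np.concatenate([
        (curves[labels == c] - curves[labels == c].mean(axis=0)).ravel()
        for c in np.unique(labels)
    ])
    n = resid.size
    rss = np.sum(resid ** 2)
    n_params = k * doses.size          # one mean response profile per cluster
    return n * np.log(rss / n) + 2 * n_params

for k in range(2, 7):
    print(f"k = {k}: AIC = {aic_for_k(k):8.1f}")
```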
APA, Harvard, Vancouver, ISO, and other styles
9

Sinofsky, Edward. "Internal biological tissue temperature measurements using zirconium fluoride IR fibers." In International Laser Science Conference. Washington, D.C.: Optica Publishing Group, 1986. http://dx.doi.org/10.1364/ils.1986.fb4.

Full text
Abstract:
The availability of zirconium fluoride fibers with improved midinfrared transmission has extended the range of remote thermography. One application is measurement of the internal temperature of biological tissue during and after exposure to laser energy. Such measurements are important for verification and calibration of predictive thermal models allowing rational selection of such parameters as power, wavelength, pulse duration, and total fluence. This technique may also help to determine when vaporization will occur with a minimum zone of thermal injury. The infrared signal can be sensed by circuitry to terminate the exposure automatically when the selected tissue temperature is reached. This scheme should be more accurate than visual interpretation of the degree of blanching. We report predictions of the signal level as a function of tissue temperature for detection through the fiber by a thermoelectrically cooled lead selenide photodetector. We also explore the resolution limits in space, time, and temperature, and compare our measured values with results of modeling calculations performed at USCI.
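The signal-level prediction mentioned above ultimately rests on thermal emission collected over the detector's mid-infrared band. The sketch below integrates Planck's law over an assumed PbSe response band to show how the in-band radiance scales with tissue temperature; the band limits and emissivity are illustrative assumptions, and fiber transmission and collection optics are ignored.

```python
import numpy as np
from scipy.integrate import quad

h = 6.626e-34   # Planck constant, J s
c = 2.998e8     # speed of light, m/s
kB = 1.381e-23  # Boltzmann constant, J/K

def planck_radiance(wavelength_m, T):
    """Spectral radiance of a blackbody, W m^-2 sr^-1 per metre of wavelength."""
    return (2 * h * c**2 / wavelength_m**5) / (np.exp(h * c / (wavelength_m * kB * T)) - 1.0)

# Assumed detection band for a PbSe photodetector behind a ZrF4 fiber (illustrative).
band = (3.0e-6, 4.8e-6)   # metres
emissivity = 0.95         # assumed tissue emissivity

def band_radiance(T_celsius):
    T = T_celsius + 273.15
    value, _ = quad(planck_radiance, band[0], band[1], args=(T,))
    return emissivity * value

for T in (37, 50, 70, 100):
    print(f"Tissue at {T:3d} C -> in-band radiance {band_radiance(T):.2f} W m^-2 sr^-1")
```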
APA, Harvard, Vancouver, ISO, and other styles
10

Lall, Pradeep, Kalyan Dornala, Jeff Suhling, and John Deep. "Interfacial Delamination and Fracture Properties of Potting Compounds and PCB/Epoxy Interfaces Under Flexure Loading After Exposure to Multiple Cure Temperatures." In ASME 2017 International Technical Conference and Exhibition on Packaging and Integration of Electronic and Photonic Microsystems collocated with the ASME 2017 Conference on Information Storage and Processing Systems. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/ipack2017-74322.

Full text
Abstract:
Electronic components operating under extreme thermo-mechanical stresses are often protected with conformal coating and potting encapsulation to isolate thermal and vibration shock loads. Development of predictive models for high-g shock survivability of electronics requires measurement of the interface properties of the potting compounds with the printed circuit board materials, yet there is a scarcity of interface fracture properties for potting compounds on printed circuit board materials. Potting and encapsulation resins are commonly two-part systems that, when mixed together, form a solid, fully cured material with no by-products. The cured potting materials are prone to interfacial delamination under dynamic shock loading, which in turn can cause failures in the package interconnects. The study of interfacial fracture resistance in PCB/epoxy potting systems under dynamic shock loading is important in mitigating the risk of system failure in mission-critical applications. In this paper, three types of epoxy potting compounds were used as encapsulation on PCB samples. The potting compounds were selected on the basis of their ultimate elongation under quasi-static loading: potting compound A is a stiff material with 5% ultimate elongation before failure; potting compound B is a moderately stiff material with 12% ultimate elongation; and potting compound C is a softer material with 90% ultimate elongation before failure. The fracture properties and interfacial crack delamination of the PCB/epoxy interface were determined using three-point bend loading with a pre-crack in the epoxy near the interface. The fracture toughness and crack initiation of the three epoxy systems were compared across cure schedules and temperatures. Fracture modeling was performed with crack-tip elements in ABAQUS finite element models to determine crack initiation and interfacial stresses. A comparison of the fracture properties and of each epoxy system's resistance to delamination was made through the three-point bend tests, and the finite element model results were correlated with the experimental findings.
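For context, fracture toughness from a three-point-bend (single-edge-notch bend) test of the kind described above is commonly reduced from the peak load using the standard SENB stress-intensity expression for a span-to-width ratio of 4. The sketch below applies that textbook formula to hypothetical specimen dimensions and load; the numbers are not from the paper.

```python
import math

def senb_geometry_factor(a_over_w):
    """Standard SENB geometry function f(a/W) for span-to-width ratio S/W = 4."""
    x = a_over_w
    num = 3.0 * math.sqrt(x) * (1.99 - x * (1.0 - x) * (2.15 - 3.93 * x + 2.7 * x**2))
    den = 2.0 * (1.0 + 2.0 * x) * (1.0 - x) ** 1.5
    return num / den

def senb_k1(P, S, B, W, a):
    """Mode-I stress intensity factor (Pa*sqrt(m)) for a three-point-bend specimen."""
    return (P * S / (B * W ** 1.5)) * senb_geometry_factor(a / W)

# Hypothetical specimen and peak load (illustrative only, not the paper's data).
P = 180.0   # peak load, N
S = 0.040   # support span, m (S = 4W)
B = 0.010   # thickness, m
W = 0.010   # width, m
a = 0.005   # pre-crack length, m (a/W = 0.5)

K_Q = senb_k1(P, S, B, W, a)
print(f"Apparent fracture toughness: {K_Q / 1e6:.2f} MPa*sqrt(m)")
```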
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Predictive exposure models"

1

Sriraj, P. S., Kazuya Kawamura, Paul Metaxatos, Joseph Fazio, Chaitanya Pujari, Nahid Parvez Farazi, and Pooria Choobchian. Railroad-Highway Crossing Safety Improvement Evaluation and Prioritization Tool. Illinois Center for Transportation, June 2023. http://dx.doi.org/10.36501/0197-9191/23-009.

Full text
Abstract:
The expected crash frequency model of the Illinois Department of Transportation's Bureau of Design and Environment needed improvement to incorporate track circuitry as well as pedestrian exposure at railroad-highway grade crossings to make the model more comprehensive. The researchers developed, calibrated, and validated three models to predict collision rates at public, at-grade railroad-highway crossings in Illinois' six-county northeast region for prioritizing crossings for safety improvements. The first model updated the B-factors in the existing Illinois model, which was last validated with data from 1968. The second model modified the B-factors to include circuitry types given the active maximum traffic control device at the crossing and added another factor (a P-factor) to account for daily pedestrian traffic using the crossing. The third model added a P-factor to the existing US Department of Transportation web-based accident prediction system model to account for daily pedestrian traffic. Using year 2018 validation data, the first model had an r² of 0.20 with reported collision rates. The second model had an r² of 0.58 with reported collision rates, while the existing BDE model had an r² of 0.17. The third model had an r² of 0.70 with reported collision rates, whereas the existing US Department of Transportation web-based accident prediction system model had an r² of 0.50. The three models are presented in this report along with a digital tool, built on the second model, for illustrative purposes.
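The model comparison above hinges on the coefficient of determination between predicted and reported collision rates on the 2018 validation data. A minimal sketch of that validation step is shown below with made-up crossing data; it is not the study's tool or model.

```python
import numpy as np

def r_squared(observed, predicted):
    """Coefficient of determination between reported and predicted collision rates."""
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    ss_res = np.sum((observed - predicted) ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Hypothetical reported and predicted collision rates for a handful of crossings.
reported  = [0.02, 0.10, 0.05, 0.00, 0.15, 0.08]
predicted = [0.03, 0.08, 0.06, 0.01, 0.12, 0.07]

print(f"Validation r^2: {r_squared(reported, predicted):.2f}")
```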
APA, Harvard, Vancouver, ISO, and other styles
2

Committee on Toxicology. New Approach Methodologies (NAMs) In Regulatory Risk Assessment Workshop Report 2020- Exploring Dose Response. Food Standards Agency, March 2024. http://dx.doi.org/10.46756/sci.fsa.cha679.

Full text
Abstract:
The UK Food Standards Agency (FSA) and the Committee on Toxicity of Chemicals in Food, Consumer Products and the Environment (COT) held an "Exploring Dose Response" workshop in a multidisciplinary setting, inviting regulatory agencies, government bodies, academia, and industry. The workshop provided a platform for expert discussions on the latest in silico prediction models, new approach methodologies (NAMs), physiologically based pharmacokinetics (PBPK), future methodologies, and integrated approaches to testing and assessment (IATA), as well as methodology validation. Through a series of presentations from external experts and case-study discussions (plastic particles, polymers, tropane alkaloids, and selective androgen receptor modulators), the workshop outlined and explored a fit-for-purpose approach to future human health risk assessment in the context of food safety. Possible future research opportunities were also explored to establish points of departure (PODs) using non-animal alternative models and to improve the use of exposure metrics in risk assessment.
APA, Harvard, Vancouver, ISO, and other styles
3

DiDomizio, Matthew, and Jonathan Butta. Measurement of Heat Transfer and Fire Damage Patterns on Walls for Fire Model Validation. UL Research Institutes, July 2024. http://dx.doi.org/10.54206/102376/hnkr9109.

Full text
Abstract:
Fire models are presently employed by fire investigators to make predictions of fire dynamics within structures. Predictions include the evolution of gas temperatures and velocities, smoke movement, fire growth and spread, and thermal exposures to surrounding objects, such as walls. Heat flux varies spatially over exposed walls based on the complex thermal interactions within the fire environment, and is the driving factor for thermally induced fire damage. A fire model predicts the temperature and heat transfer through walls based on field predictions, such as radiative and convective heat flux, and is also subject to the boundary condition representation, which is at the discretion of model practitioners. At the time of writing, Fire Dynamics Simulator can represent in-depth heat transfer through walls, and transverse heat transfer is in a preliminary development stage. Critically, limited suitable data exists for validation of heat transfer through walls exposed to fires. Mass loss and discoloration fire effects are directly related to the heat transfer and thermal decomposition of walls; therefore, it is crucial that the representation of transverse heat transfer in walls in fire models be validated to ensure that fire investigators can produce accurate simulations and reconstructions with these tools. The purpose of this study was to conduct a series of experiments to obtain data that addresses three validation spaces: 1) thermal exposure to walls from fires; 2) heat transfer within walls exposed to fires; and 3) fire damage patterns arising on walls exposed to fires. Fire Safety Research Institute, part of UL Research Institutes, in collaboration with the Bureau of Alcohol, Tobacco, Firearms and Explosives Fire Research Laboratory, led this novel research endeavor. Experiments were performed on three types of walls to address the needs in this validation space: 1. Steel sheet (304 stainless steel, 0.793 mm thick, coated in high-emissivity high-temperature paint on both sides). This wall type was used to support the heat flux validation objective. By combining measurements of gas temperatures near the wall with surface temperatures obtained using infrared thermography, estimates of the incident heat flux to the wall were produced. 2. Calcium silicate board (BNZ Marinite I, 12.7 mm thick). This wall type was used to support the heat transfer validation objective. Since calcium silicate board is a noncombustible material with well-characterized thermophysical properties at elevated temperatures, measurements of surface temperature may be used to validate transverse heat transfer in a fire model without the need to account for a decomposition mechanism. 3. Gypsum wallboard (USG Sheetrock Ultralight, 12.7 mm thick, coated in white latex paint on the exposed side). This wall type was used to support the fire damage patterns validation objective. Two types of fire effects were considered: 1) discoloration and charring of the painted paper facing of the gypsum wallboard; and 2) mass loss of the gypsum wallboard (which is related to the calcination of the core material). In addition to temperature and heat flux measurements, high-resolution photographs of fire patterns were recorded, and mass loss over the entirety of the wall was measured by cutting the wall into smaller samples and measuring the mass of each individual sample. A total of 63 experiments were conducted, encompassing seven fire sources and three wall types (each combination conducted in triplicate).
Fire sources included a natural gas burner, gasoline and heptane pools, wood cribs, and upholstered furniture. A methodology was developed for obtaining estimates of field heat flux to a wall using a large plate heat flux sensor. This included a numerical optimization scheme to account for convection heat transfer. These data characterized the incident heat flux received by calcium silicate board and gypsum wallboard in subsequent experiments. Fire damage patterns on the gypsum wallboard, attributed to discoloration and mass loss fire effects, were measured. It was found that heat flux and mass loss fields were similar for a given fire type, but the relationship between these measurements was not consistent across all fire types. Therefore, it was concluded that cumulative heat flux does not adequately describe the mass loss fire effect. Fire damage patterns attributed to the discoloration fire effect were defined as the line of demarcation separating charred and uncharred regions of the wall. It was found that the average values of cumulative heat flux and mass loss ratio coinciding with the fire damage patterns were 10.41 ± 1.51 MJ m−2 and 14.86 ± 2.08 %, respectively. These damage metrics may have utility in predicting char delineation damage patterns in gypsum wallboard using a fire model, with the mass loss ratio metric being overall the best fit over all exposures considered. The dataset produced in this study has been published to a public repository, and may be accessed from the following URL: <https://doi.org/10.5281/zenodo.10543089>.
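The damage metrics above (a cumulative heat flux of roughly 10.4 MJ m⁻² and a mass loss ratio near 15 % at the char demarcation line) amount to integrating the incident heat flux history at each wall location and comparing it to a threshold. The sketch below shows that reduction on a synthetic heat flux trace; the trace, its shape, and the thresholding step are illustrative assumptions, not the study's processing code.

```python
import numpy as np

# Synthetic incident heat flux history at one wall location (illustrative, kW/m^2).
time_s = np.arange(0, 1800, 1.0)                                 # 30-minute exposure, 1 s steps
heat_flux_kw = 15.0 * np.exp(-((time_s - 600.0) / 400.0) ** 2)   # bell-shaped exposure

# Cumulative heat flux (MJ/m^2) via time integration of the flux history.
cumulative_mj = np.trapz(heat_flux_kw, time_s) / 1000.0

# Average cumulative-heat-flux value reported at the char demarcation line.
CHAR_THRESHOLD_MJ = 10.41

charred = cumulative_mj >= CHAR_THRESHOLD_MJ
print(f"Cumulative heat flux: {cumulative_mj:.2f} MJ/m^2 -> charred: {charred}")
```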
APA, Harvard, Vancouver, ISO, and other styles
4

Fitzpatrick, Patrick, and Yee Lau. CONCORDE Meteorological Analysis (CMA) - Data Guide. The University of Southern Mississippi, 2023. http://dx.doi.org/10.18785/sose.003.

Full text
Abstract:
CONCORDE, the CONsortium for oil spill exposure pathways in COastal River-Dominated Ecosystems, is an interdisciplinary research program funded by the Gulf of Mexico Research Initiative (GoMRI) to conduct scientific studies of the impacts of oil, dispersed oil, and dispersant on the Gulf's ecosystem (Greer et al. 2018). A CONCORDE goal is to implement a synthesis model containing circulation and biogeochemistry components of the Northern Gulf of Mexico shelf system, which can ultimately aid in the prediction of oil spill transport and impacts. The CONCORDE Meteorological Analysis (CMA) is an hourly gridded NetCDF dataset that provides atmospheric forcing for the synthesis model. The CMA includes a variety of parameters from multiple sources. The Real-Time Mesoscale Analysis (RTMA; De Pondeca et al. 2011) provides the surface momentum and thermodynamic atmospheric data. The radiation parameters and total cloud cover percentage are from the North American Mesoscale (NAM) Forecast System fields. The hourly precipitation is extracted from Next Generation Weather Radar (NEXRAD) Level-III data. Gridded sea surface temperature (SST) fields are computed daily using a 10-day running mean of the Advanced Very High Resolution Radiometer (AVHRR) SST product. The Coupled Ocean-Atmosphere Response Experiment (COARE) flux algorithm calculates the sensible heat flux and surface momentum stresses (Fairall et al. 2003). The CMA spatial domain extends from its southwestern grid point at 90.13°W, 29°N to its northeastern grid point at 87.05°W, 30.94°N, with a grid spacing of 0.01 degree and a grid dimension of 309 by 195.
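The sensible heat flux in the CMA is computed with the COARE algorithm; to show what that flux represents, the sketch below uses a simplified bulk aerodynamic estimate. The constant transfer coefficient and the example inputs are rough assumptions, not the COARE parameterization or CMA data.

```python
import numpy as np

def bulk_sensible_heat_flux(wind_speed, t_sea, t_air, c_h=1.1e-3,
                            rho_air=1.22, cp_air=1004.0):
    """Simplified bulk formula: Q_H = rho * cp * C_H * U * (SST - Tair), in W/m^2.
    COARE replaces the constant C_H with a stability-dependent transfer coefficient."""
    return rho_air * cp_air * c_h * wind_speed * (t_sea - t_air)

# Example gridded inputs (illustrative values only).
wind = np.array([3.0, 6.0, 9.0])     # 10-m wind speed, m/s
sst = np.array([26.0, 26.5, 27.0])   # sea surface temperature, deg C
tair = np.array([24.0, 25.5, 27.5])  # near-surface air temperature, deg C

print("Sensible heat flux (W/m^2):", bulk_sensible_heat_flux(wind, sst, tair))
```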
APA, Harvard, Vancouver, ISO, and other styles
5

Dashtey, Ahmed, Patrick Mormile, Sandra Pedre, Stephany Valdaliso, and Walter Tang. Prediction of PFOA and PFOS Toxicity through Log P and Number of Carbon with CompTox and Machine Learning Tools. Florida International University, July 2024. http://dx.doi.org/10.25148/ceefac.2024.00202400.

Full text
Abstract:
Perfluorooctanoic acid (PFOA) and perfluorooctane sulfonic acid (PFOS) are two major groups of PFAS that will be subject to the Maximum Contaminant Level (MCL) of 4 ng/L in drinking water to be implemented by the U.S. EPA by 2025. Accurately predicting the toxicity of PFAS with varied carbon chain lengths is important for treatment and subsequent removal from drinking water. This study presents Quantitative Structure-Activity Relationship (QSAR) models developed through both linear regression and second-order regression. Log P values are compiled from references, the carbon content is counted for each molecule represented, and bioconcentration potential is predicted from CompTox. The results suggest that as log P and carbon content increase, the bioconcentration potential of PFCAs also increases; in other words, larger PFCA molecules tend to bioaccumulate more readily in living organisms. This finding is crucial because bioconcentration refers to the accumulation of substances from water directly into living organisms through passive diffusion across cell membranes. On the other hand, the 96-hour fathead minnow LC50 has an inverse relationship, with higher LC50 values associated with lower log P and fewer carbons. The varying R-squared values across methods indicate differing degrees of correlation, underscoring the impact of compound structure on aquatic toxicity. Similarly, for oral rat LD50 and 48-hour D. magna LC50, the R-squared values reflect moderate to strong correlations with log P and the number of carbons. As log P and carbon content decrease, the LC50 or LD50 values increase. This relationship underscores the role of chemical properties in influencing the toxicity of PFCAs across different organisms and exposure routes. For instance, the negative correlation between log P and aquatic toxicity (96-hour fathead minnow LC50 and 48-hour D. magna LC50) suggests that compounds with higher hydrophobicity (higher log P) and more carbons may exhibit lower acute toxicity to aquatic organisms.
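The QSAR relationships above reduce to regressing a toxicity endpoint on log P and carbon number. The sketch below fits such a first-order model to made-up PFCA descriptors with scikit-learn; the data and the resulting coefficients are purely illustrative and are not the study's dataset or reported fits.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical PFCA descriptors and endpoint (illustrative, not the study's data).
log_p     = np.array([2.8, 3.5, 4.2, 4.9, 5.6, 6.3])   # octanol-water partition coefficient
n_carbons = np.array([5, 6, 7, 8, 9, 10])              # carbon chain length
log_lc50  = np.array([2.4, 2.1, 1.9, 1.6, 1.4, 1.1])   # log10 of 96-h fathead minnow LC50

# First-order (linear) QSAR: endpoint as a function of log P and carbon count.
X = np.column_stack([log_p, n_carbons])
model = LinearRegression().fit(X, log_lc50)

print("Coefficients (log P, carbons):", model.coef_)
print("Intercept:", model.intercept_)
print("R^2:", model.score(X, log_lc50))
```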
APA, Harvard, Vancouver, ISO, and other styles