A selection of scholarly literature on the topic "Globally optimal inference"

Browse lists of current articles, books, dissertations, conference papers and other scholarly sources on the topic "Globally optimal inference".

Journal articles on the topic "Globally optimal inference"

1. Clarke, J., and M. Lapata. "Global Inference for Sentence Compression: An Integer Linear Programming Approach." Journal of Artificial Intelligence Research 31 (March 11, 2008): 399–429. http://dx.doi.org/10.1613/jair.2433.

Abstract:
Sentence compression holds promise for many applications ranging from summarization to subtitle generation. Our work views sentence compression as an optimization problem and uses integer linear programming (ILP) to infer globally optimal compressions in the presence of linguistically motivated constraints. We show how previous formulations of sentence compression can be recast as ILPs and extend these models with novel global constraints. Experimental results on written and spoken texts demonstrate improvements over state-of-the-art models.
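
As a rough illustration of the kind of ILP formulation the abstract describes, the sketch below encodes a toy compression instance with PuLP: binary variables mark retained tokens, the objective sums per-token relevance scores, and two dependency-style constraints tie a verb to its subject. The token scores, the length budget and the constraints are invented for illustration and are not Clarke and Lapata's model.

```python
# A minimal sketch of sentence compression as an ILP (toy scores, not the paper's model).
import pulp

tokens = ["officials", "said", "the", "talks", "collapsed", "yesterday"]
score = {0: 2.0, 1: 1.5, 2: 0.2, 3: 2.0, 4: 2.5, 5: 0.8}   # hypothetical relevance scores

prob = pulp.LpProblem("sentence_compression", pulp.LpMaximize)
keep = {i: pulp.LpVariable(f"keep_{i}", cat="Binary") for i in range(len(tokens))}

# Objective: total score of the retained tokens.
prob += pulp.lpSum(score[i] * keep[i] for i in range(len(tokens)))

# Global constraints (linguistically motivated in the paper; invented here for the toy example):
prob += pulp.lpSum(keep.values()) <= 4      # length budget
prob += keep[1] <= keep[0]                  # keep "said" only if its subject "officials" is kept
prob += keep[4] <= keep[3]                  # keep "collapsed" only if "talks" is kept

prob.solve(pulp.PULP_CBC_CMD(msg=False))
compressed = [t for i, t in enumerate(tokens) if keep[i].value() == 1]
print(" ".join(compressed))
```

For this toy instance the solver retains "officials said talks collapsed".
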
2. Chan, Schultz, and Malay Ghosh. "The Geometry of Estimating Functions in the Presence of Nuisance Parameters." Calcutta Statistical Association Bulletin 48, no. 3-4 (September 1998): 123–38. http://dx.doi.org/10.1177/0008068319980301.

Abstract:
The paper studies the geometry of estimating functions in the presence of nuisance parameters. The basic technique involves an idea of orthogonal projection first introduced in this context by Small and McLeish (1989, 1991, 1992, 1994). The three main topics are: (A) globally optimal estimating functions; (B) locally optimal estimating functions; (C) conditionally optimal estimating functions. A general result is derived in each case. As special cases, we extend and unify some of the results already available in the literature. In particular, as special cases of our result on globally optimal estimating functions, we obtain the results of Godambe and Thompson (1974) and Godambe (1976) with nuisance parameters. We also provide a geometric interpretation of the conditional and marginal inference of Bhapkar (1989, 1991) and Lloyd (1987). As an application of our result on locally optimal estimating functions, Godambe's (1985) result on optimal estimating functions for stochastic processes is extended to nuisance parameters. Finally, our general result on conditionally optimal estimating functions helps to generalize the findings of Godambe and Thompson (1989) to situations which admit the presence of nuisance parameters.
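
As background for the optimality notions compared above, the classical Godambe criterion for a scalar parameter without nuisance parameters can be written as follows; this is standard textbook material, not a result taken from the paper.

```latex
% Among unbiased estimating functions g with E_theta[g(X,theta)] = 0, an optimal g*
% maximizes the Godambe information (equivalently, minimizes the variance of the
% standardized estimating function g / E_theta[dg/dtheta]):
\[
  J(g) \;=\; \frac{\bigl(\mathbb{E}_{\theta}\!\left[\partial g(X,\theta)/\partial\theta\right]\bigr)^{2}}
                  {\mathbb{E}_{\theta}\!\left[g(X,\theta)^{2}\right]} .
\]
% The paper's global, local and conditional criteria extend this idea to models with
% nuisance parameters via orthogonal projection.
```
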
3. Li, Xingjie Helen, Fei Lu, and Felix X. F. Ye. "ISALT: Inference-based schemes adaptive to large time-stepping for locally Lipschitz ergodic systems." Discrete & Continuous Dynamical Systems - S 15, no. 4 (2022): 747. http://dx.doi.org/10.3934/dcdss.2021103.

Abstract:
Efficient simulation of SDEs is essential in many applications, particularly for ergodic systems that demand efficient simulation of both short-time dynamics and large-time statistics. However, locally Lipschitz SDEs often require special treatments such as implicit schemes with small time-steps to accurately simulate the ergodic measures. We introduce a framework to construct inference-based schemes adaptive to large time-steps (ISALT) from data, achieving a reduction in time by several orders of magnitude. The key is the statistical learning of an approximation to the infinite-dimensional discrete-time flow map. We explore the use of numerical schemes (such as the Euler-Maruyama, the hybrid RK4, and an implicit scheme) to derive informed basis functions, leading to a parameter inference problem. We introduce a scalable algorithm to estimate the parameters by least squares, and we prove the convergence of the estimators as data size increases.

We test the ISALT on three non-globally Lipschitz SDEs: the 1D double-well potential, a 2D multiscale gradient system, and the 3D stochastic Lorenz equation with a degenerate noise. Numerical results show that ISALT can tolerate time-step magnitudes larger than plain numerical schemes. It reaches optimal accuracy in reproducing the invariant measure when the time-step is medium-large.
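
To make the "informed basis functions plus least squares" idea concrete, here is a heavily simplified sketch for the 1D double-well SDE dX = (X - X^3)dt + sigma dW: a fine-step Euler-Maruyama trajectory is subsampled at a large time-step and the coefficients of an Euler-style parametric map are fitted by least squares. The drift, noise level, basis choice and step sizes are illustrative assumptions, not the authors' ISALT implementation.

```python
# Minimal sketch of fitting a coarse-time-step scheme from fine-step SDE data
# (illustrative only; not the authors' ISALT code).
import numpy as np

rng = np.random.default_rng(0)
sigma, dt_fine, gap = 0.5, 1e-3, 100          # coarse step = gap * dt_fine
drift = lambda x: x - x**3                    # 1D double-well potential

# Generate a fine-step Euler-Maruyama trajectory.
n_fine = 100_000
x = np.empty(n_fine)
x[0] = 0.1
for k in range(n_fine - 1):
    x[k + 1] = x[k] + drift(x[k]) * dt_fine + sigma * np.sqrt(dt_fine) * rng.standard_normal()

# Subsample at the large step and fit X_{n+1} ~ X_n + (c0 + c1 * drift(X_n)) * dt_coarse.
xc = x[::gap]
dt_coarse = gap * dt_fine
A = np.column_stack([np.ones(len(xc) - 1) * dt_coarse, drift(xc[:-1]) * dt_coarse])
b = xc[1:] - xc[:-1]
coef, *_ = np.linalg.lstsq(A, b, rcond=None)
print("fitted coefficients (c0, c1):", coef)  # c1 rescales the drift for the large step
```
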
4. Rosen, David M., Luca Carlone, Afonso S. Bandeira, and John J. Leonard. "SE-Sync: A certifiably correct algorithm for synchronization over the special Euclidean group." International Journal of Robotics Research 38, no. 2-3 (August 29, 2018): 95–125. http://dx.doi.org/10.1177/0278364918784361.

Abstract:
Many important geometric estimation problems naturally take the form of synchronization over the special Euclidean group: estimate the values of a set of unknown group elements [Formula: see text] given noisy measurements of a subset of their pairwise relative transforms [Formula: see text]. Examples of this class include the foundational problems of pose-graph simultaneous localization and mapping (SLAM) (in robotics), camera motion estimation (in computer vision), and sensor network localization (in distributed sensing), among others. This inference problem is typically formulated as a non-convex maximum-likelihood estimation that is computationally hard to solve in general. Nevertheless, in this paper we present an algorithm that is able to efficiently recover certifiably globally optimal solutions of the special Euclidean synchronization problem in a non-adversarial noise regime. The crux of our approach is the development of a semidefinite relaxation of the maximum-likelihood estimation (MLE) whose minimizer provides an exact maximum-likelihood estimate so long as the magnitude of the noise corrupting the available measurements falls below a certain critical threshold; furthermore, whenever exactness obtains, it is possible to verify this fact a posteriori, thereby certifying the optimality of the recovered estimate. We develop a specialized optimization scheme for solving large-scale instances of this semidefinite relaxation by exploiting its low-rank, geometric, and graph-theoretic structure to reduce it to an equivalent optimization problem defined on a low-dimensional Riemannian manifold, and then design a Riemannian truncated-Newton trust-region method to solve this reduction efficiently. Finally, we combine this fast optimization approach with a simple rounding procedure to produce our algorithm, SE-Sync. Experimental evaluation on a variety of simulated and real-world pose-graph SLAM datasets shows that SE-Sync is capable of recovering certifiably globally optimal solutions when the available measurements are corrupted by noise up to an order of magnitude greater than that typically encountered in robotics and computer vision applications, and does so significantly faster than the Gauss–Newton-based approach that forms the basis of current state-of-the-art techniques.
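
The flavor of the relax-then-round strategy can be conveyed with a much smaller cousin of the problem, synchronization of planar rotations: noisy relative angles are packed into a Hermitian "connection" matrix, a spectral relaxation is solved, and each entry is rounded back to the unit circle. The sketch below covers this toy SO(2) case only; it is not the SE(d) semidefinite relaxation or the Riemannian solver of SE-Sync.

```python
# Toy spectral synchronization over SO(2) (phase synchronization); illustrative only.
import numpy as np

rng = np.random.default_rng(1)
n = 20
theta_true = rng.uniform(0, 2 * np.pi, n)

# Noisy pairwise relative measurements packed into H_ij ~ exp(i * (theta_i - theta_j)).
noise = 0.05 * rng.standard_normal((n, n))
H = np.exp(1j * (theta_true[:, None] - theta_true[None, :] + noise))
H = (H + H.conj().T) / 2                      # Hermitian averaging of both directions

# Relaxation: leading eigenvector of H; rounding: project each entry to the unit circle.
w, V = np.linalg.eigh(H)
z = V[:, -1]
theta_hat = np.angle(z / z[0])                # fix the global gauge by anchoring node 0

err = np.angle(np.exp(1j * (theta_hat - (theta_true - theta_true[0]))))
print("max angular error (rad):", np.abs(err).max())
```
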
5. Inyang, Udoinyang Godwin, Emem Etok Akpan, and Oluwole Charles Akinyokun. "A Hybrid Machine Learning Approach for Flood Risk Assessment and Classification." International Journal of Computational Intelligence and Applications 19, no. 02 (June 2020): 2050012. http://dx.doi.org/10.1142/s1469026820500121.

Abstract:
Communities globally experience devastating effects, high monetary loss and loss of lives due to incidents of flood and other hazards. Inadequate information and awareness of flood hazard make the management of flood risks arduous and challenging. This paper proposes a hybridized analytic approach via unsupervised and supervised learning methodologies for knowledge discovery, clustering and prediction of flood severity levels (FSL). A two-staged unsupervised learning based on k-means and self-organizing maps (SOM) was performed on the unlabeled flood dataset. k-means based on the silhouette criterion discovered the top three representatives of the optimal number of clusters inherent in the flood dataset. Experts' judgment favored four clusters, while squared Euclidean distance was the best-performing distance measure. SOM provided cluster visuals of the input attributes within the four different groups and transformed the dataset into a labeled one. A 5-layered Adaptive Neuro Fuzzy Inference System (ANFIS) driven by a hybrid learning algorithm was applied to classify and predict FSL. ANFIS optimized by a Genetic Algorithm (GA) produced a root mean squared error (RMSE) of 0.323 and an error standard deviation of 0.408, while the Particle Swarm Optimized ANFIS model produced an RMSE of 0.288, an 11% improvement over the GA-optimized model. The results show a significant improvement in the classification and prediction of flood risks compared with the use of a single ML tool.
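
The first, unsupervised stage described above (selecting the number of k-means clusters by the silhouette criterion) can be sketched in a few lines with scikit-learn; the synthetic feature matrix stands in for the flood dataset, and the SOM and ANFIS stages are omitted.

```python
# Choosing k for k-means by the silhouette criterion (synthetic stand-in for the flood data).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
# Hypothetical unlabeled "flood" feature matrix: 300 records, 5 attributes, 4 latent groups.
X = np.vstack([rng.normal(loc=c, scale=0.7, size=(75, 5)) for c in (0, 3, 6, 9)])

scores = {}
for k in range(2, 8):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    scores[k] = silhouette_score(X, labels)

best_k = max(scores, key=scores.get)
print("silhouette by k:", {k: round(s, 3) for k, s in scores.items()})
print("selected number of clusters:", best_k)   # the study settled on four clusters
```
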
6. Huang, Shu-Chiang, Shui-Kai Chang, Chi-Chang Lai, Tzu-Lun Yuan, Jinn-Shing Weng, and Jia-Sin He. "Length–Weight Relationships, Growth Models of Two Croakers (Pennahia macrocephalus and Atrobucca nibe) off Taiwan and Growth Performance Indices of Related Species." Fishes 7, no. 5 (October 11, 2022): 281. http://dx.doi.org/10.3390/fishes7050281.

Abstract:
Information on age and growth is essential to modern stock assessment and the development of management plans for fish resources. To provide quality otolith-based estimates of growth parameters, this study performed five types of analyses on two important croakers under high fishing pressure in southwestern Taiwan, Pennahia macrocephalus (big-head pennah croaker) and Atrobucca nibe (blackmouth croaker): (1) estimation of length–weight relationships (LWR) with discussion of the differences from previous studies; (2) validation of the periodicity of ring formation using edge analysis; (3) examination of three age determination methods (integral, quartile and back-calculation methods) and selection of the most appropriate one using a k-fold cross-validation simulation; (4) determination of the representative growth models from four candidate models using a multimodel inference approach; and (5) compilation of growth parameters for all Pennahia and Atrobucca species published globally for reviewing the clusters of estimates using auximetric plots of logged growth parameters. The study observed that features of the samples affected the LWR estimates. Edge analysis supported that the growth rings were formed annually, and the cross-validation study supported that the quartile method (age was determined as the number of opaque bands on the otolith plus the quartile of the width of the marginal translucent band) provided more appropriate estimates of age. The multimodel inference approach suggested the von Bertalanffy growth model as the optimal model for P. macrocephalus and the logistic growth model for A. nibe, with asymptotic lengths and relative growth rates of 18.0 cm TL and 0.789 year⁻¹ for P. macrocephalus and 55.21 cm and 0.374 year⁻¹ for A. nibe, respectively. Auximetric plots of global estimates showed a downward trend with clusters by species. Growth rates of the two species were higher than in previous studies using the same aging structure (otolith) and from similar locations conducted a decade ago, suggesting a possible effect of increased fishing pressure and the need to establish a management framework. This study adds updated information to the global literature and provides an overview of growth parameters for the two important croakers.
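
For readers unfamiliar with the growth-model step, the sketch below fits the von Bertalanffy growth function L(t) = L_inf * (1 - exp(-K(t - t0))) to made-up age-length pairs with scipy; the data points, starting values and the other candidate models of the multimodel comparison (logistic, Gompertz, etc.) are assumptions for illustration, not the study's measurements.

```python
# Fitting a von Bertalanffy growth curve to hypothetical age-length data (not the study's data).
import numpy as np
from scipy.optimize import curve_fit

def vbgf(t, L_inf, K, t0):
    """von Bertalanffy growth function: L(t) = L_inf * (1 - exp(-K * (t - t0)))."""
    return L_inf * (1.0 - np.exp(-K * (t - t0)))

age = np.array([0.5, 1, 1.5, 2, 2.5, 3, 4, 5, 6])                        # years (hypothetical)
length = np.array([6.1, 9.8, 12.0, 13.6, 14.9, 15.7, 16.8, 17.4, 17.8])  # cm TL (hypothetical)

params, cov = curve_fit(vbgf, age, length, p0=[18.0, 0.8, -0.1])
L_inf, K, t0 = params
print(f"L_inf = {L_inf:.1f} cm, K = {K:.3f} per year, t0 = {t0:.2f} year")
```
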
7. Tian, Luogeng, Bailong Yang, Xinli Yin, Kai Kang, and Jing Wu. "Multipath Cross Graph Convolution for Knowledge Representation Learning." Computational Intelligence and Neuroscience 2021 (December 28, 2021): 1–13. http://dx.doi.org/10.1155/2021/2547905.

Abstract:
In the past, most embedding-based entity prediction methods lacked training of local core relationships, resulting in a deficiency in end-to-end training. To address this problem, we propose an end-to-end knowledge graph embedding representation method, called the TransC graph convolutional network (TransC-GCN), which combines local graph convolution with global cross learning. Firstly, multiple local semantic spaces are divided according to the largest neighbor. Secondly, a translation model is used to map the local entities and relationships into a cross vector, which serves as the input of the GCN. Thirdly, through training and learning of local semantic relations, the best entities and strongest relations are found. The optimal entity-relation combination ranking is obtained by evaluating the posterior loss function based on mutual information entropy. Experiments show that this method can obtain local entity feature information more accurately through the convolution operation of the lightweight convolutional neural network. Also, the maximum pooling operation helps to grasp the strong signal on the local features, thereby avoiding globally redundant features. Compared with mainstream triple prediction baseline models, the proposed algorithm can effectively reduce the computational complexity while achieving strong robustness. It also increases the inference accuracy of entities and relations by 8.1% and 4.4%, respectively. In short, this new method can not only effectively extract the local nodes and relationship features of the knowledge graph but also satisfy the requirements of multilayer penetration and relationship derivation of a knowledge graph.
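
As background for the "translation model" component mentioned in the abstract, the sketch below scores knowledge-graph triples in the TransE style, where a relation acts as a translation vector and plausible triples satisfy h + r ≈ t; the embeddings are random placeholders and this is not the TransC-GCN architecture itself.

```python
# TransE-style triple scoring (generic background for translation models; toy embeddings).
import numpy as np

rng = np.random.default_rng(0)
dim = 16
entities = {name: rng.normal(size=dim) for name in ["Paris", "France", "Berlin", "Germany"]}
relations = {"capital_of": rng.normal(size=dim)}

def score(h, r, t):
    """Higher is more plausible: negative distance between h + r and t."""
    return -np.linalg.norm(entities[h] + relations[r] - entities[t])

# Rank candidate tails for the query (Paris, capital_of, ?).
candidates = ["France", "Germany", "Berlin"]
ranked = sorted(candidates, key=lambda t: score("Paris", "capital_of", t), reverse=True)
print(ranked)   # with trained embeddings "France" would rank first; here the ranks are arbitrary
```
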
8. Bauer-Marschallinger, Bernhard, Senmao Cao, Mark Edwin Tupas, Florian Roth, Claudio Navacchi, Thomas Melzer, Vahid Freeman, and Wolfgang Wagner. "Satellite-Based Flood Mapping through Bayesian Inference from a Sentinel-1 SAR Datacube." Remote Sensing 14, no. 15 (July 31, 2022): 3673. http://dx.doi.org/10.3390/rs14153673.

Abstract:
Spaceborne Synthetic Aperture Radar (SAR) systems are well established for flood mapping, thanks to their high sensitivity towards water surfaces and their independence from daylight and cloud cover. Particularly capable is the 2014-launched Copernicus Sentinel-1 C-band SAR mission, with its systematic monitoring schedule featuring global land coverage in a short revisit time and a 20 m ground resolution. Yet, variable environmental conditions, low-contrasting land cover, and complex terrain pose major challenges to fully automated flood monitoring. To overcome these issues, and aiming for a robust classification, we formulate a datacube-based flood mapping algorithm that exploits the Sentinel-1 orbit repetition and a priori generated probability parameters for flood and non-flood conditions. A globally applicable flood signature is obtained from manually collected wind- and frost-free images. Through harmonic analysis of each pixel's full time series, we derive a local seasonal non-flood signal comprising the expected backscatter values for each day of year. From those predefined probability distributions, we classify incoming Sentinel-1 images by simple Bayes inference, which is computationally slim and hence suitable for near-real-time operations, and which also yields uncertainty values. The datacube-based masking of areas with no sensitivity, resulting from impeding land cover and ill-posed SAR configurations, enhances the classification robustness. We employed the algorithm on a 6-year Sentinel-1 datacube over Greece, where a major flood hit the region of Thessaly in 2018. In-depth analysis of model parameters and sensitivity, and the evaluation against microwave and optical reference flood maps, suggest excellent flood mapping skill and very satisfying classification metrics, with about 96% overall accuracy and only a few false positives. The presented algorithm is part of the ensemble flood mapping product of the Global Flood Monitoring (GFM) component of the Copernicus Emergency Management Service (CEMS).
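
The per-pixel decision rule sketched in the abstract, comparing a flood likelihood against the pixel's seasonal non-flood likelihood via Bayes' rule, can be illustrated in a few lines; the Gaussian parameters and the prior below are invented placeholders rather than the calibrated Sentinel-1 signatures used in the paper.

```python
# Minimal per-pixel Bayes classifier for flood / non-flood backscatter (illustrative parameters).
import numpy as np
from scipy.stats import norm

sigma0_db = np.array([-21.0, -13.5, -9.0, -17.2])   # observed backscatter (dB) for a few pixels

# Hypothetical probability parameters: flood water is dark; the non-flood signal is the
# pixel's expected seasonal backscatter (here a single placeholder value per pixel).
flood_mean, flood_std = -20.0, 2.0
nonflood_mean = np.array([-11.0, -12.0, -10.5, -11.5])
nonflood_std = 2.5
prior_flood = 0.1

like_flood = norm.pdf(sigma0_db, flood_mean, flood_std)
like_nonflood = norm.pdf(sigma0_db, nonflood_mean, nonflood_std)

post_flood = (like_flood * prior_flood) / (like_flood * prior_flood
                                           + like_nonflood * (1 - prior_flood))
print("P(flood | sigma0):", np.round(post_flood, 3))   # also usable as an uncertainty value
```
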
9. Hou, Jiawei, Albert I. J. M. van Dijk, Hylke E. Beck, Luigi J. Renzullo, and Yoshihide Wada. "Remotely sensed reservoir water storage dynamics (1984–2015) and the influence of climate variability and management at a global scale." Hydrology and Earth System Sciences 26, no. 14 (July 19, 2022): 3785–803. http://dx.doi.org/10.5194/hess-26-3785-2022.

Abstract:
Abstract. Many thousands of large dam reservoirs have been constructed worldwide during the last 70 years to increase reliable water supplies and support economic growth. Because reservoir storage measurements are generally not publicly available, so far there has been no global assessment of long-term dynamic changes in reservoir water volumes. We overcame this by using optical (Landsat) and altimetry remote sensing to reconstruct monthly water storage for 6695 reservoirs worldwide between 1984 and 2015. We relate reservoir storage to resilience and vulnerability and investigate interactions between precipitation, streamflow, evaporation, and reservoir water storage. This is based on a comprehensive analysis of streamflow from a multi-model ensemble and as observed at ca. 8000 gauging stations, precipitation from a combination of station, satellite and forecast data, and open water evaporation estimates. We find reservoir storage has diminished substantially for 23 % of reservoirs over the three decades, but increased for 21 %. The greatest declines were for dry basins in southeastern Australia (−29 %), southwestern USA (−10 %), and eastern Brazil (−9 %). The greatest gains occurred in the Nile Basin (+67 %), Mediterranean basins (+31 %) and southern Africa (+22 %). Many of the observed reservoir changes could be explained by changes in precipitation and river inflows, emphasizing the importance of multi-decadal precipitation changes for reservoir water storage. Uncertainty in the analysis can come from, among others, the relatively low Landsat imaging frequency for parts of the Earth and the simple geo-statistical bathymetry model used. Our results also show that there is generally little impact from changes in net evaporation on storage trends. Based on the reservoir water balance, we deduce it is unlikely that water release trends dominate global trends in reservoir storage dynamics. This inference is further supported by different spatial patterns in water withdrawal and storage trends globally. A more definitive conclusion about the impact of changes in water releases at the global or local scale would require data that unfortunately are not publicly available for the vast majority of reservoirs globally.
10. Sonali, P., and D. Nagesh Kumar. "Review of recent advances in climate change detection and attribution studies: a large-scale hydroclimatological perspective." Journal of Water and Climate Change 11, no. 1 (February 5, 2020): 1–29. http://dx.doi.org/10.2166/wcc.2020.091.

Abstract:
The rapid changes in global average surface temperature have unfathomed influences on human society, the environment, ecosystems, and the availability of food and fresh water. Multiple lines of evidence indicate that warming of the climate system is unequivocal and that human-induced effects are playing an enhanced role in climate change. It is of utmost importance to ascertain the hydroclimatological changes in order to characterize the detection and attribution (D&A) of human-induced anthropogenic influences on recent warming. Climate change detection and attribution are interrelated; their study enhances our understanding of the rudimentary causes of climate change and is hence considered a decisive element in all Intergovernmental Panel on Climate Change Assessment Reports. An extensive discussion of the relevant scientific literature on climate change D&A is needed for the scientific community to assess climate change threats in clear terms. This study reviews various processes and advances in climate change D&A analyses at global and regional scales during the past few decades. The regression-based optimal fingerprint approach is the most widely employed in climate change D&A studies. The accumulation of inferences presented in this study from numerous studies could be extremely helpful for the scientific community and policymakers as they deal with climate change adaptation and mitigation challenges.
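
The regression-based optimal fingerprint approach mentioned above amounts, in its simplest form, to a generalized least-squares regression of observations onto model-derived response patterns; the schematic below uses synthetic data and a placeholder covariance, and omits the EOF truncation and total-least-squares refinements used in real D&A studies.

```python
# Schematic optimal fingerprinting: GLS regression of observations y on fingerprints X,
# weighting by the inverse covariance of internal variability (synthetic data).
import numpy as np

rng = np.random.default_rng(2)
n = 50                                        # number of spatio-temporal "grid points"
X = np.column_stack([np.linspace(0, 1, n),               # hypothetical anthropogenic fingerprint
                     np.sin(np.linspace(0, 6, n))])       # hypothetical natural fingerprint
beta_true = np.array([1.2, 0.4])
C = 0.05 * np.eye(n)                          # internal-variability covariance (placeholder)
y = X @ beta_true + rng.multivariate_normal(np.zeros(n), C)

Cinv = np.linalg.inv(C)
beta_hat = np.linalg.solve(X.T @ Cinv @ X, X.T @ Cinv @ y)
cov_beta = np.linalg.inv(X.T @ Cinv @ X)      # detection: scaling-factor interval excludes zero
print("scaling factors:", beta_hat, "+/-", 2 * np.sqrt(np.diag(cov_beta)))
```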

Dissertations on the topic "Globally optimal inference"

1. Maicas, Suso Gabriel. "Pre-hoc and Post-hoc Diagnosis and Interpretation of Breast Magnetic Resonance Volumes." Thesis, 2018. http://hdl.handle.net/2440/120330.

Abstract:
Breast cancer is among the leading causes of death in women. Aiming to reduce the number of casualties, breast screening programs have been implemented to diagnose asymptomatic cancers due to the correlation of higher survival rates with earlier tumour detection. Although these programs are normally based on mammography, magnetic resonance imaging (MRI) is recommended for patients at high risk. The interpretation of such MRI volumes is time-consuming and prone to inter-observer variability, leading to missed cancers and a relatively high number of false positives provoking unnecessary biopsies. Consequently, computer-aided diagnosis systems are being designed to help improve the efficiency and the diagnosis outcomes of radiologists in breast screening programs. Traditional automated breast screening systems are based on a two-stage pipeline consisting of the localization of suspicious regions of interest (ROIs) and their classification to perform the diagnosis (i.e. decide about their malignancy). This process is typically ineffective due to the usual expensive inference involved in the exhaustive search for ROIs and the employment of non-optimal hand-crafted features in both stages. These issues have been partially addressed with the introduction of deep learning methods that unfortunately need large strongly annotated training datasets (voxel-wise labelling of each lesion), which tend to be expensive to acquire. Alternatively, the use of weakly labelled datasets (i.e. volume-level labels) allows diagnosis to become a supervised classification problem, where a malignancy probability is estimated after examining the entire volume. However, large weakly labelled training sets are still required. Additionally, to facilitate the adoption of such weakly trained systems in clinical practice, it is desirable that they are capable of providing the localization of lesions that justifies the automatically produced diagnosis for the whole volume. Nonetheless, current methods lack the precision required for the problem of weakly supervised lesion detection. Motivated by these limitations, we propose a number of methods that address these deficiencies. First, we propose two strongly supervised deep learning approaches that not only can be trained with relatively small datasets, but are efficient in the localization of suspicious tissue. In particular, we propose: 1) the global minimization of an energy functional containing information from the semantic segmentation produced by a deep learning model for lesion segmentation, and 2) a reinforcement learning model for suspicious region detection. Diagnosis is performed by classifying suspicious regions yielded by the reinforcement learning model. Second, aiming to reduce the burden associated with strongly annotating datasets, we propose a novel training methodology to improve the diagnosis performance on systems trained with weakly labelled datasets that contain a relatively small number of training samples. We further propose a novel 1-class saliency detector to automatically localize lesions associated with the diagnosis outcome of this model. Finally, we present a comparison between both of our proposed approaches for diagnosis and lesion detection. Experiments show that whole volume analysis with weakly labelled datasets achieves better performance for malignancy diagnosis than the strongly supervised methods. However, strongly supervised methods show better accuracy for lesion detection.
Thesis (Ph.D.) -- University of Adelaide, School of Computer Science, 2018

Conference papers on the topic "Globally optimal inference"

1. Habeeb, Haroun, Ankit Anand, Mausam, and Parag Singla. "Coarse-to-Fine Lifted MAP Inference in Computer Vision." In Twenty-Sixth International Joint Conference on Artificial Intelligence. California: International Joint Conferences on Artificial Intelligence Organization, 2017. http://dx.doi.org/10.24963/ijcai.2017/641.

Abstract:
There is a vast body of theoretical research on lifted inference in probabilistic graphical models (PGMs). However, few demonstrations exist where lifting is applied in conjunction with top of the line applied algorithms. We pursue the applicability of lifted inference for computer vision (CV), with the insight that a globally optimal (MAP) labeling will likely have the same label for two symmetric pixels. The success of our approach lies in efficiently handling a distinct unary potential on every node (pixel), typical of CV applications. This allows us to lift the large class of algorithms that model a CV problem via PGM inference. We propose a generic template for coarse-to-fine (C2F) inference in CV, which progressively refines an initial coarsely lifted PGM for varying quality-time trade-offs. We demonstrate the performance of C2F inference by developing lifted versions of two near state-of-the-art CV algorithms for stereo vision and interactive image segmentation. We find that, against flat algorithms, the lifted versions have a much superior anytime performance, without any loss in final solution quality.
2. Krajsek, Kai, and Rudolf Mester. "Marginalized Maximum a Posteriori Hyper-parameter Estimation for Global Optical Flow Techniques." In Bayesian Inference and Maximum Entropy Methods In Science and Engineering. AIP, 2006. http://dx.doi.org/10.1063/1.2423289.

3. Zhang, Zhen, Julian McAuley, Yong Li, Wei Wei, Yanning Zhang, and Qinfeng Shi. "Dynamic Programming Bipartite Belief Propagation For Hyper Graph Matching." In Twenty-Sixth International Joint Conference on Artificial Intelligence. California: International Joint Conferences on Artificial Intelligence Organization, 2017. http://dx.doi.org/10.24963/ijcai.2017/650.

Abstract:
Hyper graph matching problems have drawn attention recently due to their ability to embed higher order relations between nodes. In this paper, we formulate hyper graph matching problems as constrained MAP inference problems in graphical models. Whereas previous discrete approaches introduce several global correspondence vectors, we introduce only one global correspondence vector, but several local correspondence vectors. This allows us to decompose the problem into a (linear) bipartite matching problem and several belief propagation sub-problems. Bipartite matching can be solved by traditional approaches, while the belief propagation sub-problem is further decomposed as two sub-problems with optimal substructure. Then a newly proposed dynamic programming procedure is used to solve the belief propagation sub-problem. Experiments show that the proposed methods outperform state-of-the-art techniques for hyper graph matching.
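
The "(linear) bipartite matching problem" into which the authors decompose hyper graph matching is the classical assignment problem; as a reminder of how that sub-problem is solved by traditional approaches, the sketch below applies an off-the-shelf Hungarian-method routine to an arbitrary cost matrix.

```python
# Solving the linear bipartite matching (assignment) sub-problem with the Hungarian method.
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical node-to-node matching costs (rows: nodes of graph 1, columns: nodes of graph 2).
cost = np.array([[4.0, 1.0, 3.0],
                 [2.0, 0.5, 5.0],
                 [3.0, 2.0, 2.0]])

rows, cols = linear_sum_assignment(cost)      # minimizes the total matching cost
print("matching:", list(zip(rows, cols)), "total cost:", cost[rows, cols].sum())
```
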
4. Kuzman, Boris, and Biljana Petković. "ADAPTIVE NEURO FUZZY ESTIMATION OF THE OPTIMAL COVID-19 PREDICTORS FOR GLOBAL TOURISM." In The Sixth International Scientific Conference - TOURISM CHALLENGES AMID COVID-19, Thematic Proceedings. FACULTY OF HOTEL MANAGEMENT AND TOURISM IN VRNJAČKA BANJA UNIVERSITY OF KRAGUJEVAC, 2021. http://dx.doi.org/10.52370/tisc2194bk.

Abstract:
COVID-19 is a pandemic that emerged as a result of droplet infection with the 2019 novel coronavirus (2019-nCoV). Recognition of its risk and prognostic factors is critical due to its rapid dissemination and high case-fatality rate. The tourism industry, as one of the largest industries, has suffered greatly in the pandemic. The main aim of the study was to present travelers' reactions during the pandemic using a data mining methodology. The effect of eleven predictors for COVID-19 was also analyzed. The predictors used are: population density, urban population percentage, number of hospital beds, female and male lung size, median age, crime index, population number, smoking index and percentage of females. As the output factors, infection rate, death rate and recovery rate were used. The analysis was performed with an adaptive neuro fuzzy inference system (ANFIS). The results revealed that the frequency of the words used during the pandemic has the highest impact on travelers' reactions. The number of hospital beds and the population number form the optimal combination for the best prediction of the COVID-19 infection rate.
5. Rinderle, James R., and V. Krishnan. "Constraint Reasoning in Concurrent Design." In ASME 1990 Design Technical Conferences. American Society of Mechanical Engineers, 1990. http://dx.doi.org/10.1115/detc1990-0108.

Abstract:
Abstract One paradigm of concurrent design is based on the simultaneous consideration of a broad range of life-cycle constraints including those arising from function, manufacturing and maintenance. This simultaneous treatment of life-cycle issues results in a multitude of constraints, which not only increase the complexity of finding a design solution, but also make it difficult to understand the trends and interactions underlying the design. It is our goal to enhance the designer’s ability to identify and discriminate those constraints that critically impact the design from those that are irrelevant. We propose an interval analysis based approach, which is augmented with monotonicity and dominance principles. The approach helps in identifying regions of the design space where constraints possess certain desirable properties. It also enables reasoning with constraints in these regions. The regional inferences can then be reassembled to obtain global results. These ideas have been applied in the concurrent design of a fan blade, to identify the dominant, active and redundant constraints, enabling the designer to more clearly perceive and base his decisions on the critical design consideration. Furthermore, the identification of dominant constraints permits the easy evaluation of the significance of newly asserted constraints and frequently facilitates the automatic formulation of noniterative constraint satisfaction methods which guarantee a globally optimal design.
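
A minimal illustration of the interval-analysis idea, classifying a constraint as redundant over a region of the design space when its interval bound shows it can never be active there, is sketched below with invented linear constraints; the monotonicity and dominance reasoning of the paper is not reproduced.

```python
# Interval screening of inequality constraints g(x) <= 0 over a box of the design space.
# A constraint whose interval upper bound is <= 0 on the box is redundant there (never active);
# one whose lower bound is > 0 makes the box infeasible. The constraints are invented examples.

def interval_linear(lo, hi, coeffs, const):
    """Interval bounds of const + sum(c * x_i) for x_i in [lo_i, hi_i]."""
    g_lo = const + sum(c * (l if c >= 0 else h) for c, l, h in zip(coeffs, lo, hi))
    g_hi = const + sum(c * (h if c >= 0 else l) for c, l, h in zip(coeffs, lo, hi))
    return g_lo, g_hi

box_lo, box_hi = [1.0, 2.0], [2.0, 4.0]        # region of the design space under study
constraints = {
    "stress":     ([ 0.5,  0.3], -4.0),        # 0.5*x1 + 0.3*x2 - 4 <= 0
    "deflection": ([-1.0,  0.2], -1.5),        # -x1 + 0.2*x2 - 1.5 <= 0
    "clearance":  ([ 1.0,  1.0], -5.0),        # x1 + x2 - 5 <= 0
}

for name, (coeffs, const) in constraints.items():
    g_lo, g_hi = interval_linear(box_lo, box_hi, coeffs, const)
    if g_hi <= 0:
        status = "redundant in this region"
    elif g_lo > 0:
        status = "violated everywhere in this region"
    else:
        status = "potentially active"
    print(f"{name:11s} bounds = [{g_lo:+.2f}, {g_hi:+.2f}]  ->  {status}")
```
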
6. Yang, Chaoqi, Cao Xiao, Fenglong Ma, Lucas Glass, and Jimeng Sun. "SafeDrug: Dual Molecular Graph Encoders for Recommending Effective and Safe Drug Combinations." In Thirtieth International Joint Conference on Artificial Intelligence {IJCAI-21}. California: International Joint Conferences on Artificial Intelligence Organization, 2021. http://dx.doi.org/10.24963/ijcai.2021/514.

Abstract:
Medication recommendation is an essential task of AI for healthcare. Existing works focused on recommending drug combinations for patients with complex health conditions solely based on their electronic health records. Thus, they have the following limitations: (1) some important data such as drug molecule structures have not been utilized in the recommendation process, and (2) drug-drug interactions (DDI) are modeled implicitly, which can lead to sub-optimal results. To address these limitations, we propose a DDI-controllable drug recommendation model named SafeDrug to leverage drugs' molecule structures and model DDIs explicitly. SafeDrug is equipped with a global message passing neural network (MPNN) module and a local bipartite learning module to fully encode the connectivity and functionality of drug molecules. SafeDrug also has a controllable loss function to effectively control the DDI level in the recommended drug combinations. On a benchmark dataset, SafeDrug is shown to reduce DDI by 19.43% and to improve the Jaccard similarity between recommended and actually prescribed drug combinations by 2.88% relative to previous approaches. Moreover, SafeDrug requires many fewer parameters than previous deep-learning-based approaches, leading to about 14% faster training and around a 2× speed-up in inference.

Organizational reports on the topic "Globally optimal inference"

1. Rajarajan, Kunasekaran, Alka Bharati, Hirdayesh Anuragi, Arun Kumar Handa, Kishor Gaikwad, Nagendra Kumar Singh, Kamal Prasad Mohapatra, et al. Status of perennial tree germplasm resources in India and their utilization in the context of global genome sequencing efforts. World Agroforestry, 2020. http://dx.doi.org/10.5716/wp20050.pdf.

Abstract:
Tree species are characterized by their perennial growth habit, woody morphology, long juvenile phase, mostly outcrossing behaviour, highly heterozygous genetic makeup, and relatively high genetic diversity. Economically important trees have been an integral part of human life due to their provision of timber, fruit, fodder, and medicinal and/or health benefits. Despite their widespread application in agriculture and their industrial and medicinal value, the molecular aspects of key economic traits of many tree species remain largely unexplored. Over the past two decades, research on forest tree genomics has generally lagged behind that of other agronomic crops. Genomic research on trees is motivated by the need to support genetic improvement programmes, mostly for food trees and timber, and to develop diagnostic tools to assist in recommendations for the optimal conservation, restoration and management of natural populations. Research on long-lived woody perennials is extending our molecular knowledge and understanding of complex life histories and adaptations to the environment, enriching a field that has traditionally drawn its biological inference from a few short-lived herbaceous species. These concerns have fostered research aimed at deciphering the genomic basis of complex traits that are related to the adaptive value of trees. This review summarizes the highlights of tree genomics and offers some priorities for accelerating progress in the next decade.