Journal articles on the topic 'Over-constrained temporal problems'

Consult the top 22 journal articles for your research on the topic 'Over-constrained temporal problems.' Each entry gives the full bibliographic reference, followed by the publication's abstract whenever one is available in the metadata.

1. Yu, Peng, Brian Williams, Cheng Fang, Jing Cui, and Patrik Haslum. "Resolving Over-Constrained Temporal Problems with Uncertainty through Conflict-Directed Relaxation." Journal of Artificial Intelligence Research 60 (October 30, 2017): 425–90. http://dx.doi.org/10.1613/jair.5431.

Abstract:
Over-subscription, that is, being assigned too many things to do, is commonly encountered in temporal scheduling problems. As human beings, we often want to do more than we can actually do, and we underestimate how long each task takes. Decision makers can benefit from aids that identify when these failure situations are likely, the root causes of the failures, and their resolutions. In this paper, we present a decision assistant that helps users resolve over-subscribed temporal problems. The system works like an experienced advisor that can quickly identify the cause of failure underlying a temporal problem and compute resolutions. The core of the decision assistant is the Best-first Conflict-Directed Relaxation (BCDR) algorithm, which detects conflicting sets of constraints within temporal problems and computes continuous relaxations for them that weaken constraints to the minimum extent necessary, instead of removing them completely. BCDR is an extension of the Conflict-Directed A* algorithm, first developed in the model-based reasoning community to compute the most likely system diagnoses or reconfigurations. It generalizes discrete conflicts and relaxations to hybrid conflicts and relaxations, which denote minimal inconsistencies in, and minimal relaxations of, both discrete and continuous relaxable constraints. In addition, BCDR can handle temporal uncertainty, expressed as either set-bounded or probabilistic durations, and can compute preferred trade-offs between the risk of violating a schedule requirement and the loss of utility from weakening those requirements. BCDR has been applied to several decision support applications in different domains, including deep-sea exploration, urban travel planning, and transit system management. It has demonstrated its effectiveness in helping users resolve over-subscribed scheduling problems and evaluate the robustness of existing solutions. In our benchmark experiments, BCDR has also demonstrated its efficiency in solving large-scale scheduling problems in the aforementioned domains. Thanks to its conflict-driven approach to computing relaxations, BCDR achieves one to two orders of magnitude improvement in runtime performance compared to state-of-the-art numerical solvers.
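
To make the conflict-extraction step concrete: a simple temporal constraint l ≤ t_v − t_u ≤ u maps to two weighted edges of a distance graph (one with weight u, one with weight −l), and the problem is consistent exactly when that graph has no negative cycle; the edges of a negative cycle form a conflict. The sketch below is a minimal Python illustration, not BCDR itself (which enumerates conflicts best-first and solves a linear program over all relaxable constraints in a conflict): it finds one conflict with Bellman-Ford and weakens a single cheapest relaxable bound just enough to cancel the cycle. The edge labels and the cost table are hypothetical.

```python
def find_conflict(num_nodes, edges):
    """Return the edges of a negative cycle in an STN distance graph, or None.

    edges: list of (u, v, w, name), each meaning the constraint t_v - t_u <= w.
    The STN is consistent iff the distance graph has no negative cycle, so a
    negative cycle is exactly a conflict among the constraints lying on it.
    """
    dist = [0.0] * num_nodes            # implicit zero-weight source to every node
    pred = [None] * num_nodes           # predecessor edge, for cycle recovery
    relaxed = None
    for _ in range(num_nodes):          # n rounds: a relaxation in the final
        relaxed = None                  # round implies a negative cycle
        for e in edges:
            u, v, w, _ = e
            if dist[u] + w < dist[v] - 1e-12:
                dist[v], pred[v], relaxed = dist[u] + w, e, v
    if relaxed is None:
        return None                     # consistent: no conflict to report
    for _ in range(num_nodes):          # walk back until we are inside the cycle
        relaxed = pred[relaxed][0]
    cycle, node = [], relaxed
    while True:
        e = pred[node]
        cycle.append(e)
        node = e[0]
        if node == relaxed:
            return cycle

def relax_conflict(conflict, relax_cost):
    """Weaken one relaxable bound minimally so the cycle weight becomes zero."""
    slack = -sum(w for (_, _, w, _) in conflict)        # > 0 for a conflict
    names = [n for (_, _, _, n) in conflict if n in relax_cost]
    if not names:
        return None                                     # conflict is unresolvable
    cheapest = min(names, key=lambda n: relax_cost[n])
    return {cheapest: slack}                            # add `slack` to that bound
```

Spreading the slack across several constraints in proportion to their costs, the continuous-relaxation step BCDR solves as a small linear program, follows the same pattern; here one bound absorbs all of it for brevity.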

2. Yu, Peng, Cheng Fang, and Brian Williams. "Resolving Uncontrollable Conditional Temporal Problems Using Continuous Relaxations." Proceedings of the International Conference on Automated Planning and Scheduling 24 (May 11, 2014): 341–48. http://dx.doi.org/10.1609/icaps.v24i1.13623.

Abstract:
Uncertainty is commonly encountered in temporal scheduling and planning problems, and can often lead to over-constrained situations. Previous relaxation algorithms for over-constrained temporal problems only work with requirement constraints, whose outcomes can be controlled by the agents. When applied to uncontrollable durations, these algorithms may only satisfy a subset of the random outcomes, and hence their relaxations may fail during execution. In this paper, we present a new relaxation algorithm, Conflict-Directed Relaxation with Uncertainty (CDRU), which generates relaxations that restore the controllability of conditional temporal problems with uncontrollable durations. CDRU extends the Best-first Conflict-Directed Relaxation (BCDR) algorithm to uncontrollable temporal problems. It generalizes the conflict-learning process to extract conflicts from strong and dynamic controllability checking algorithms, and resolves the conflicts by both relaxing constraints and tightening uncontrollable durations. Empirical results on a range of trip scheduling problems show that CDRU is efficient in resolving large-scale uncontrollable problems: computing strongly controllable relaxations takes time of the same order of magnitude as computing consistent relaxations that do not account for uncontrollable durations. While computing dynamically controllable relaxations takes two orders of magnitude more time, it provides significant improvements in solution quality compared to strongly controllable relaxations.

3. Mouhoub, Malek. "Stochastic search versus genetic algorithms for solving real time and over-constrained temporal constraint problems." International Journal of Knowledge-based and Intelligent Engineering Systems 9, no. 1 (January 29, 2005): 33–41. http://dx.doi.org/10.3233/kes-2005-9104.

4. Chen, Jingkai, Yuening Zhang, Cheng Fang, and Brian C. Williams. "Generalized Conflict-Directed Search for Optimal Ordering Problems." Proceedings of the International Symposium on Combinatorial Search 12, no. 1 (July 22, 2021): 46–54. http://dx.doi.org/10.1609/socs.v12i1.18550.

Abstract:
Solving planning and scheduling problems for multiple tasks with highly coupled state and temporal constraints is notoriously challenging. An appealing approach to effectively decouple the problem is to judiciously order the events such that decisions can be made over sequences of tasks. As many problems encountered in practice are over-constrained, we must instead find relaxed solutions in which certain requirements are dropped. This motivates a formulation of optimality with respect to the costs of relaxing constraints, and the problem of finding an optimal ordering under which this relaxation cost is minimized. In this paper, we present Generalized Conflict-directed Ordering (GCDO), a branch-and-bound ordering method that generates an optimal total order of events by leveraging generalized conflicts of both inconsistency and suboptimality, obtained from sub-solvers, for cost estimation and solution-space pruning. Due to its ability to reason over generalized conflicts, GCDO is much more efficient at finding high-quality total orders than the previous conflict-directed approach, CDITO. We demonstrate this by benchmarking against CDITO and Mixed-Integer Linear Programming (MILP) on temporal network configuration problems, which involve managing networks over time and making necessary trade-offs between network flows. Within a runtime limit, our algorithm solves two orders of magnitude more benchmark problems to optimality than CDITO, and twice as many as MILP.
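
The branch-and-bound skeleton underneath such ordering methods is compact, and it is where the conflicts do their work. In the sketch below, `lower_bound` and `full_cost` are hypothetical callbacks standing in for the sub-solvers (in GCDO, the bounds come from learned generalized conflicts); any admissible `lower_bound` preserves optimality of the returned order.

```python
import heapq
from itertools import count

def branch_and_bound_order(events, lower_bound, full_cost):
    """Best-first branch and bound over total orders (an illustrative sketch).

    lower_bound(prefix): admissible (never-overestimating) estimate of the
                         cheapest completion of a partial order.
    full_cost(order):    true relaxation cost of a complete total order.
    """
    best_cost, best_order = float("inf"), None
    tie = count()                                  # heap tiebreaker
    frontier = [(0.0, next(tie), ())]
    while frontier:
        bound, _, prefix = heapq.heappop(frontier)
        if bound >= best_cost:
            continue                               # pruned by the incumbent
        if len(prefix) == len(events):
            cost = full_cost(prefix)
            if cost < best_cost:
                best_cost, best_order = cost, prefix
            continue
        for e in events:                           # branch: extend the order
            if e not in prefix:
                child = prefix + (e,)
                b = lower_bound(child)
                if b < best_cost:                  # keep only promising children
                    heapq.heappush(frontier, (b, next(tie), child))
    return best_order, best_cost
```

All of the leverage is in how sharp `lower_bound` is; GCDO's contribution is extracting sharp bounds from conflicts of both inconsistency and suboptimality rather than from a fixed heuristic.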

5. Mouhoub, Malek. "A Hopfield-Type Neural Network Based Model for Temporal Constraints." International Journal on Artificial Intelligence Tools 13, no. 03 (September 2004): 533–45. http://dx.doi.org/10.1142/s0218213004001673.

Abstract:
In this paper we present an approximation method based on a discrete Hopfield neural network (DHNN) for solving temporal constraint satisfaction problems. This method is of interest for problems involving numeric and symbolic temporal constraints, where a solution satisfying the constraints of the problem needs to be found within a given deadline. More precisely, the method has the ability to provide a solution whose quality is proportional to the allocated processing time, where quality corresponds to the number of satisfied constraints. This property is very important for real-world applications, including reactive scheduling and planning, and also for over-constrained problems where a complete solution cannot be found. An experimental study of the proposed DHNN-based method, in terms of time cost and quality of the solution provided, shows promising results compared to exact methods based on branch and bound and to approximation methods based on stochastic local search.
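
The anytime contract described above is easy to state in miniature. The sketch below is not the paper's DHNN (which encodes the constraints in a weight matrix and updates neurons so that the network energy tracks violations); it is a generic greedy single-variable update loop with the same interface: interrupt it at the deadline and it returns the best assignment found so far, measured by the number of satisfied constraints.

```python
import random
import time

def anytime_max_csp(variables, domains, constraints, deadline_s):
    """Anytime MAX-CSP search: solution quality grows with allotted time.

    variables:   list of variable names
    domains:     {var: list of candidate values}
    constraints: list of (scope, predicate); predicate takes the values of
                 the variables in scope and returns True when satisfied.
    """
    assign = {v: random.choice(domains[v]) for v in variables}

    def violations(a):
        return sum(1 for scope, pred in constraints
                   if not pred(*(a[v] for v in scope)))

    best, best_viol = dict(assign), violations(assign)
    end = time.monotonic() + deadline_s
    while time.monotonic() < end and best_viol > 0:
        v = random.choice(variables)          # asynchronous single-unit update
        assign[v] = min(domains[v],           # greedy: can never get worse
                        key=lambda d: violations({**assign, v: d}))
        if violations(assign) < best_viol:
            best, best_viol = dict(assign), violations(assign)
    return best, best_viol                    # quality ~ time allocated
```

A usable version needs an escape from local minima (noise, restarts, or the Hopfield energy formulation itself); the point here is only the deadline-bounded, quality-proportional interface.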

6. Fang, Cheng, Andrew J. Wang, and Brian C. Williams. "Chance-constrained Static Schedules for Temporally Probabilistic Plans." Journal of Artificial Intelligence Research 75 (December 7, 2022): 1323–72. http://dx.doi.org/10.1613/jair.1.13636.

Abstract:
Time management under uncertainty is essential to large-scale projects. From space exploration to industrial production, there is a need to schedule and perform activities given complex specifications on timing. In order to generate schedules that are robust to uncertainty in the duration of activities, prior work has focused on a problem framing that uses an interval-bounded uncertainty representation. However, such approaches are unable to take advantage of known probability distributions over durations. In this paper we concentrate on a probabilistic formulation of temporal problems with uncertain durations, called the probabilistic simple temporal problem. As distributions often have an unbounded range of outcomes, we consider chance-constrained solutions, with guarantees on the probability of meeting temporal constraints. By considering distributions over uncertain durations, we are able to use risk as a resource, reason over the relative likelihood of outcomes, and derive higher-utility solutions. We first demonstrate our approach by encoding the problem as a convex program. We then develop a more efficient hybrid algorithm whose parent solver generates risk allocations and whose child solver generates schedules for a particular risk allocation. The child is made efficient by leveraging existing interval-bounded scheduling algorithms, while the parent is made efficient by extracting conflicts over risk allocations. We perform numerical experiments to show the advantages of reasoning over probabilistic uncertainty by comparing the utility of schedules generated with risk allocation against those generated from reasoning over bounded uncertainty. We also empirically show that solution time is greatly reduced by incorporating conflict-directed risk allocation.
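
The parent/child decomposition rests on a standard device for chance constraints; the formulation below is a sketch of that device (the paper's encoding may differ in detail). Splitting the global risk bound into per-constraint budgets via Boole's inequality leaves the child solver facing only deterministic interval bounds:

```latex
% Chance-constrained schedule: controllable events s, random durations D,
% temporal constraints c_i, global risk bound \Delta, utility U.
\max_{s}\; U(s)
\quad\text{s.t.}\quad
\Pr\!\Big[\textstyle\bigwedge_i c_i(s, D)\Big] \;\ge\; 1 - \Delta .

% Boole's inequality, \Pr[\bigvee_i \neg c_i] \le \sum_i \Pr[\neg c_i],
% gives a conservative decomposition over per-constraint budgets \delta_i:
\Pr\big[\neg c_i(s, D)\big] \;\le\; \delta_i \;\;\forall i,
\qquad
\textstyle\sum_i \delta_i \;\le\; \Delta .
```

Each budget δ_i truncates the corresponding duration distribution to an interval, so the child can reuse interval-bounded scheduling algorithms unchanged while the parent searches over the budget vector.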

7. Yoshimoto, Atsushi, and Patrick Asante. "Inter-Temporal Aggregation for Spatially Explicit Optimal Harvest Scheduling under Area Restrictions." Forest Science 67, no. 5 (September 17, 2021): 587–606. http://dx.doi.org/10.1093/forsci/fxab025.

Abstract:
We propose a new approach to solving inter-temporal unit aggregation issues under maximum opening size requirements using two models. The first model is based on a Model I formulation with static harvest treatments for harvest activities. This model identifies periodic harvest activities using a set of constraints for inter-temporal aggregation. The second model is based on a Model II formulation, which uses dynamic harvest treatments and incorporates periodic harvest activities directly into the model formulation. The proposed approach contributes to the literature on spatially constrained harvest scheduling problems, as it allows a pattern of unit aggregation to change across multiple harvests over time, that is, inter-temporal aggregation under a maximum opening size requirement over period-specific durations. The main idea of the proposed approach for inter-temporal aggregation is to use a multiple-layer scheme for a set of spatial constraints, adapted from a maximum-flow specification on a spatial forest unit network and a sequential triangle connection to create fully connected feasible clusters. By dividing the planning horizon into period-specific durations for different spatial aggregation patterns, the models can complete inter-temporal spatial aggregation over the planning horizon under a maximum opening size requirement per duration.

8. Snyder, Stephanie, and Charles ReVelle. "Temporal and spatial harvesting of irregular systems of parcels." Canadian Journal of Forest Research 26, no. 6 (June 1, 1996): 1079–88. http://dx.doi.org/10.1139/x26-119.

Abstract:
Spatial management issues have assumed a central position in planning for forest ecosystems in the United States on both public and private lands. The arrangement of management activities, especially harvesting activities, can often have adverse impacts on other neighboring areas of the forest. Thus, spatially explicit programming models, which can account for or prevent certain arrangements of activities or land allocations through the use of harvest adjacency constraints, have received considerable attention in the literature. The need for spatial specificity in programming models has led to the development of integer programming or mixed integer programming models. Given that integer programming problems are often viewed as a difficult class of problems to solve, heuristic solution methods have most often been used to solve spatially constrained forest management models. In this paper, a discrete (0–1) integer programming model that maximizes harvested timber volume over a multiperiod time horizon subject to harvest adjacency constraints is developed and tested for irregular, realistic systems of parcels. This model performed well computationally for many example configurations and was solved exactly using the simplex algorithm and limited branching and bounding. Certain spatial configurations with long time horizons did, however, require a nontrivial amount of branching and bounding. The model was tested using both contrived and real spatial data sets.
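
Models of this kind are short to state. The sketch below, using the PuLP modelling library, is a toy version with hypothetical data shapes: it maximizes harvested volume, cuts each parcel at most once over the horizon, and forbids cutting adjacent parcels in the same period (the simplest pairwise form of an adjacency constraint). Real formulations add green-up delays, flow, and volume-evenness constraints.

```python
import pulp

def harvest_model(volumes, adjacent, periods):
    """0-1 harvest scheduling with pairwise adjacency constraints (toy sketch).

    volumes:  {parcel: {period: harvestable volume}}
    adjacent: iterable of (p, q) pairs of neighbouring parcels
    periods:  list of period indices
    """
    parcels = list(volumes)
    m = pulp.LpProblem("harvest", pulp.LpMaximize)
    x = pulp.LpVariable.dicts("cut", (parcels, periods), cat="Binary")

    # Objective: total volume harvested over the planning horizon.
    m += pulp.lpSum(volumes[p][t] * x[p][t] for p in parcels for t in periods)

    # Each parcel is harvested at most once.
    for p in parcels:
        m += pulp.lpSum(x[p][t] for t in periods) <= 1

    # Adjacency: neighbours may not be cut in the same period.
    for (p, q) in adjacent:
        for t in periods:
            m += x[p][t] + x[q][t] <= 1

    m.solve(pulp.PULP_CBC_CMD(msg=False))
    return {(p, t): x[p][t].value() for p in parcels for t in periods}
```

That such models often solve with only limited branching on realistic, irregular parcel systems is precisely the paper's computational finding.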

9. Abbasian, Reza, and Malek Mouhoub. "A New Parallel GA-Based Method for Constraint Satisfaction Problems." International Journal of Computational Intelligence and Applications 15, no. 03 (September 2016): 1650017. http://dx.doi.org/10.1142/s1469026816500176.

Abstract:
Despite some success of Genetic Algorithms (GAs) in tackling Constraint Satisfaction Problems (CSPs), they generally suffer from poor crossover operators. In order to overcome this limitation in practice, we propose a novel crossover specifically designed for solving CSPs, including Temporal CSPs (TCSPs). Together with a variable ordering heuristic and an integration into a parallel architecture, the proposed crossover enables the solving of large and hard problem instances, as demonstrated by experimental tests conducted on randomly generated CSPs and TCSPs based on the model RB. We demonstrate, through these tests, that our proposed method is superior to known GA-based techniques for CSPs. In addition, we show that we are able to compete with the efficient MAC-based Abscon 109 solver on random problem instances as well as on instances taken from Lecoutre's CSP library. Finally, we conducted additional tests on very large consistent and over-constrained CSP and TCSP instances to show the ability of our method to deal with constraint problems in real time, that is, to return a solution whose quality (the number of satisfied constraints) depends on the time allocated for computation.

10. Xi, Yanxin, Luyan Ji, and Xiurui Geng. "Pen Culture Detection Using Filter Tensor Analysis with Multi-Temporal Landsat Imagery." Remote Sensing 12, no. 6 (March 22, 2020): 1018. http://dx.doi.org/10.3390/rs12061018.

Abstract:
Aquaculture plays an important role in China's total fisheries production nowadays, and it leads to problems such as water quality degradation, which has a damaging effect on the sustainable development of the environment. Among the many forms of aquaculture that deteriorate water quality, disorderly pen culture is especially severe. Pen culture began very early in Yangchenghu Lake and Taihu Lake in China, and part of it still exists. Thus, it is of great significance to evaluate the distribution and area of the pen culture in the two lakes. However, the traditional method for pen culture detection is based on direct field measurement, which is labor- and time-consuming. At present, with the development of remote sensing technologies, some target detection algorithms for multi/hyper-spectral data have been used in pen culture detection, but most of them are intended for single-temporal remote sensing data. Recently, a target detection algorithm called filter tensor analysis (FTA), which is specially designed for multi-temporal remote sensing data, has been reported and has achieved better detection results than traditional single-temporal methods in many cases. This paper mainly aims to investigate the pen culture in Yangchenghu Lake and Taihu Lake with FTA applied to multi-temporal Landsat imagery, after determining the optimal combination of time phases of the Landsat data. Furthermore, the suitability and superiority of FTA over Constrained Energy Minimization (CEM) for pen culture detection were tested. The experiments on the data of the two lakes show that FTA can detect pen culture much more accurately than CEM with Landsat data of selected bands and a limited number of time phases.

11. Gourdji, S. M., A. I. Hirsch, K. L. Mueller, V. Yadav, A. E. Andrews, and A. M. Michalak. "Regional-scale geostatistical inverse modeling of North American CO2 fluxes: a synthetic data study." Atmospheric Chemistry and Physics 10, no. 13 (July 8, 2010): 6151–67. http://dx.doi.org/10.5194/acp-10-6151-2010.

Abstract:
A series of synthetic data experiments is performed to investigate the ability of a regional atmospheric inversion to estimate grid-scale CO2 fluxes during the growing season over North America. The inversions are performed within a geostatistical framework without the use of any prior flux estimates or auxiliary variables, in order to focus on the atmospheric constraint provided by the nine towers collecting continuous, calibrated CO2 measurements in 2004. Using synthetic measurements and their associated concentration footprints, flux and model-data mismatch covariance parameters are first optimized, and then fluxes and their uncertainties are estimated at three different temporal resolutions. These temporal resolutions, which include a four-day average, a four-day-average diurnal cycle with 3-hourly increments, and 3-hourly fluxes, are chosen to help assess the impact of temporal aggregation errors on the estimated fluxes and covariance parameters. Estimating fluxes at a temporal resolution that can adjust the diurnal variability is found to be critical both for recovering covariance parameters directly from the atmospheric data, and for inferring accurate ecoregion-scale fluxes. Accounting for both spatial and temporal a priori covariance in the flux distribution is also found to be necessary for recovering accurate a posteriori uncertainty bounds on the estimated fluxes. Overall, the results suggest that even a fairly sparse network of 9 towers collecting continuous CO2 measurements across the continent, used with no auxiliary information or prior estimates of the flux distribution in time or space, can be used to infer relatively accurate monthly ecoregion scale CO2 surface fluxes over North America within estimated uncertainty bounds. Simulated random transport error is shown to decrease the quality of flux estimates in under-constrained areas at the ecoregion scale, although the uncertainty bounds remain realistic. While these synthetic data inversions do not consider all potential issues associated with using actual measurement data, e.g. systematic transport errors or problems with the boundary conditions, they help to highlight the impact of inversion setup choices, and help to provide a baseline set of CO2 fluxes for comparison with estimates from future real-data inversions.

12. Möhrle, Kathrin, Hugo E. Reyes-Aldana, Johannes Kollmann, and Leonardo H. Teixeira. "Suppression of an Invasive Native Plant Species by Designed Grassland Communities." Plants 10, no. 4 (April 15, 2021): 775. http://dx.doi.org/10.3390/plants10040775.

Abstract:
Grassland biodiversity is declining due to climatic change, land-use intensification, and establishment of invasive plant species. Excluding or suppressing invasive species is a challenge for grassland management. An example is Jacobaea aquatica, an invasive native plant in wet grasslands of Central Europe, that is causing problems to farmers by being poisonous, overabundant, and fast spreading. This study aimed at testing designed grassland communities in a greenhouse experiment, to determine key drivers of initial J. aquatica suppression, thus dismissing the use of pesticides. We used two base communities (mesic and wet grasslands) with three plant traits (plant height, leaf area, seed mass), that were constrained and diversified based on the invader traits. Native biomass, community-weighted mean trait values, and phylogenetic diversity (PD) were used as explanatory variables to understand variation in invasive biomass. The diversified traits leaf area and seed mass, PD, and native biomass significantly affected the invader. High native biomass permanently suppressed the invader, while functional traits needed time to develop effects; PD effects were significant at the beginning of the experiment but disappeared over time. Due to complexity and temporal effects, community weighted mean traits proved to be moderately successful for increasing invasion resistance of designed grassland communities.

13. Zolles, Tobias, Fabien Maussion, Stephan Peter Galos, Wolfgang Gurgiser, and Lindsey Nicholson. "Robust uncertainty assessment of the spatio-temporal transferability of glacier mass and energy balance models." Cryosphere 13, no. 2 (February 8, 2019): 469–89. http://dx.doi.org/10.5194/tc-13-469-2019.

Abstract:
Energy and mass-balance modelling of glaciers is a key tool for climate impact studies of future glacier behaviour. By incorporating many of the physical processes responsible for surface accumulation and ablation, such models offer more insight than simpler statistical models and are believed to suffer less from problems of stationarity when applied under changing climate conditions. However, this view is challenged by the widespread use of parameterizations for some physical processes, which introduces a statistical calibration step. We argue that the reported uncertainty in modelled mass balance (and associated energy flux components) is likely to be understated in modelling studies that do not use spatio-temporal cross-validation and that rely on a single performance measure for model optimization. To demonstrate the importance of these principles, we present a rigorous sensitivity and uncertainty assessment workflow applied to a modelling study of two glaciers in the European Alps, extending classical best-guess approaches. The procedure begins with a reduction of the model parameter space using a global sensitivity assessment that identifies the parameters to which the model responds most sensitively. We find that the model sensitivity to individual parameters varies considerably in space and time, indicating that a single stated model sensitivity value is unlikely to be realistic. The model is most sensitive to parameters related to snow albedo and vertical gradients of the meteorological forcing data. We then apply a Monte Carlo multi-objective optimization based on three performance measures: model bias and mean absolute deviation in the upper and lower glacier parts, with glaciological mass balance data measured at individual stake locations used as reference. This procedure generates an ensemble of optimal parameter solutions which are equally valid. The range of parameters associated with these ensemble members is used to estimate the cross-validated uncertainty of the model output and computed energy components. The parameter values for the optimal solutions vary widely, and considering longer calibration periods does not systematically result in better-constrained parameter choices. The resulting mass balance uncertainties reach up to 1300 kg m⁻², with the spatial and temporal transfer errors having the same order of magnitude. The uncertainty of surface energy flux components over the ensemble at the point scale reaches up to 50% of the computed flux. The largest absolute uncertainties originate from the short-wave radiation and albedo parameterizations, followed by the turbulent fluxes. Our study highlights the need for due caution and realistic error quantification when applying such models to regional glacier modelling efforts, or for projections of glacier mass balance in climate settings that are substantially different from the conditions in which the model was optimized.

14. Wei, Yujiao, Lin Zhu, Yun Chen, Xinyu Cao, and Huilin Yu. "Spatiotemporal Variations in Drought and Vegetation Response in Inner Mongolia from 1982 to 2019." Remote Sensing 14, no. 15 (August 7, 2022): 3803. http://dx.doi.org/10.3390/rs14153803.

Abstract:
Drought events cause ecological problems, including reduced water resources and degraded vegetation. Quantifying vegetation responses to drought is essential for ecological management. However, in existing research, the response relationships (correlations and lags) were typically determined based on the Pearson correlation coefficient, and the resultant lag times were constrained by the spatial and temporal resolutions of the analyzed data. Inner Mongolia is an important ecological barrier in northern China. Ecological security is one of the most concerning issues for the region's sustainable development. Herein, we combined Global Inventory Modeling and Mapping Studies (GIMMS) normalized difference vegetation index (NDVI3g) with Système Probatoire d'Observation de la Terre Végétation (SPOT-VGT) NDVI data through spatial downscaling. The obtained 1 km-resolution NDVI dataset spanning Inner Mongolia from 1982 to 2019 was used to represent the refined vegetation distribution. The standardized precipitation evapotranspiration index (SPEI) derived from gridded meteorological data was used to measure drought over the same period. We investigated the spatiotemporal characteristics of vegetation and drought in the region over the past 38 years. We then discussed changes in different vegetation responses to drought across eastern Inner Mongolia using cross wavelet transform (XWT) and wavelet coherence (WTC). The results reveal that in 82.4% of the study area, NDVI exhibited rising trends, and the SPEI values exhibited declining trends in 78.5% of the area. In eastern Inner Mongolia, the grassland NDVI was positively correlated with SPEI and significantly affected by drought events, while NDVI in forestlands, including shrubs, broad-leaved forests, and coniferous forests, was negatively correlated with SPEI in the short term and weakly affected by drought. The NDVI lag times behind SPEI in grasslands, coniferous forests, and broad-leaved forests were 1–1.5, 4.5, and 7–7.5 months, respectively. These findings provide a scientific foundation for environmental preservation in the region.

15. Hussain, Md Muzakkir, Ahmad Taher Azar, Rafeeq Ahmed, Syed Umar Amin, Basit Qureshi, V. Dinesh Reddy, Irfan Alam, and Zafar Iqbal Khan. "SONG: A Multi-Objective Evolutionary Algorithm for Delay and Energy Aware Facility Location in Vehicular Fog Networks." Sensors 23, no. 2 (January 6, 2023): 667. http://dx.doi.org/10.3390/s23020667.

Abstract:
With the emergence of delay- and energy-critical vehicular applications, forwarding sense-actuate data from vehicles to the cloud has become practically infeasible. Therefore, a new computational model called Vehicular Fog Computing (VFC) was proposed. It offloads the computation workload from passenger devices (PDs) to transportation infrastructures such as roadside units (RSUs) and base stations (BSs), called static fog nodes. It can also exploit the underutilized computation resources of nearby vehicles that can act as vehicular fog nodes (VFNs) and provide delay- and energy-aware computing services. However, the capacity planning and dimensioning of VFC, which fall under the class of facility location problems (FLPs), is a challenging issue. The complexity arises from the spatio-temporal dynamics of vehicular traffic, varying resource demand from PD applications, and the mobility of VFNs. This paper proposes a multi-objective optimization model to investigate facility location in VFC networks. The solutions to this model generate optimal VFC topologies pertaining to an optimized trade-off (Pareto front) between service delay and energy consumption. To solve this model, we propose a hybrid Evolutionary Multi-Objective (EMO) algorithm called Swarm Optimized Non-dominated sorting Genetic algorithm (SONG). It combines the convergence and search efficiency of two popular EMO algorithms: the Non-dominated Sorting Genetic Algorithm (NSGA-II) and Speed-constrained Particle Swarm Optimization (SMPSO). First, we solve an example problem using the SONG algorithm to illustrate the delay–energy solution frontiers and plot the corresponding layout topology. Subsequently, we evaluate the evolutionary performance of the SONG algorithm on real-world vehicular traces against three quality indicators: Hyper-Volume (HV), Inverted Generational Distance (IGD), and CPU delay gap. The empirical results show that SONG exhibits improved solution quality over the NSGA-II and SMPSO algorithms and hence can be utilized as a potential tool by service providers for the planning and design of VFC networks.
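
Underneath NSGA-II, SMPSO, and therefore SONG sits the same primitive: the Pareto-dominance test used to rank candidate topologies. For the two objectives here (service delay and energy, both minimized) it is a few lines of Python; the numbers in the usage line are purely illustrative.

```python
def dominates(a, b):
    """True if solution a Pareto-dominates b, minimizing every objective."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Three candidate VFC topologies scored as (service delay, energy use):
print(pareto_front([(10, 5.0), (8, 6.5), (12, 7.0)]))  # -> [(10, 5.0), (8, 6.5)]
```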

16. Belazi, Akram, Héctor Migallón, Daniel Gónzalez-Sánchez, Jorge Gónzalez-García, Antonio Jimeno-Morenilla, and José-Luis Sánchez-Romero. "Enhanced Parallel Sine Cosine Algorithm for Constrained and Unconstrained Optimization." Mathematics 10, no. 7 (April 3, 2022): 1166. http://dx.doi.org/10.3390/math10071166.

Abstract:
The main idea of the sine cosine algorithm (SCA) is a sine- and cosine-driven oscillation of candidate solutions outward from, or toward, the best solution. The first main contribution of this paper is an enhanced version of the SCA, called the ESCA algorithm. The supremacy of the proposed algorithm over a set of state-of-the-art algorithms in terms of solution accuracy and convergence speed is demonstrated by experimental tests. When these algorithms are transferred to the business sector, they must meet time requirements dependent on the industrial process. If these temporal requirements are not met, an efficient solution is to speed them up by designing parallel algorithms. The second major contribution of this work is the design of several parallel algorithms for efficiently exploiting current multicore processor architectures. First, one-level synchronous and asynchronous parallel ESCA algorithms are designed. They have two merits: they retain the proposed algorithm's behavior, and they provide excellent parallel performance by combining coarse-grained with fine-grained parallelism. Moreover, the parallel scalability of the proposed algorithms is further improved by employing a two-level parallel strategy. Indeed, the experimental results suggest that the one-level parallel ESCA algorithms reduce the computing time, on average, by 87.4% and 90.8%, respectively, using 12 physical processing cores. The two-level parallel algorithms provide extra reductions of the computing time, by 91.4%, 93.1%, and 94.5% with 16, 20, and 24 processing cores, including physical and logical cores. Comparison analysis is carried out on 30 unconstrained benchmark functions and three challenging engineering design problems. The experimental outcomes show that the proposed ESCA algorithm behaves outstandingly well in terms of exploration and exploitation behaviors, local-optima avoidance, and convergence speed toward the optimum. The overall performance of the proposed algorithm is statistically validated using three non-parametric statistical tests, namely the Friedman, Friedman aligned, and Quade tests.
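
For reference, the baseline SCA position update that ESCA builds on fits in a few lines. The sketch below follows the commonly cited formulation of Mirjalili's original SCA (destination P is the best solution so far; the control parameter r1 decays linearly to shift from exploration to exploitation); it is a plain baseline, not the paper's enhanced or parallel variant.

```python
import math
import random

def sca_minimize(f, bounds, n_agents=20, iters=500, a=2.0):
    """Baseline sine cosine algorithm for box-constrained minimization.

    f:      objective taking a list of floats
    bounds: list of (lo, hi) pairs, one per dimension
    """
    dim = len(bounds)
    X = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_agents)]
    best = min(X, key=f)[:]
    for t in range(iters):
        r1 = a - t * a / iters                    # linearly decaying step scale
        for x in X:
            for d in range(dim):
                r2 = random.uniform(0.0, 2.0 * math.pi)
                r3 = random.uniform(0.0, 2.0)
                step = r1 * abs(r3 * best[d] - x[d])
                x[d] += step * (math.sin(r2) if random.random() < 0.5
                                else math.cos(r2))
                lo, hi = bounds[d]
                x[d] = min(max(x[d], lo), hi)     # stay inside the box
        cand = min(X, key=f)
        if f(cand) < f(best):
            best = cand[:]
    return best, f(best)

# e.g. sca_minimize(lambda v: sum(z * z for z in v), [(-5.0, 5.0)] * 3)
# should return a point near the origin.
```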

17. Fang, Cheng, Peng Yu, and Brian Williams. "Chance-Constrained Probabilistic Simple Temporal Problems." Proceedings of the AAAI Conference on Artificial Intelligence 28, no. 1 (June 21, 2014). http://dx.doi.org/10.1609/aaai.v28i1.9048.

Abstract:
Scheduling under uncertainty is essential to many autonomous systems and logistics tasks. Probabilistic methods exist for solving temporal problems which quantify and attempt to minimize the probability of schedule failure. These methods are overly conservative, resulting in a loss of schedule utility. Chance-constrained formalisms address this over-conservatism by imposing bounds on risk while maximizing utility subject to those risk bounds. In this paper we present the probabilistic Simple Temporal Network (pSTN), a probabilistic formalism for representing temporal problems with bounded risk and a utility over event timing. We introduce a constrained optimisation algorithm for pSTNs that achieves compactness and efficiency through a problem encoding in terms of a parameterised STNU and its reformulation as a parameterised STN. We demonstrate through a car-sharing application that our chance-constrained approach runs in the same time as the previous probabilistic approach and yields solutions with utility improvements of at least 5% over the prior art, while guaranteeing operation within the specified risk bound.
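
The primitive a pSTN solver must bound is simple to state: the probability mass of a duration distribution that violates a requirement constraint. A short illustration with assumed numbers (SciPy is used only for the Gaussian tail):

```python
from scipy.stats import norm

# Illustrative numbers: an uncertain drive time ~ N(30, 5**2) minutes, and a
# requirement constraint that the whole trip take at most 40 minutes.
risk = norm(loc=30, scale=5).sf(40)   # survival function: P(duration > 40)
print(f"risk = {risk:.4f}")           # ~0.0228, within e.g. a 5% risk bound
```

A chance-constrained solver then chooses the controllable timepoints so that the total violation probability, bounded for instance by the sum of such tails across all requirement constraints, stays within the specified risk bound.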

18. Yu, Peng, Cheng Fang, and Brian Williams. "Resolving Over-Constrained Probabilistic Temporal Problems through Chance Constraint Relaxation." Proceedings of the AAAI Conference on Artificial Intelligence 29, no. 1 (March 4, 2015). http://dx.doi.org/10.1609/aaai.v29i1.9652.

Abstract:
When scheduling tasks for field-deployable systems, our solutions must be robust to the uncertainty inherent in the real world. Although human intuition is trusted to balance reward and risk, humans perform poorly at risk assessment at the scale and complexity of real-world problems. In this paper, we present a decision aid system that helps human operators diagnose the source of risk and manage uncertainty in temporal problems. The core of the system is a conflict-directed relaxation algorithm, called Conflict-Directed Chance-constraint Relaxation (CDCR), which specializes in resolving over-constrained temporal problems with probabilistic durations and a chance constraint bounding the risk of failure. Given a temporal problem with uncertain durations, CDCR proposes execution strategies that operate at acceptable risk levels and pinpoints the source of risk. If no strategy can be found that meets the chance constraint, it helps humans repair the over-constrained problem by trading off the desirability of the solution against acceptable risk levels. The decision aid has been incorporated into a mission advisory system for assisting oceanographers in scheduling activities during deep-sea expeditions, and has demonstrated its effectiveness in scenarios with realistic uncertainty.

19. Mallick, Antik, Mohammad Khairul Bashar, Daniel S. Truesdell, Benton H. Calhoun, Siddharth Joshi, and Nikhil Shukla. "Using synchronized oscillators to compute the maximum independent set." Nature Communications 11, no. 1 (September 17, 2020). http://dx.doi.org/10.1038/s41467-020-18445-1.

Abstract:
Not all computing problems are created equal. The inherent complexity of processing certain classes of problems using digital computers has inspired the exploration of alternate computing paradigms. Coupled oscillators exhibiting rich spatio-temporal dynamics have been proposed for solving hard optimization problems. However, the physical implementation of such systems has been constrained to small prototypes. Consequently, the computational properties of this paradigm remain inadequately explored. Here, we demonstrate an integrated circuit of thirty oscillators with highly reconfigurable coupling to compute optimal/near-optimal solutions to the archetypally hard Maximum Independent Set problem with over 90% accuracy. This platform uniquely enables us to characterize the dynamical and computational properties of this hardware approach. We show that the Maximum Independent Set is more challenging to compute in sparser graphs than in denser ones. Finally, using simulations we evaluate the scalability of the proposed approach. Our work marks an important step towards enabling application-specific analog computing platforms to solve computationally hard problems.
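
For contrast with the analog approach, the problem itself is easy to state in software: a maximum independent set is the largest vertex subset with no two members adjacent. The classical minimum-degree greedy baseline below carries no optimality guarantee; it is the kind of cheap digital reference point that hardware accuracy figures are usually compared against, and it is unrelated to the oscillator dynamics in the paper.

```python
def greedy_mis(adj):
    """Greedy independent set: repeatedly take a minimum-degree vertex.

    adj: {vertex: set of neighbouring vertices}
    """
    live = {v: set(ns) for v, ns in adj.items()}   # working copy of the graph
    independent = set()
    while live:
        v = min(live, key=lambda u: len(live[u]))  # lowest remaining degree
        independent.add(v)
        dropped = live.pop(v) | {v}                # v and its neighbours
        for u in dropped:
            live.pop(u, None)
        for w in live:                             # remove dangling references
            live[w] -= dropped
    return independent

# Toy graph, a 4-cycle: the maximum independent set has size 2.
print(greedy_mis({0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}))  # e.g. {0, 2}
```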

20. Buck, W. Chris, Hanh Nguyen, Mariana Siapka, Lopa Basu, Jessica Greenberg Cowan, Maria Inês De Deus, Megan Gleason, et al. "Integrated TB and HIV care for Mozambican children: temporal trends, site-level determinants of performance, and recommendations for improved TB preventive treatment." AIDS Research and Therapy 18, no. 1 (January 9, 2021). http://dx.doi.org/10.1186/s12981-020-00325-9.

Abstract:
Background: Pediatric tuberculosis (TB), human immunodeficiency virus (HIV), and TB-HIV co-infection are health problems with evidence-based diagnostic and treatment algorithms that can reduce morbidity and mortality. Implementation and operational barriers affect adherence to guidelines in many resource-constrained settings, negatively affecting patient outcomes. This study aimed to assess performance in the pediatric HIV and TB care cascades in Mozambique. Methods: A retrospective analysis of routine PEPFAR site-level HIV and TB data from 2012 to 2016 was performed. Patients 0–14 years of age were included. Descriptive statistics were used to report trends in TB and HIV indicators. Linear regression was done to assess associations of site-level variables with performance in the pediatric TB and HIV care cascades using 2016 data. Results: Routine HIV testing and cotrimoxazole initiation for co-infected children in the TB program were nearly optimal at 99% and 96% in 2016, respectively. Antiretroviral therapy (ART) initiation was lower at 87%, but steadily improved from 2012 to 2016. From the HIV program, TB screening at the last consultation rose steadily over the study period, reaching 82% in 2016. The percentage of newly enrolled children who received either TB treatment or isoniazid preventive treatment (IPT) also steadily improved in all provinces, but in 2016 was only at 42% nationally. Larger volume sites were significantly more likely to complete the pediatric HIV and TB care cascades in 2016 (p value range 0.05 to < 0.001). Conclusions: Mozambique has made significant strides in improving the pediatric care cascades for children with TB and HIV, but there were missed opportunities for TB diagnosis and prevention, with IPT utilization being particularly problematic. Strengthened TB/HIV programming that continues to focus on pediatric ART scale-up while improving delivery of TB preventive therapy, either with IPT or newer rifapentine-based regimens for age-eligible children, is needed.

21. Drozdz, Maya. "Waiting for Instantaneity." M/C Journal 3, no. 3 (June 1, 2000). http://dx.doi.org/10.5204/mcj.1848.

Abstract:
In "Speed and Information: Cyberspace Alarm!", Paul Virilio claims that telecommunications are ushering in the "invention of a perspective of real time" which results in "some kind of choking of the senses, a loss of control over reason". As users of this new technology, as the receptors of the stream of computer-mediated information, we need to figure out the terms and conditions of our acceptance of cyberspace as a space and realtime as a form of time, to understand the implications of this new mode of communication, for "no information exists without dis-information". In "Speed and Information: Cyberspace Alarm!", Paul Virilio claims that telecommunications are ushering in the "invention of a perspective of real time" which results in "some kind of choking of the senses, a loss of control over reason". As users of this new technology, as the receptors of the stream of computer-mediated information, we need to figure out the terms and conditions of our acceptance of cyberspace as a space and realtime as a form of time, to understand the implications of this new mode of communication, for "no information exists without dis-information". Even Virilio proclaims apocalyptically that "our history will happen in universal time, itself the outcome of instantaneity -- and there only". In fact, time also governs narrative choices: their availability, viability, desirability, relevance. Despite the hype surrounding the instantaneity of virtual travel, narrative in cyberspace is inherently subordinate to connection speed and loadtime. This is why the "loading screen" has become the standard welcome on Shockwave-heavy sites, developing into a kind of mini-genre of low-bandwidth animation. The possiblity of using temporality as a narrative catalyst has been exploited in cinema, as in classic Hollywood dissolves and fades. Metaphors of the passing of time are a familiar cliche: the pages of a daily calendar fluttering away, the changing of the seasons. Stanley Kubrick's bold cut from a spinning bone to a space station in orbit in 2001: A Space Odyssey is an extreme and unusual example of this sort of metaphor. These all function as temporal ellipses. The passage of time can also function as plot, as in Warhol's Blow Job and Richard Linklater's Slacker, both of which are ostensibly about merely the passage of time. Slacker lacks a traditional narrative and instead progresses through a series of vignettes, each one following seemingly random characters through seemingly random events (an idea developed further by the recent Magnolia). The change from one vignette to the next is motivated simply by the camera's movement from one character or event to another. The camera is like a nosy passerby, a voyeur, showing noncommittal interest in one thing, then another, and the viewer must give up interest in each vignette without the satisfaction of narrative closure. As the filmmaker tells the cabdriver in the beginning, each turn of events, each decision made, results in all possibilities going on to live out their potential realities. We, the viewers, in turn, follow the camera's gaze with the frustrating knowledge that other, unresolved realities are continuing without that gaze. The recent Timecode uses the same hypertext-type approach with four simultaneous screens, each a single shot capturing one part of an interlocked world. These are all extreme, overt examples of Deleuze's time-image. Online, similar moves have been made in Mark Napier's Shredder and Maciej Wisniewski's Netomat interfaces. 
Both function as alternatives to conventional browsing, Netomat even labelled an "anti-browser" that engages "an Internet that is an intelligent application and not simply a large database of static files". The above-mentioned devices for manipulating the perception and understanding of time as represented in film (fade, dissolve, et al.) exploited an inherently filmic problem: simply put, that there is a serious discrepancy between time as it happens and its perception, much like the time it takes to enjoy a Website's multimedia content and the length of its download. In the case of the fade, for instance, an inherent problem of the medium has been internalised in a non-transparent way and used overtly to further the narrative meaning. Likewise, the "loading screen" offers a morsel of content typically focussed on its function ("loading... 5 seconds to go..."). The existence of these filmic conventions makes us aware of when they are broken, as in the "realtime" films Nick of Time, Blow Job, and Timecode, and also in instances of extended time, as in the classic shower scene in Hitchcock's Psycho. Think, too, of the last time you had to wait before you saw any of a Website's content. Just as filmic time is typically compressed for the sake of appearing real, navigational movement on the Web is in fact constrained while seeming free, and delayed while seeming instantaneous. The promise of instant access has instead been fulfilled by erratic connection speeds and server problems. Because Web time ostensibly passes almost in an instant (this is, after all, the industry in which a product might become passé while still at the beta phase), information ages just as quickly: "404 File not found" is a familiar sight, telling us that the link we followed may have been coded, not last year, but maybe even last week, or yesterday. Information loses relevance, even disappears, often in no time at all. These problems have been exploited by JODI, whose experimental online work foregrounds the nuts and bolts (and kinks) of the Web, instead of hiding it beneath a clean "other" design. The desirability of information over time is also the focus of Rhizome's Starry Night interface which, utilizing Java, shifts over time to emphasise popular links, eventually eradicating unpopular ones. In The Work of Art in the Age of Mechanical Reproduction, Walter Benjamin writes that "even the most perfect reproduction of a work of art is lacking in one element: its presence in time and space, its unique existence at the place where it happens to be" (220). But what of the Web art project, whose existence on a server somewhere does not need to be known for the work to be understood, whose duplication yields a copy that is indistinguishable from the original? What of the work that is both static and temporal, which is inherently mediated through time, including time (as in server and connection speed) which cannot be completely accounted for by the author? He goes on to say that technical reproduction "enables the original to meet the beholder halfway" (220), but what is the Website's point of departure? Its creator's computer? The server on which it lives? The end user's computer? How can we map the path from the "original" to its "reproduction" when the two are indistinguishable, when, in fact, there is no confirmable original? As if in response to Benjamin, Paul Virilio writes in "Speed and Information: Cyberspace Alarm!" 
that "the specific negative aspect of these information superhighways is precisely this loss of orientation regarding alterity (the other), this disturbance in the relationship with the other and with the world". Virilio is concerned with the problem of orientation, that is, of the lack of geographical, historical, and temporal specificity and point of reference when experiencing a Web-based narrative. Compare that to Deleuze's claim that, in the time-image, "the brain has lost its Euclidean co-ordinates, and now emits other signs" (278), a notion similar to the "loading screen", a bit of content which exists merely to inform that content is forthcoming. Virilio sees this as a crucial problem facing us today and adds that "there is talk of substituting the term 'global' by 'glocal,' a concatenation of the words local and global". The Internet yields both seeming temporal instantaneity and spatial compression. We can be everywhere all at once, all the time; but what of the inevitable slippage of time involved? The World Wide Web has created a life of dead moments, of moments spent waiting for the instantaneous to happen. References Baudrillard, Jean. "Radical Thought." Trans. Francois Debrix in CTheory. Collection Morsure. Eds. Sens and Tonka. Paris: 1994. Benjamin, Walter. Illuminations. New York: Shocken Books, 1968. Debord, Guy. The Society of the Spectacle. London: Rebel Press, 1992. Deleuze, Gilles. Cinema 2: The Time-Image. Minneapolis: U of Minnesota P, 1989. Napier, Mark. Shredder. 27 June 2000 <http://www.potatoland.org/shredder/>. Rhizome. 27. June 2000 <http://www.rhizome.org/>. Virilio, Paul. "Speed and Information: Cyberspace Alarm!" Le Monde Diplomatique Aug. 1995. Trans. Patrice Riemens in CTheory. Wisniewski, Maciej. Netomat. 27 June 2000 <http://www.netomat.net/>. Citation reference for this article MLA style: Maya Drozdz. "Waiting for Instantaneity." M/C: A Journal of Media and Culture 3.3 (2000). [your date of access] <http://www.api-network.com/mc/0006/instantaneity.php>. Chicago style: Maya Drozdz, "Waiting for Instantaneity," M/C: A Journal of Media and Culture 3, no. 3 (2000), <http://www.api-network.com/mc/0006/instantaneity.php> ([your date of access]). APA style: Maya Drozdz. (2000) Waiting for instantaneity. M/C: A Journal of Media and Culture 3(3). <http://www.api-network.com/mc/0006/instantaneity.php> ([your date of access]).

22. Liu, Zhe, and Albert Reynolds. "Robust Multiobjective Nonlinear Constrained Optimization with Ensemble Stochastic Gradient Sequential Quadratic Programming-Filter Algorithm." SPE Journal, March 1, 2021, 1–16. http://dx.doi.org/10.2118/205366-pa.

Abstract:
As the crucial step in closed-loop reservoir management, robust life-cycle production optimization is defined as maximizing/minimizing the expected value of a predefined objective (cost) function over geological uncertainties (i.e., uncertainties in the reservoir permeability, porosity, endpoint relative permeability, etc.). However, with robust optimization, there is no control over downside risk, defined as the minimum net present value (NPV) among the individual NPVs of the different reservoir models. Yet, field operators generally wish to keep this minimum NPV reasonably large to try to ensure that the reservoir is commercially viable. In addition, the field operator may desire to maximize the NPV of production over a much shorter time period than the life of the reservoir, under the limitations of surface facilities (e.g., field liquid and water production rates). Thus, it is important to consider multiobjective robust production optimization with nonlinear constraints when geological uncertainties are incorporated. The three objectives considered in this paper are: to maximize the average life-cycle NPV, to maximize the average short-term NPV, and to maximize the minimum NPV over the set of realizations. Generally, these objectives are in conflict; for example, the well controls that give a global maximum for robust life-cycle production optimization do not usually correspond to the controls that maximize the short-term average NPV of production. Moreover, handling the nonlinear state constraints (e.g., field liquid and water production rates for the bottom-hole-pressure-controlled producers in the robust production optimization) is also a challenge, because those nonlinear constraints should be satisfied at each control step for each geological realization. To provide potential solutions to the multiobjective robust optimization problem with state constraints, we developed a modified lexicographic method with a minimizing-maximum scheme to obtain a set of Pareto optimal solutions that satisfy all nonlinear constraints. We apply the sequential quadratic programming filter with modified stochastic gradients to solve a sequence of optimization problems, where each solution is designed to generate a single point on the Pareto front. In the modified lexicographic method, the first objective is always treated as the primary objective, and the other objectives are handled by specifying bounds on them to convert them to state constraints. Temporal damping and truncation schemes are applied to improve the quality of the stochastic gradient on the nonlinear constraints, and the minimizing-maximum procedure is applied to enforce the nonlinear state constraints. The main advantage that the modified lexicographic method has over the standard lexicographic method is that it allows for the generation of potential Pareto optimal points that are uniformly spaced in the values of the second and/or third objective that one wishes to improve by multiobjective optimization.
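
Schematically, the bound-conversion device the paper modifies reads as follows (a sketch of the standard lexicographic construction; ε, the objective symbols, and the constraint notation are illustrative, with expectations taken over the geological realizations m_j):

```latex
% Stage 1: optimize the primary objective over the control set \mathcal{U}.
J_1^{\ast} \;=\; \max_{u \in \mathcal{U}} \; \mathbb{E}_m\!\big[ J_1(u, m) \big]

% Stage 2: optimize a secondary objective, with the primary objective
% converted into a constraint via a bound, and the nonlinear state
% constraints imposed for every realization m_j:
\max_{u \in \mathcal{U}} \; \mathbb{E}_m\!\big[ J_2(u, m) \big]
\quad\text{s.t.}\quad
\mathbb{E}_m\!\big[ J_1(u, m) \big] \;\ge\; (1 - \varepsilon)\, J_1^{\ast},
\qquad
c(u, m_j) \;\le\; 0 \;\;\; \forall j .
```

Sweeping the bound traces out candidate Pareto-optimal points; the paper's modification chooses these bounds so that the generated points are uniformly spaced in the secondary objectives.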