Academic literature on the topic 'Variants of the p-Center problem'

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Variants of the p-Center problem.'

Next to every source in the list of references there is an 'Add to bibliography' button. Press it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Variants of the p-Center problem":

1

Dupin, Nicolas, Frank Nielsen, and El-Ghazali Talbi. "Unified Polynomial Dynamic Programming Algorithms for P-Center Variants in a 2D Pareto Front." Mathematics 9, no. 4 (February 23, 2021): 453. http://dx.doi.org/10.3390/math9040453.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
When a multi-objective optimization problem has many efficient solutions, this paper aims to cluster the Pareto front into a given number of clusters K and to detect isolated points. K-center problems and variants are investigated with a unified formulation covering the discrete and continuous versions, partial K-center problems, and their min-sum-K-radii variants. In dimension three or higher, these problems are NP-hard. In the planar case, a common optimality property is proven: non-nested optimal solutions exist. This yields a common dynamic programming algorithm running in polynomial time, with specific improvements for some variants, such as K-center problems and min-sum-K-radii on a line. When applied to N points while allowing M < N points to be uncovered, the K-center and min-sum-K-radii variants are solvable in O(K(M+1)N log N) and O(K(M+1)N²) time, respectively. These complexity results allow an efficient, straightforward implementation, and parallel implementations can also be designed for a practical speed-up. Their application inside multi-objective heuristics to archive partial Pareto fronts is discussed, with a special interest in partial clustering variants.
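For readers new to the area, the flavor of such dynamic programs can be illustrated on the simplest setting, the 1-D k-center problem on a line. This is a much slower O(k·n²) sketch than the paper's algorithms, and the function name and interface are illustrative, not the authors':

```python
from math import inf

def kcenter_line(xs, k):
    """Minimal O(k * n^2) DP for 1-D k-center: partition the sorted points
    into k contiguous groups so that the largest group radius is minimal.
    Illustrative only; the paper's algorithms are faster and cover more
    variants (partial covering, min-sum-K-radii, discrete centers)."""
    xs = sorted(xs)
    n = len(xs)

    # covering xs[i..j] with one continuous center costs half the spread
    def cost(i, j):
        return (xs[j] - xs[i]) / 2

    # f[j][c]: best achievable max-radius for the first j points, c centers
    f = [[inf] * (k + 1) for _ in range(n + 1)]
    f[0][0] = 0.0
    for j in range(1, n + 1):
        for c in range(1, k + 1):
            for i in range(j):  # the last cluster is xs[i..j-1]
                f[j][c] = min(f[j][c], max(f[i][c - 1], cost(i, j - 1)))
    return f[n][k]

print(kcenter_line([0, 1, 2, 10, 11, 12], 2))  # 1.0
```

The non-nested-solutions property proven in the paper is what justifies restricting attention to contiguous groups of sorted points in the planar case.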
2

Eskandari, Marzieh, Bhavika B. Khare, Nirman Kumar, and Bahram Sadeghi Bigham. "Red–Blue k-Center Clustering with Distance Constraints." Mathematics 11, no. 3 (February 2, 2023): 748. http://dx.doi.org/10.3390/math11030748.

Abstract:
We consider a variant of the k-center clustering problem in ℝ^d, where the centers can be divided into two subsets: the red centers, of size p, and the blue centers, of size q, such that p+q=k, and each red center and each blue center must be at least a given distance α≥0 apart. The aim is to minimize the covering radius. We provide a bi-criteria approximation algorithm for the problem and a polynomial-time algorithm for the constrained problem where all centers must lie on a given line ℓ. Additionally, we present a polynomial-time algorithm for the case where only the orientation of the line is fixed in the plane (d=2), although the algorithm works even in ℝ^d by constraining the line to lie in a plane with a fixed orientation.
3

BAREQUET, GILL, and SARIEL HAR-PELED. "POLYGON CONTAINMENT AND TRANSLATIONAL IN-HAUSDORFF-DISTANCE BETWEEN SEGMENT SETS ARE 3SUM-HARD." International Journal of Computational Geometry & Applications 11, no. 04 (August 2001): 465–74. http://dx.doi.org/10.1142/s0218195901000596.

Abstract:
The 3SUM problem represents a class of problems conjectured to require Ω(n²) time to solve, where n is the size of the input. Given two polygons P and Q in the plane, we show that some variants of the decision problem, whether there exists a transformation of P that makes it contained in Q, are 3SUM-hard. In the first variant P and Q are any simple polygons and the allowed transformations are translations only; in the second and third variants both polygons are convex and we allow either rotations only or any rigid motion. We also show that finding the translation in the plane that minimizes the Hausdorff distance between two segment sets is 3SUM-hard.
4

Andrei, Ionica. "Existence Theorems for Some Classes of Boundary Value Problems Involving the P(X)-Laplacian." Nonlinear Analysis: Modelling and Control 13, no. 2 (April 25, 2008): 145–58. http://dx.doi.org/10.15388/na.2008.13.2.14575.

Abstract:
We prove an alternative for a nonlinear eigenvalue problem involving the p(x)-Laplacian and study a subcritical boundary value problem for the same operator. The theoretical approach is the Mountain Pass Lemma and one of its variants, which is very useful in the study of eigenvalue problems.
5

Wen, Fayuan, Angela Rock, Juan Salomon-Andonie, Gulriz Kurban, Xiaomei Niu, Songping Wang, Xu Zhang, et al. "Genome Wide Association Analysis of Iron Overload in the Trans-Omics for Precision Medicine (TOPMed) Sickle Cell Disease Cohorts." Blood 136, Supplement 1 (November 5, 2020): 52. http://dx.doi.org/10.1182/blood-2020-142809.

Abstract:
Introduction: Transfusional iron (Fe) overload is a significant problem among patients with chronic, transfusion-dependent anemias. Iron overload is an important problem in pediatric sickle cell disease (SCD) patients on chronic transfusion regimens predominantly for primary and secondary prevention of stroke. Coexistent hereditary iron overload conditions contribute to the iron overload phenotype in SCD. For example, the Q248H mutation (rs11568350) in SLC40A1, which encodes ferroportin (FPN), is associated with a mild tendency to increase serum ferritin in the general population and with increased ferritin levels in SCD patients. Nevertheless, the molecular mechanisms underlying the progression to iron overload in SCD patients are poorly understood and more sensitive markers for outcome prediction that can be applied at early clinical stages are lacking. We hypothesize that genetic variation modifies the risk for iron overload in SCD patients and seek to validate previously identified mutation and identify novel genetic markers of iron overload among participants from TOPMed SCD cohorts by performing whole genome sequencing (WGS) association analyses. Methods: The WGS was performed by several national sequencing centers sponsored by NHLBI's TOPMed program at an average depth of 30× using DNA from SCD patient blood samples. Variant calling was performed jointly across TOPMed studies for all samples using the GotCloud pipeline by the TOPMed Informatics Research Center. The TOPMed data Coordinating Center performed quality control for sample identity. The data across the following studies were shared through the database of Genotype and Phenotype (dbGaP) exchange area: Howard PUSH SCD (N=370), OMG SCD (N=636), Walk PHaSST SCD (N=381) and REDS-III Brazil SCD (N=2620) with a total sample size of 4007. The study was approved by the appropriate institutional review boards (IRB) and informed consent was obtained from all participants. 
Genome-wide association analysis of iron overload was carried out using the University of Michigan ENCORE server. We performed single-variant tests of the association of log-transformed serum ferritin levels with single nucleotide variants (SNVs) while adjusting for sex, age, self-reported race, number of lifetime red blood cell transfusions, and genetic substructure (PCs 1-10). We used a significance threshold of p < 5×10⁻⁸ to report an association as genome-wide significant for common and rare genetic variants. Results: We first included the PUSH SCD, OMG SCD and Walk PHaSST SCD cohorts, with 840 serum ferritin samples, in the WGS association analyses, which revealed at the genome-wide level a new rare variant rs137929759 (chr7:49538810 (GRCh38.p12), MAF=0.0043532, p=2.25×10⁻⁸). A few variants, such as rs80097634 in genes AL163195.3 and RNASE11 (Chr14:20579417, MAF=0.050373, p=3.1×10⁻⁷), were close to the genome-wide significance level. We confirmed previously identified associations in SLC40A1 for ferritin (rs11568350, Chr2:189565370, MAF=0.16853, p=5.2×10⁻⁴). We also found several variants in AC105411.1, TJP1 and DCC that were close to the genome-wide significance level. Further analysis will be carried out on the cloud-based platform provided by NHLBI BioData Catalyst using data from all four cohorts to validate the previous analysis and expand to related phenotypes such as transferrin and iron-overload status. Discussion: In this study we identified common and rare variants that associate with serum ferritin concentration. The results from this pilot study point to novel gene variants that might contribute to iron overload in SCD patients and serve as new biomarkers. Future analysis is needed to determine whether the identified variants can also help with therapeutics and outcome prediction for early stages of SCD-associated iron overload. Our findings will be useful for the future treatment of SCD patients and the design of novel SCD therapeutics.
ACKNOWLEDGMENTS: This work was supported by NIH Research Grants (1P50HL118006, and 1R01HL125005). The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health. Disclosures Gordeuk: Imara: Research Funding; Global Blood Therapeutics: Consultancy, Research Funding; CSL Behring: Consultancy, Research Funding; Ironwood: Research Funding; Novartis: Consultancy. Telen:CSL Behring: Membership on an entity's Board of Directors or advisory committees, Research Funding; Forma Therapeutics: Research Funding; Novartis: Consultancy, Membership on an entity's Board of Directors or advisory committees; Pfizer: Consultancy, Membership on an entity's Board of Directors or advisory committees, Research Funding; GlycoMimetics Inc.: Consultancy.
6

Arya, Akschat, Boominathan Perumal, and Santhi Krishnan. "Parallelized solution to the asymmetric travelling salesman problem using central processing unit acceleration." Indonesian Journal of Electrical Engineering and Computer Science 25, no. 3 (March 1, 2022): 1795. http://dx.doi.org/10.11591/ijeecs.v25.i3.pp1795-1802.

Abstract:
The travelling salesman problem (TSP) is a well-researched problem in computer science with many practical applications. It is classified as NP-hard, as its exact solution can only be obtained in exponential time unless P = NP. There are different variants of the TSP; this paper addresses the asymmetric travelling salesman problem (ATSP), since this variant is quite often observed in real-world scenarios. A number of heuristic approaches provide approximate solutions in polynomial time; this paper instead proposes an exact optimal solution accelerated with the help of multi-threading-based parallelization. To find the exact optimal solution, we use the Held-Karp algorithm based on dynamic programming, and to reduce the time taken to find the optimal path, we use a multi-threaded approach that parallelizes the processing of sub-problems across central processing unit (CPU) cores. This method extends a well-researched solution to the TSP and shows that computationally intensive problems built from sub-problems, such as the ATSP, can be accelerated with the help of modern CPUs.
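For reference, the sequential Held-Karp dynamic program that the paper builds on can be sketched as follows. The function name is illustrative, and the paper's actual contribution, the multi-threaded parallelization of the per-subset work, is not shown:

```python
from math import inf

def held_karp_atsp(dist):
    """Exact Held-Karp dynamic program for the asymmetric TSP, O(n^2 * 2^n).
    dist[i][j] is the (possibly asymmetric) cost of arc i -> j; the tour
    starts and ends at city 0."""
    n = len(dist)
    full = 1 << n
    # dp[mask][j]: cheapest path from city 0 that visits exactly the
    # cities in `mask` and ends at city j
    dp = [[inf] * n for _ in range(full)]
    dp[1][0] = 0
    for mask in range(1, full):
        if not mask & 1:
            continue  # city 0 is always part of the visited set
        for j in range(1, n):
            if not mask & (1 << j):
                continue
            prev = mask ^ (1 << j)  # same set without the endpoint j
            dp[mask][j] = min(dp[prev][i] + dist[i][j]
                              for i in range(n) if prev & (1 << i))
    # close the tour by returning to city 0
    return min(dp[full - 1][j] + dist[j][0] for j in range(1, n))
```

For example, `held_karp_atsp([[0, 1, 10], [10, 0, 1], [1, 10, 0]])` returns 3 (the tour 0→1→2→0); the reverse tour costs 30, which is what makes the instance asymmetric. The inner loop over subsets is the part the authors distribute across CPU threads.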
7

SEITBEKOVA, А., and А. AMIRBEKOVA. "PHONETIC VARIANTS OF LOANWORDS." Iasaýı ýnıversıtetіnіń habarshysy 125, no. 3 (September 15, 2022): 37–47. http://dx.doi.org/10.47526/2022-3/2664-0686.03.

Abstract:
The unification of different spellings of words is one of the most important problems in Kazakh linguistics. The number of words occurring in different spelling variants is especially noticeable among loanwords of Arabic and Persian origin. The heterogeneity and diversity of the spelling of borrowed words in writing practice remains an unsolved problem among linguists. In this regard, the purpose of our study is to identify the reasons why Arabic and Persian words with the same meaning are written in different variants, and to attempt to designate the main variant for the transition to the new Latin alphabet. It is known that different spellings of words often arise between the modern literary language and dialects. Whatever the types of phonetic phenomena found in the dialects of our language, they are in general intertwined with the phonetic development of the Turkic languages. The article therefore analyzes the processes of assimilation, starting from ancient and medieval Turkic written monuments, and the features of the transition to the modern Turkic languages. To determine the main variant, linguistic and statistical studies of the phonetic variants b/p and p/f are carried out on the basis of the 'Large Explanatory Dictionary of the Kazakh Literary Language.' The results of the study can serve in the unification of spelling norms for borrowed words, and can provide the necessary linguistic information for the development of various dictionaries.
8

Păun, Gheorghe. "One More Universality Result for P Systems with Objects on Membranes." International Journal of Computers Communications & Control 1, no. 1 (January 1, 2006): 25. http://dx.doi.org/10.15837/ijccc.2006.1.2269.

Abstract:
We continue here the attempt to bridge brane calculi with membrane computing, following the investigation started in [2]. Specifically, we consider P systems with objects placed on membranes and processed by membrane operations. The operations used in this paper are membrane creation (cre) and membrane dissolution (dis), defined in a way reminiscent of the operations pino and exo from the brane calculus of [1]. For P systems based on these operations we prove universality for one of the two possible variants of the operations; for the other variant the problem remains open.
9

Aksenova, Irina Vasil’evna, Yuliya Igorevna Naumova, and Vladimir Valentinovich Gridyushko. "Perspectives of the contemporary usage of circular locomotive depot buildings." Vestnik MGSU, no. 2 (February 2016): 9–19. http://dx.doi.org/10.22227/1997-0935.2016.2.9-19.

Abstract:
Variants of repurposing objects of the industrial heritage, including buildings of transport infrastructure located in central districts of historical towns, are analyzed in the article. The evolution of depots for maintaining and repairing locomotives is presented, and the uniqueness of the complex of buildings of the Nikolaevskaya Railway in Moscow, an integrated historical and architectural ensemble, is noted. One of the few such buildings preserved to date is the circular depot in the center of Moscow. The loss of this unique specimen of mid-19th-century industrial architecture would be irreplaceable for the cultural heritage of the nation, and the only way to save it from destruction is restoration and inclusion in the contemporary life of the city. A method for evaluating possible variants of contemporary use of historical monument buildings of the industrial heritage is proposed, which secures their preservation on a self-repaying basis. The preferable variants for repurposing the circular depot in Moscow are considered on the basis of qualitative criteria. Given the location of the depot near railway stations, the source of the main contingent in need of short-term accommodation, the variant of placing a hotel and tourist center in the depot was chosen. This corresponds to the basic direction of the State Program of the City of Moscow for 2012-2016, which provides for developing the hotel chain through reconstruction and the creation of tourist infrastructure. The variant considered by the authors, using the depot as a multifunctional hotel complex, makes it possible to address the shortage of two-star hotels in the center of Moscow and, importantly, to preserve the monument in an undistorted form.
10

Adamuthe, Amol C., and Smita M. Kagwade. "Hybrid and adaptive harmony search algorithm for optimizing energy efficiency in VMP problem in cloud environment." Decision Science Letters 11, no. 2 (2022): 113–26. http://dx.doi.org/10.5267/j.dsl.2022.1.001.

Abstract:
Data center energy usage has risen dramatically because of the rapid growth of, and demand for, cloud computing. This excessive energy usage is a challenge from both an economic and an environmental point of view. Virtual machine placement (VMP), together with virtualization technologies, is widely used to manage power utilization in data centers, since the assignment of virtual machines (VMs) to physical machines (PMs) affects energy consumption. VMP is the process of mapping VMs onto a set of PMs in a data center so as to minimize total power consumption and maximize resource utilization; it is NP-hard due to its constraints and the huge number of possible assignments. In this paper, we formulate the problem as a single-objective optimization problem whose objective is to minimize energy consumption in cloud data centers. The main contribution of this paper is a hybrid and adaptive harmony search algorithm (HSA) for optimal placement of VMs on PMs. HSA with adaptive PAR settings, simulated annealing, and a local search strategy aims at minimizing energy consumption in cloud data centers while satisfying the given constraints. Experiments are conducted to validate the performance of these variations. Results show that the hybrid HSA variations produce better results than basic HSA and adaptive HSA, and that hybrid HSA with simulated annealing and a local search strategy gives better results than the other variants on 80 percent of the datasets.

Dissertations / Theses on the topic "Variants of the p-Center problem":

1

Sim, Thaddeus Kim Teck. "The hub covering flow problem and the stochastic p-hub center problem." Diss., University of Iowa, 2007. http://ir.uiowa.edu/etd/124.

2

Haddad, Marcel Adonis. "Nouveaux modèles robustes et probabilistes pour la localisation d'abris dans un contexte de feux de forêt." Electronic Thesis or Diss., Université Paris sciences et lettres, 2020. http://www.theses.fr/2020UPSLD021.

Abstract:
The location of shelters in areas threatened by wildfires is one of the possible ways to reduce fatalities in a context of an increasing number of catastrophic and severe forest fires. The problem is basically to locate p shelters minimizing the maximum distance people have to cover to reach the closest accessible shelter in case of fire. The landscape is divided into zones and is modeled as an edge-weighted graph, with vertices corresponding to zones and edges corresponding to direct connections between two adjacent zones. Each scenario corresponds to a fire outbreak on a single zone (i.e., on a vertex), whose main consequence is to modify evacuation paths in two ways. First, an evacuation path cannot pass through the vertex on fire. Second, the fact that someone close to the fire may have limited choice, or may not take rational decisions, when selecting a direction to escape is modeled using a new kind of evacuation strategy. This evacuation strategy, called Under Pressure, induces particular evacuation distances which make our model specific. Depending on the type of data considered and the objective pursued, we propose two problems with this model: the Robust p-Center Under Pressure problem and the Probabilistic p-Center Under Pressure problem. We first prove hardness results for both problems on classes of graphs relevant to our context, together with approximation and inapproximation results. In addition, we propose exact polynomial-time algorithms on simple classes of graphs and develop mathematical algorithms based on integer linear programming.
3

Eraslan, Demirci Sukran. "A Genetic Algorithm For The P-hub Center Problem With Stochastic Service Level Constraints." Master's thesis, METU, 2010. http://etd.lib.metu.edu.tr/upload/12612940/index.pdf.

Abstract:
M.Sc. thesis, Department of Industrial Engineering. Supervisor: Asst. Prof. Dr. Sedef Meral. December 2010, 170 pages. The emphasis on minimizing costs and travel times in a network of origins and destinations has led researchers to study hub location problems widely in location theory, where locating the hub facilities and designing the hub networks are the central issues. The p-hub center problem considering these issues is the subject of this study. We address the p-hub center problem with stochastic service level constraints and a limitation on the travel times between nodes and hubs, an uncapacitated, single-allocation problem with a complete hub network. Both a mathematical model and a genetic algorithm are proposed for the problem. We discuss the general framework of the genetic algorithm as well as its problem-specific components. Computational studies of the proposed algorithm are carried out on a number of problem instances from the Civil Aeronautics Board (CAB) data set and a Turkish network data set. The computational results indicate that the proposed genetic algorithm gives satisfactory results when compared with the optimum solutions and with solutions obtained by other heuristic methods.
4

Calik, Hatice. "Exact solution methodologies for the p-center problem under single and multiple allocation strategies." Doctoral thesis, Bilkent University, Ankara, Turkey, 2013. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/238641.

5

Wei, Hu. "SOLVING CONTINUOUS SPACE LOCATION PROBLEMS." The Ohio State University, 2008. http://rave.ohiolink.edu/etdc/view?acc_num=osu1205514715.

6

Silav, Ahmet. "Bi-objective Facility Location Problems In The Presence Of Partial Coverage." Master's thesis, METU, 2009. http://etd.lib.metu.edu.tr/upload/2/12610681/index.pdf.

Abstract:
In this study, we propose a bi-objective facility location model that considers both partial coverage and service to uncovered demands. In this model, it is assumed that demand nodes within a predefined distance of opened facilities are fully covered, and beyond that distance the coverage level decreases linearly. The objectives are the maximization of the sum of full and partial coverage and the minimization of the maximum distance between uncovered demand nodes and their closest opened facilities. We apply two existing Multi-Objective Genetic Algorithms (MOGAs), NSGA-II and SPEA-II, to the problem. We identify the drawbacks of these MOGAs and develop a new MOGA, called modified SPEA-II (mSPEA-II), to avoid them. In this method, the fitness function of SPEA-II is modified and the crowding distance calculation of NSGA-II is used. The performance of mSPEA-II is tested on randomly generated problems of different sizes. The results are compared with the solutions resulting from NSGA-II and SPEA-II. Our experiments show that mSPEA-II outperforms both NSGA-II and SPEA-II.
7

Yang, Chih-Shiang, and 楊智翔. "New Algorithmic Results on the Connected p-Center Problem and Its Variants." Thesis, 2010. http://ndltd.ncl.edu.tw/handle/91313818201500460028.

Abstract:
Master's thesis, Shih Hsin University, Graduate Institute of Information Management, ROC academic year 98.

The essential p-Center problem is to determine a set of p vertices of a graph G for building facilities, with the objective of minimizing the maximum access distance of clients at all vertices. Let G(V, E, l, w) be an n-vertex, m-edge graph with lengths on edges and weights on vertices. Given such a graph, a practical variant, called the Weighted Connected p-Center problem (the WCpC problem), is to find a p-center of G such that the maximum weighted access distance of clients at all vertices is minimized, under the additional restriction that the selected p-center must induce a connected subgraph of G. If w(v) = 1 for all v in V, the problem is abbreviated as the CpC problem. We first prove that the CpC problem is NP-hard on planar graphs and on interval graphs. Second, we propose two algorithms for the WCpC problem on trees, with time complexities O(pn) and O(n log²n), using two different approaches. Moreover, if w(v) ∈ C for all v in V, where C is a set of k numbers for some small integer k, another algorithm with time complexity O(kn) is proposed. Next, the extension to graphs with forbidden vertices, called the Forbidden Weighted Connected p-Center problem (the FWCpC problem), is discussed; we show that it can also be solved in O(n log²n) time. Finally, we propose an O(n)-time algorithm for the FCpC problem on interval graphs with unit vertex weights and unit edge lengths.
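The connectivity restriction that defines the CpC and WCpC variants can be made concrete with a small feasibility check. This is an illustrative sketch under my own naming, not the thesis's far more efficient tree and interval-graph algorithms:

```python
from collections import deque

def is_connected_induced(adj, S):
    """BFS check that the subgraph induced by vertex set S is connected --
    the extra feasibility test that distinguishes a connected p-center
    from a plain p-center. adj maps each vertex to its neighbour list."""
    S = set(S)
    start = next(iter(S))
    seen, queue = {start}, deque([start])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v in S and v not in seen:  # only walk edges inside S
                seen.add(v)
                queue.append(v)
    return seen == S
```

On a path 0-1-2-3, the candidate center set {0, 1} passes this test while {0, 2} fails, which is exactly the kind of candidate the connected variants must discard.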
8

Chen, Chien Tsai, and 陳建材. "The p-center problem with some practical constrints." Thesis, 2006. http://ndltd.ncl.edu.tw/handle/80686676030458876484.

Abstract:
Master's thesis, Shih Hsin University, Graduate Institute of Information Management, ROC academic year 94.

This thesis addresses the p-Center problem with some practical constraints. Let G(V, E, W) denote a graph with n-vertex set V and m-edge set E, where W maps each edge e to a positive distance W(e). The traditional p-Center problem is to locate facilities at p vertices of G so as to minimize the maximum distance between any vertex and its nearest facility. This thesis considers some more practical constraints. We first require that the p vertices at which the facilities are located be connected, i.e., the subgraph induced by the p facility vertices must be connected; the resulting problem is called the Connected p-Center problem (the CpC problem). We also deal with a further restriction in which, for a given subset F of V, no vertex of F may be included in any feasible solution; the vertices in F are called forbidden vertices and the problem is called the Forbidden Connected p-Center problem (the FCpC problem). We first show that the CpC problem is NP-hard on bipartite graphs. Second, O(n)-time and O(pn)-time algorithms for the CpC problem on trees and on 3-cactus graphs are proposed, respectively. Finally, these algorithmic results are extended to the FCpC problem on trees and 3-cactus graphs, with the time complexities remaining O(n) and O(pn), respectively.
9

Chen, Sen-Miao, and 陳森淼. "A Study on Total p-Center Problem on Graphs." Thesis, 2011. http://ndltd.ncl.edu.tw/handle/95469863086488951842.

Abstract:
Master's thesis, Shih Hsin University, Graduate Institute of Information Management, ROC academic year 99.

This thesis introduces a new, useful, and interesting variation of the traditional p-Center problem on graphs, called the Total p-Center problem (the TpC problem). Its goal is to find a p-vertex set Q of a graph G(V, E, w, l), with weights on vertices and lengths on edges, such that the maximum weighted access distance from vertices not in Q to their nearest vertices in Q is minimized and the subgraph induced by Q contains no isolated vertex. In this research, we concentrate on the case where the subgraph induced by the center vertices consists of exactly k components, called the k-Com TpC problem. We first show that the k-Com TpC problem is NP-hard on planar graphs and on bipartite graphs with {1, 2} edge lengths. Next, O(pn log n)-time algorithms are proposed for the 2-Com p-Center problem and the 2-Com TpC problem on paths. Then, on graphs with forbidden vertices, we show that both the 2-Com p-Center problem and the 2-Com TpC problem on paths remain solvable in O(pn log n) time.
10

Wu, Tzung-Shiun, and 吳宗勳. "The p-center location problem on undirected connected graphs." Thesis, 2015. http://ndltd.ncl.edu.tw/handle/08117752500540605601.

Abstract:
Master's thesis, National Taipei University of Business, Graduate Institute of Information and Decision Sciences, ROC academic year 103.

Public facility location is a quite practical and important issue, and facility plans need to fit all kinds of demands at various times and places. Generally speaking, fundamental facilities are mostly permanent structures that provide long-term service and cannot be renovated or relocated frequently; the better the locations chosen for them, the lower the cost and transportation required and the greater the cost efficiency. The p-center problem has applications in location allocation: given an undirected connected graph G = (V, E) and an integer p, find a set S of at most p points of G that minimizes the maximum distance from the vertices in V to S. We employ a recursive algorithm and a branch-and-bound algorithm for solving the p-center problem and attempt to find the more efficient of the two. The experimental results show that the running time of the recursive algorithm is strongly affected by the number of edges and grows quite fast, whereas the branch-and-bound algorithm is largely unaffected by the number of edges but is strongly affected by the number of vertices and the value of p. When the number of edges is less than about 1.3 times the number of vertices, the recursive algorithm solves the p-center problem more efficiently than the branch-and-bound algorithm.

Books on the topic "Variants of the p-Center problem":

1

El trabajo inmaterial como problema de la filosofía política. Teseo, 2017. http://dx.doi.org/10.55778/ts870515906.

Abstract:
The economic, technological, and social transformations of recent decades at the global level have been so profound that they have prompted the rethinking of several fundamental questions of political philosophy, especially the one concerning the nature of capitalism. The relationship between labor and property, the basis of the mode of production, is thus strongly challenged by the growing hegemony of immaterial labor and intellectual property. In light of this new scenario, the present work returns to the economic-political foundations of liberalism and Marxism as a starting point for rethinking the problem of transformation in a context in which not labor but life itself is subsumed to capital.
2

Psicoanálisis y educación: un diálogo de encuentros y desencuentros. Teseo, 2016. http://dx.doi.org/10.55778/ts877230888.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
This book explores the malaise emerging in the educational setting in connection with the problem of violence in schools, drawing on conceptual and practical contributions interwoven through the various aspects of an issue that troubles teachers, parents, and students. The subject is increasingly pushed to find answers to their problems outside the regulation of the word, through varied forms of violence that manifest themselves in the social bond. In this context, modern science's tendency toward classification leaves its mark on everyday school life, giving rise to persistent demands for diagnoses and interventions aimed at the adaptation of children and adolescents. The perspective applied is that of questioning the subjective and social solutions to the problems of the age within the educational setting, framed by the ethics of singularity that underpins the praxis of Lacanian-oriented psychoanalysis.
3

Busdygan, Daniel. Rostros del igualitarismo. Teseo, 2020. http://dx.doi.org/10.55778/ts878633695.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Equality has been a rich thematic field for law, ethics, and political philosophy throughout the twentieth century. Around it, different theoretical perspectives have been developed that seek to shed light on the great number of problems it raises. This book sets out to advance a substantial set of philosophical discussions within the challenges undertaken by philosophical egalitarianism. With equality, not only does a problematic field emerge; the horizon of reasonable expectations is also defined, in which equitable, publicly defensible solutions are stipulated for the demands that may arise from undeserved inequalities, undue advantages, resentments, or unjustified situations of power or domination. The discussions traced in this work correspond to topics such as egalitarian treatment, distributive rules and principles, the criteria of distribution and what is to be distributed, forms of leveling and compensation, and the political presuppositions required to guarantee egalitarianism in a democratic society. In these matters, among others, we bear in mind that behind the multiple approaches we take to the subject of equality, we not only point out its varied faces but above all seek to make a modest contribution to the democratic construction of a less unequal society.
4

Gleń-Karolczyk, Katarzyna. Zabiegi ochronne kształtujące plonowanie zdrowotność oraz różnorodność mikroorganizmów związanych z czernieniem pierścieniowym korzeni chrzanu (Atmoracia rusticana Gaertn.). Publishing House of the University of Agriculture in Krakow, 2019. http://dx.doi.org/10.15576/978-83-66602-39-7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Horseradish roots, owing to their content of many valuable nutrients and substances with healing and pro-health properties, are used increasingly in medicine, the food industry, and cosmetics. In Poland, horseradish is considered a minor crop, and the limited scale of its cultivation means that horseradish producers encounter a number of unresolved agrotechnical problems. Infectious diseases developing on the leaves and roots during the long growing season reduce the size and quality of root crops. The small range of protection products approved for use in horseradish cultivation generates further serious environmental problems (immunization of pathogens, low effectiveness, deterioration of the quality of raw materials intended for industry, destruction of beneficial organisms and biodiversity). To address the problems encountered by horseradish producers, and given the lack of data on yields, the occurrence of infectious diseases, and the possibility of combating them with alternatives to chemical methods, rigorous experiments were carried out in the years 2012–2015. The work compares the impact of chemical protection and its reduced variants with that of biological protection on the total yield of horseradish roots and its structure. The intensification of infectious diseases on horseradish leaves and roots was analyzed extensively, and correlations were examined between individual diseases, total yield, and the separated root fractions. A very important and innovative part of the work was the study of the microbial communities involved in the epidemiology of Verticillium wilt of horseradish roots. The study examined the effect of treating horseradish cuttings with a biological preparation (Pythium oligandrum), a chemical preparation (thiophanate-methyl), and the Kelpak SL biostimulator (auxins and cytokinins from the Ecklonia maxima algae) on the quantitative and qualitative changes occurring in these microbial communities.
The affiliation of species to frequency groups was arranged hierarchically, and the biodiversity of these communities was expressed by the following indicators: the Simpson index, the Shannon–Wiener index, the Shannon evenness index, and the species richness index. Correlations were assessed between the abundance of these communities, the indicators of their biodiversity, and the intensification of Verticillium wilt of horseradish roots. The total yield of horseradish roots averaged 126 dt · ha–1. Within its structure, the main root accounted for 56%, the fraction of lateral roots (cuttings) longer than 20 cm for 26%, those shorter than 20 cm for 12%, and unprofitable yield (waste) for 6%. In years with higher humidity, the total root yield was higher than in the dry seasons by around 51 dt · ha–1 on average. The applied protection treatments significantly increased the total yield of horseradish roots, by 4.6 to 45.3 dt · ha–1, as well as the share of fractions longer than 30 cm. Higher yield effects were obtained in the variants with a reduced amount of foliar fungicide application in favor of biopreparations and biostimulators (R1, R2, R3) and in chemical protection (Ch) than in biological protection (B1, B2) or when treatments were limited to the treatment of cuttings. The largest increases can be expected after treating the cuttings with Topsin M 500 SC and spraying the leaves once with Amistar Opti 480 SC, once with Polyversum WP, once with Timorex Gold 24 EC, and three times with biostimulators (2 × Kelpak SL + 1 × Tytanit). In view of the increasing water deficit, among the biological protection methods the B2 variant, with the treatment of cuttings with the auxins and cytokinins contained in the E. maxima algae extract, is recommended over B1, which involves the use of P. oligandrum spores.
White rust posed the biggest threat on horseradish plantations, while Phoma leaf spot, Cylindrosporium disease, Alternaria black spot, and Verticillium wilt occurred to a lesser extent. On the surface of the roots, the main problem was dry root rot, and inside the roots, Verticillium wilt. The best health of the leaves and roots was ensured by full chemical protection (cuttings treatment + 6 foliar applications). A similar level of protection against Albugo candida and Pyrenopeziza brassicae was achieved when chemical protection was reduced to one foliar treatment with a synthetic fungicide, two treatments with biological preparations (Polyversum WP and Timorex Gold 24 EC), and three treatments with biostimulators (2 × Kelpak SL, 1 × Tytanit). A level of root disease control comparable to chemical protection was ensured by its reduced variants R3 and R2 and, in the case of dry root rot, also by both variants of biological protection. In the dry years, over 60% of the roots showed symptoms of Verticillium wilt, whose main culprits were Verticillium dahliae (37.4%), Globisporangium irregulare (7.2%), Ilyonectria destructans (7.0%), Fusarium acuminatum (6.7%), Rhizoctonia solani (6.0%), Epicoccum nigrum (5.4%), and Alternaria brassicae (5.17%). The Kelpak SL biostimulator and the Polyversum WP biological preparation contributed to an increased biodiversity of the microbial communities associated with Verticillium wilt of horseradish roots, and as that biodiversity increased, the intensification of the disease symptoms decreased. There was a significant correlation between species richness in the communities of microbial isolates and the intensification of Verticillium wilt of horseradish roots: each additional species of microorganism reduced disease intensification by 1.19%.
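The biodiversity indicators named in the abstract have standard closed forms. The sketch below computes them under common textbook conventions (the monograph may use different variants, e.g. 1 − D for Simpson or a different richness formula), given per-species isolate counts:

```python
import math

def diversity_indices(counts):
    """Compute common biodiversity indicators from per-species counts:
    Shannon-Wiener H' = -sum p_i ln p_i, Simpson dominance D = sum p_i^2,
    Shannon evenness J = H' / ln S, Margalef richness d = (S - 1) / ln N,
    where S is the number of species and N the total count."""
    N = sum(counts)
    S = len(counts)
    props = [c / N for c in counts]
    shannon = -sum(p * math.log(p) for p in props if p > 0)
    simpson = sum(p * p for p in props)
    evenness = shannon / math.log(S) if S > 1 else 0.0
    richness = (S - 1) / math.log(N) if N > 1 else 0.0
    return {"shannon": shannon, "simpson": simpson,
            "evenness": evenness, "richness": richness}
```

A perfectly even community (equal counts for every species) gives evenness 1 and, for four species, Simpson dominance 0.25.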
5

Cuevas Arenas, Héctor, Caroline Cunill, Daniela Vásquez Pino, Fredy A. Montoya López, and Paula Daza Tobasura. Conflictos indígenas ante la justicia colonial: los hilos entrelazados de una compleja trama social y legal, siglos XVI-XVIII. Editorial Universidad Santiago de Cali, 2020. http://dx.doi.org/10.35985/9789585147614.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The experiences of the rule the Hispanic Crown exercised over the West Indies were extremely complex and diverse, as were the discourses that made those actions intelligible. Much more remains to be said and investigated about the actors, their practices, values, and attitudes over such a long period and across such vast and diverse territories. Fortunately, what is called the "colonial," "viceregal," or "monarchic" period in America constitutes a fertile field in which there are more questions than answers, so that new issues constantly arise to feed disciplinary and interdisciplinary topics, problems, and debates. This feeds back into and stimulates the production of historians and confirms the relevance of historical knowledge within social and anthropological studies. With this understanding of the past, it is possible to elaborate revisions and critiques that renew existing notions not only of distant pasts but also of possible futures. It can thus be affirmed that historiographic analysis and interpretation will never be exhausted, stimulating the emergence of cumulatively more diverse explanations of the past in constant dialogue with the understanding of the present. The past often presents itself as a heterogeneous, dispersed, and uneven fabric, made up of multiple threads that run through it and give it different shades. Seen this way, studying the dynamics of the sixteenth to nineteenth centuries requires a delimitation that articulates spaces, actors, processes, and themes in an operative way, so as to avoid an unnecessary dispersion that would blur the scope of the synthesis that every look at the past must achieve. This compilation takes on that challenge.
One of its threads is justice as a value and a practice of government: the juridical order was part of the moral order, and both involved the theological, the social, and the juridical (Traslosheros, 2011, p. 14).

Book chapters on the topic "Variants of the p-Center problem":

1

Ales, Zacharie, and Sourour Elloumi. "Compact MILP Formulations for the p-Center Problem." In Lecture Notes in Computer Science, 14–25. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-96151-4_2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Bai, Chunsong, Liying Kang, and Erfang Shan. "The Connected p-Center Problem on Cactus Graphs." In Combinatorial Optimization and Applications, 718–25. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-48749-6_53.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Yang, Kai, Yankui Liu, and Xin Zhang. "Stochastic p-Hub Center Problem with Discrete Time Distributions." In Advances in Neural Networks – ISNN 2011, 182–91. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-21090-7_22.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Quevedo-Orozco, Dagoberto R., and Roger Z. Ríos-Mercado. "A New Heuristic for the Capacitated Vertex p-Center Problem." In Advances in Artificial Intelligence, 279–88. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-40643-0_29.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

De Walsche, Niels, Carlo S. Sartori, and Hatice Çalık. "A Radius-Based Approach for the Bi-Objective p-Center and p-Dispersion Problem." In Lecture Notes in Computer Science, 533–49. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-43612-3_33.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Chen, Li-Hsuan, Sun-Yuan Hsieh, Ling-Ju Hung, and Ralf Klasing. "The Approximability of the p-hub Center Problem with Parameterized Triangle Inequality." In Lecture Notes in Computer Science, 112–23. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-62389-4_10.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Chen, Li-Hsuan, Sun-Yuan Hsieh, Ling-Ju Hung, and Ralf Klasing. "Approximation Algorithms for the p-Hub Center Routing Problem in Parameterized Metric Graphs." In Lecture Notes in Computer Science, 115–27. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-94667-2_10.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Chen, Li-Hsuan, Sun-Yuan Hsieh, Ling-Ju Hung, Ralf Klasing, Chia-Wei Lee, and Bang Ye Wu. "On the Complexity of the Star p-hub Center Problem with Parameterized Triangle Inequality." In Lecture Notes in Computer Science, 152–63. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-57586-5_14.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Ding, Wei, and Ke Qiu. "Constant-Factor Greedy Algorithms for the Asymmetric p-Center Problem in Parameterized Complete Digraphs." In Algorithmic Aspects in Information and Management, 62–71. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-27195-4_6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Londe, Mariana A., Luciana S. Pessoa, and Carlos E. Andrade. "The P-Next Center Problem with Capacity and Coverage Radius Constraints: Model and Heuristics." In Metaheuristics, 335–49. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-26504-4_24.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Variants of the p-Center problem":

1

de Weerdt, Mathijs, Michael Albert, Vincent Conitzer, and Koos van der Linden. "Complexity of Scheduling Charging in the Smart Grid." In Twenty-Seventh International Joint Conference on Artificial Intelligence {IJCAI-18}. California: International Joint Conferences on Artificial Intelligence Organization, 2018. http://dx.doi.org/10.24963/ijcai.2018/658.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The problem of optimally scheduling the charging demand of electric vehicles within the constraints of the electricity infrastructure is called the charge scheduling problem. The models of the charging speed, horizon, and charging demand determine the computational complexity of the charge scheduling problem. We show that for about 20 variants the problem is either in P or weakly NP-hard and dynamic programs exist to compute optimal solutions. About 10 other variants of the problem are strongly NP-hard, presenting a potentially significant obstacle to their use in practical situations of scale. An experimental study establishes up to what parameter values the dynamic programs can determine optimal solutions in a couple of minutes.
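To give a feel for the tractable end of this spectrum, here is a hypothetical sketch of a simple charge scheduling variant (not one of the paper's dynamic programs): vehicles charge at unit speed in discrete time slots, the grid admits at most `capacity` charging vehicles per slot, and an earliest-deadline-first rule checks whether all demands can be met. The job format `(arrival, deadline, demand)` is an assumption of this sketch:

```python
def edf_feasible(jobs, capacity):
    """Greedy earliest-deadline-first feasibility check for a toy charge
    scheduling variant. Each job is (arrival, deadline, demand) in whole
    time slots with unit charging speed; `capacity` bounds how many
    vehicles may charge simultaneously in one slot."""
    horizon = max(deadline for _, deadline, _ in jobs)
    remaining = [list(j) for j in jobs]
    for t in range(horizon):
        # Among vehicles present at slot t with outstanding demand,
        # charge those with the nearest deadlines, up to capacity.
        active = [j for j in remaining if j[0] <= t < j[1] and j[2] > 0]
        active.sort(key=lambda j: j[1])
        for j in active[:capacity]:
            j[2] -= 1
    return all(j[2] == 0 for j in remaining)
```

Two vehicles each needing one slot within a two-slot window fit on a single charger; two vehicles each needing two slots do not.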
2

Shrinidhi, M., T. K. Kaushik Jegannathan, and R. Jeya. "Classification of Imbalanced Datasets Using Various Techniques along with Variants of SMOTE Oversampling and ANN." In International Research Conference on IOT, Cloud and Data Science. Switzerland: Trans Tech Publications Ltd, 2023. http://dx.doi.org/10.4028/p-338i7w.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Using machine learning and/or deep learning for the early detection of diseases can help save people's lives. AI has already been making progress in healthcare, with newer and improved software to maintain patient records and produce better imaging for error-free diagnosis and treatment. One drawback of working with real-life datasets is that they are predominantly imbalanced in nature, while most ML and DL algorithms are designed with the assumption that the dataset is equally distributed. Training on such imbalanced datasets causes the models to end up with high type-1 and type-2 error rates, which is not ideal in the medical field, where a misdiagnosis can be fatal. Handling class imbalance thus becomes a necessity, lest the ML/DL model fail to learn and start memorizing the features and noise belonging to the majority class. The PIMA dataset is one such dataset with class imbalance, containing 500 instances of one type and 268 of another. Similarly, the Wisconsin Breast Cancer (Original) dataset contains imbalanced breast cancer data with a total of 699 instances, of which 458 belong to one class (benign tumor images) and 241 to the other (malignant tumor images). Prediction or detection of the onset of diabetes or breast cancer with these datasets would be grossly erroneous, hence the need for handling class imbalance. In this study, we aim to handle the class imbalance problem using various available techniques, such as the weighted-class approach and SMOTE (and its variants), with a simple artificial neural network model as the classifier.
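The core idea of SMOTE referenced above can be sketched in a few lines of plain Python. This is a simplified, illustrative version (a real study would use a library implementation such as imbalanced-learn): each synthetic minority sample is a random interpolation between a minority point and one of its k nearest minority-class neighbours.

```python
import random

def smote(minority, n_new, k=5, seed=0):
    """Minimal SMOTE sketch: generate n_new synthetic minority samples by
    interpolating a randomly picked minority point with one of its k
    nearest minority-class neighbours (Euclidean distance)."""
    rng = random.Random(seed)

    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    synthetic = []
    for _ in range(n_new):
        x = rng.choice(minority)
        neighbours = sorted((m for m in minority if m is not x),
                            key=lambda m: dist2(x, m))[:k]
        nb = rng.choice(neighbours)
        lam = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(tuple(xi + lam * (ni - xi)
                               for xi, ni in zip(x, nb)))
    return synthetic
```

Because each synthetic point lies on the segment between two existing minority points, it stays inside the convex hull of the minority class.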
3

Hariharan, Smitha, and Venkat Allada. "Uncertain Demand Driven Resource Platform Design for a Service Center." In ASME 2005 International Mechanical Engineering Congress and Exposition. ASMEDC, 2005. http://dx.doi.org/10.1115/imece2005-81191.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Service centers can be viewed as facilities wherein service calls from customers relating to various service families cascade into work packages involving a set of service tasks. A service family is defined as a set of service-type variants that have similar or common service tasks and hence may use a common set of resources (called the resource platform) over a given time horizon. In this paper, we present a methodology for determining cost-effective and robust resource platform configurations for a given set of service families offered by a service center. The robust resource platform is able to handle demand fluctuations from the various service types within reasonable limits over a given planning horizon. Several successful studies in the manufacturing domain have reported the application of product platforms to generate customizable product variants with cost advantages. In this paper, we extend the product platform concept to the service domain. The critical parts of the proposed Service Resource Platforming System (SRPS) methodology are: (1) generate a rough-cut resource selection using dynamic programming and create a resource schedule using linear programming; (2) generate the final resource selection using the uncertainty linear programming model proposed by Ben-Tal et al. (2003); and (3) construct the resource platform using resource sub-sequence clustering. The objective of the rough-cut and final resource selections is to maximize the difference between the benefits and costs associated with in-house service processing and outsourcing. The proposed SRPS methodology is applied to an industry-motivated problem.
4

Chan, Hau, Aris Filos-Ratsikas, Bo Li, Minming Li, and Chenhao Wang. "Mechanism Design for Facility Location Problems: A Survey." In Thirtieth International Joint Conference on Artificial Intelligence {IJCAI-21}. California: International Joint Conferences on Artificial Intelligence Organization, 2021. http://dx.doi.org/10.24963/ijcai.2021/596.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The study of approximate mechanism design for facility location has been at the center of research at the intersection of artificial intelligence and economics for the last decade, largely due to its practical importance in various domains, such as social planning and clustering. At a high level, the goal is to select a number of locations on which to build a set of facilities, aiming to optimize some social objective based on the preferences of strategic agents, who might have incentives to misreport their private information. This paper presents a comprehensive survey of the significant progress that has been made since the introduction of the problem, highlighting all the different variants and methodologies, as well as the most interesting directions for future research.
5

Mousa, Amr, and Gerhard Benedikt Weiss. "Advanced Energy Management Strategies for Plug-In Hybrid Electric Vehicles via Deep Reinforcement Learning." In SAE 2022 Intelligent and Connected Vehicles Symposium. 400 Commonwealth Drive, Warrendale, PA, United States: SAE International, 2022. http://dx.doi.org/10.4271/2022-01-7109.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Plug-in Hybrid Electric Vehicles (PHEVs) achieve significant fuel economy by utilizing advanced energy management strategies to control the power distribution decision in real time. Traditional heuristic approaches bring no additional benefits, including efficiency and development cost, considering the increasing complexity of control objectives. This paper extends a previous reinforcement learning (RL) study of the same problem and vehicle topology by investigating the performance of state-of-the-art algorithms, such as Rainbow-DQN and its variants, PPO, and A3C, against baseline rule-based and Dynamic Programming (DP) strategies. The developed RL agent optimizes challenging control objectives such as fuel economy, vehicle drivability, and driver comfort. The Rainbow-DQN is first studied separately to optimize the agent across all the algorithm variants, and afterwards the best-performing variant is compared to tuned PPO and A3C agents. Proper evaluation criteria are defined, and the agents are tested with nine different scenarios to examine their generalization capabilities and performance robustness. The results revealed that the A3C agent surpassed both the PPO and the Rainbow-DQN agents, achieving a maximum performance of 98.43% of the DP benchmark with a robustness of 97.32% ± 0.78 for the other cycles, and an average of 177.7 sec for each engine start compared to 96.3 sec for the rule-based approach. Furthermore, as future work, the paper investigates and proposes a cloud-based training concept for automated, scaled-up training, evaluation, and deployment of RL policies for the (P)HEVs of the future.
6

Yen, William Chung-Kung, and Chien-Tsai Chen. "The Connected p-Center Problem with Extension." In 9th Joint Conference on Information Sciences. Paris, France: Atlantis Press, 2006. http://dx.doi.org/10.2991/jcis.2006.216.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Zhang, Qingyun, Zhipeng Lü, Zhouxing Su, Chumin Li, Yuan Fang, and Fuda Ma. "Vertex Weighting-Based Tabu Search for p-Center Problem." In Twenty-Ninth International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence {IJCAI-PRICAI-20}. California: International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/ijcai.2020/206.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The p-center problem consists of choosing p centers from a set of candidates so as to minimize the maximum cost between any client and its assigned facility. In this paper, we transform the p-center problem into a series of set covering subproblems and propose a vertex weighting-based tabu search (VWTS) algorithm to solve them. The proposed VWTS algorithm integrates distinguishing features, such as a vertex weighting technique and a tabu search strategy, to help the search escape local optima. Computational experiments on the 138 most commonly used benchmark instances show that VWTS is highly competitive with the state-of-the-art methods in spite of its simplicity. For a well-known NP-hard problem that has been studied for over half a century, it is a challenging task to break the records on these classic datasets. Yet VWTS improves the best known results for 14 out of 54 large instances and matches the optimal results for all of the remaining 84. In addition, the computational time taken by VWTS is much shorter than that of other algorithms in the literature.
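The radius/covering decomposition described in the abstract can be sketched as follows. This is an illustrative simplification: a plain greedy heuristic stands in for the paper's vertex-weighting tabu search when solving each covering subproblem, so the returned radius is only an upper bound on the optimum in general.

```python
def p_center_by_covering(dist, p):
    """Solve p-center by scanning candidate radii in increasing order.
    For each radius r, ask whether p balls N_r(c) = {v : dist[v][c] <= r}
    cover all clients, here via a greedy max-coverage heuristic."""
    n = len(dist)
    radii = sorted({dist[i][j] for i in range(n) for j in range(n)})
    for r in radii:
        uncovered, centers = set(range(n)), []
        while uncovered and len(centers) < p:
            # Greedy: open the centre covering the most uncovered clients.
            best_c = max(range(n),
                         key=lambda c: sum(dist[v][c] <= r
                                           for v in uncovered))
            centers.append(best_c)
            uncovered -= {v for v in uncovered if dist[v][best_c] <= r}
        if not uncovered:
            return r, centers
    return None
```

A binary search over `radii` would be the usual refinement; the linear scan keeps the sketch short.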
8

Ferone, Daniele, Paola Festa, Antonio Napoletano, and Mauricio G. C. Resende. "On the fast solution of the p-center problem." In 2017 19th International Conference on Transparent Optical Networks (ICTON). IEEE, 2017. http://dx.doi.org/10.1109/icton.2017.8024978.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Shariff, S. Sarifah Radiah, Mohd Omar, and Noor Hasnah Moin. "Location Routing Inventory Problem with Transshipment Points Using p-Center." In 2016 International Conference on Industrial Engineering, Management Science and Application (ICIMSA). IEEE, 2016. http://dx.doi.org/10.1109/icimsa.2016.7504016.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Bashiri, M., and S. Mehrabi. "Stochastic p-hub center covering problem with delivery time constraint." In EM). IEEE, 2010. http://dx.doi.org/10.1109/ieem.2010.5674340.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Variants of the p-Center problem":

1

Moreno, Viviana Carolina, Ximena Castro, Claudia Marcela Muñoz, and Giomar Sichaca Ávila. Situación de salud pública y migración en tiempos de pandemia, Necoclí, Antioquia, 2021. Instituto Nacional de Salud, January 2022. http://dx.doi.org/10.33610/01229907.2022v4n1a1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Introduction: Necoclí is a municipality in the department of Antioquia, located on the Atlantic coast, with 70,824 inhabitants. Since June 2021, a build-up of migrants from South America and Africa has been identified there. This article presents some factors that may favor the occurrence of events of public health interest. The objective of this study was to characterize the public health and migration situation in Necoclí. Materials and methods: a descriptive study; to establish the epidemiological situation, events of public health interest were prioritized and unusual patterns were identified. Active institutional case-finding, real-time PCR for SARS-CoV-2 detection, genomic surveillance, and georeferencing of migrant settlements were carried out. Endemic channels, frequency distributions, measures of central tendency and dispersion, and SARS-CoV-2 positivity rates were calculated. Results: approximately 14,500 migrants were identified living in overcrowded conditions, located mainly in the Caribe and Simón Bolívar neighborhoods. Unusual patterns were observed for malaria, dengue, and acute respiratory infection (Poisson = 0.00), with variation >30%. Ninety-eight cases not reported to the surveillance system were identified, as was a malaria outbreak with 91 cases. A total of 299 samples were taken for COVID-19 testing (median age 40 years, IQR: 29–51); 14 COVID-19 cases were confirmed (positivity of 4.7%), and the variant of interest Mu (B.1.621) and the variant of concern Gamma (P.1.7) were sequenced. Actions were established to address sanitary problems such as the shortage of drinking water, waste management, and overcrowding; the strengthening of surveillance and strategies to expedite the transit of migrants were also supported. Conclusions: a build-up of migrants, deficient hygienic-sanitary conditions, and outbreaks of COVID-19 and malaria were identified.
Comprehensive actions at the national, departmental, and municipal levels, with community participation, must continue.
2

Vail, Kylin, Bret Lizundia, David Welch, and Evan Reis. Earthquake Damage Workshop (PEER-CEA Project). Pacific Earthquake Engineering Research Center, University of California, Berkeley, CA, November 2020. http://dx.doi.org/10.55461/plbd5536.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
This report is one of a series of reports documenting the methods and findings of a multi-year, multi-disciplinary project coordinated by the Pacific Earthquake Engineering Research Center (PEER) and funded by the California Earthquake Authority (CEA). The overall project is titled “Quantifying the Performance of Retrofit of Cripple Walls and Sill Anchorage in Single-Family Wood-Frame Buildings,” henceforth referred to as the “PEER–CEA Project.” The overall objective of the PEER–CEA Project is to provide scientifically based information (e.g., testing, analysis, and resulting loss models) that measure and assess the effectiveness of seismic retrofit to reduce the risk of damage and associated losses (repair costs) of wood-frame houses with cripple wall and sill anchorage deficiencies as well as retrofitted conditions that address those deficiencies. Tasks that support and inform the loss-modeling effort are: (1) collecting and summarizing existing information and results of previous research on the performance of wood-frame houses; (2) identifying construction features to characterize alternative variants of wood-frame houses; (3) characterizing earthquake hazard and ground motions at representative sites in California; (4) developing cyclic loading protocols and conducting laboratory tests of cripple wall panels, wood-frame wall subassemblies, and sill anchorages to measure and document their response (strength and stiffness) under cyclic loading; and (5) the computer modeling, simulations, and the development of loss models as informed by a workshop with claims adjustors. This report is a product of Working Group 6 (WG6): Interaction with Claims Adjustors & Catastrophe Modelers and focuses on a damage workshop effort undertaken to provide repair estimates of representative damaged single-family wood-frame case study buildings to compare the differences in costs between houses with and without retrofits to cripple walls and sill anchorage.
At the request of the CEA, 11 experienced claims adjustors from insurance companies volunteered to provide the estimates. Electronic cost estimation files for each case study building were developed by the PEER–CEA Project Team using the Verisk Xactware Xactimate X1 platform and provided to the claims adjustors to complete their estimates. These adjustor estimates served as the baseline for comparison against the FEMA P-58 [FEMA 2012] methodology used on the project for loss estimation. The term “damage workshop effort” is used to emphasize that the scope of work included not just a successful workshop meeting, but the broader development of a damage description package describing case studies and associated Xactimate descriptions before the workshop meeting and revisions after it, two rounds of estimates and survey question responses by adjustors, interpretation and clarification of the estimates for consistency, and synthesizing of estimate findings and survey responses into conclusions and recommendations. Three building types were investigated, each with an unretrofitted and a retrofitted condition. These were then assessed at four levels of damage, resulting in a total of 24 potential scenarios. Because of similarities, only 17 scenarios needed unique Xactimate estimates. Each scenario was typically estimated by three to five adjustors, resulting in a final total of 74 different estimates.
3

Schiller, Brandon, Tara Hutchinson, and Kelly Cobeen. Cripple Wall Small-Component Test Program: Wet Specimens I (PEER-CEA Project). Pacific Earthquake Engineering Research Center, University of California, Berkeley, CA, November 2020. http://dx.doi.org/10.55461/dqhf2112.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
This report is one of a series of reports documenting the methods and findings of a multi-year, multi-disciplinary project coordinated by the Pacific Earthquake Engineering Research Center (PEER) and funded by the California Earthquake Authority (CEA). The overall project is titled “Quantifying the Performance of Retrofit of Cripple Walls and Sill Anchorage in Single-Family Wood-Frame Buildings,” henceforth referred to as the “PEER–CEA Project.” The overall objective of the PEER–CEA Project is to provide scientifically based information (e.g., testing, analysis, and resulting loss models) that measures and assesses the effectiveness of seismic retrofit to reduce the risk of damage and associated losses (repair costs) of wood-frame houses with cripple wall and sill anchorage deficiencies, as well as retrofitted conditions that address those deficiencies. Tasks that support and inform the loss-modeling effort are: (1) collecting and summarizing existing information and results of previous research on the performance of wood-frame houses; (2) identifying construction features to characterize alternative variants of wood-frame houses; (3) characterizing earthquake hazard and ground motions at representative sites in California; (4) developing cyclic loading protocols and conducting laboratory tests of cripple wall panels, wood-frame wall subassemblies, and sill anchorages to measure and document their response (strength and stiffness) under cyclic loading; and (5) computer modeling, simulations, and the development of loss models as informed by a workshop with claims adjustors. This report is a product of Working Group 4: Testing, and focuses on the first phase of an experimental investigation to study the seismic performance of retrofitted and existing cripple walls with sill anchorage. Paralleled by a large-component test program conducted at the University of California, Berkeley [Cobeen et al. 2020], the present study involves the first of multiple phases of small-component tests conducted at UC San Diego. Details representative of era-specific construction, specifically the most vulnerable pre-1960s construction, are the predominant focus of the present effort. Parameters examined are cripple wall height, finish materials, gravity load, boundary conditions, anchorage, and deterioration. This report addresses the first phase of testing, which consisted of six specimens. Phase 1 included quasi-static reversed cyclic lateral load testing of six 12-ft-long, 2-ft-high cripple walls. All specimens in this phase were finished on their exterior with stucco over horizontal sheathing (referred to as a “wet” finish), a finish noted to be common to dwellings built in California before 1945. Parameters addressed in this first phase include: boundary conditions on the top, bottom, and corners of the walls; attachment of the sill to the foundation; and the retrofitted condition. Details of the test specimens, testing protocol, and instrumentation, as well as measured data and physical observations, are summarized in this report. In addition, this report discusses the rationale and scope of subsequent small-component test phases. Companion reports present these test phases considering, amongst other variables, the impacts of dry finishes and cripple wall height (Phases 2–4). Results from these experiments are intended to provide an experimental basis to support numerical modeling used to develop loss models, which are intended to quantify the reduction of loss achieved by applying state-of-practice retrofit methods as identified in FEMA P-1100, Vulnerability-Based Seismic Assessment and Retrofit of One- and Two-Family Dwellings.
4

Schiller, Brandon, Tara Hutchinson, and Kelly Cobeen. Cripple Wall Small-Component Test Program: Wet Specimens II (PEER-CEA Project). Pacific Earthquake Engineering Research Center, University of California, Berkeley, CA, November 2020. http://dx.doi.org/10.55461/ldbn4070.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
This report is one of a series of reports documenting the methods and findings of a multi-year, multi-disciplinary project coordinated by the Pacific Earthquake Engineering Research Center (PEER) and funded by the California Earthquake Authority (CEA). The overall project is titled “Quantifying the Performance of Retrofit of Cripple Walls and Sill Anchorage in Single-Family Wood-Frame Buildings,” henceforth referred to as the “PEER–CEA Project.” The overall objective of the PEER–CEA Project is to provide scientifically based information (e.g., testing, analysis, and resulting loss models) that measures and assesses the effectiveness of seismic retrofit to reduce the risk of damage and associated losses (repair costs) of wood-frame houses with cripple wall and sill anchorage deficiencies, as well as retrofitted conditions that address those deficiencies. Tasks that support and inform the loss-modeling effort are: (1) collecting and summarizing existing information and results of previous research on the performance of wood-frame houses; (2) identifying construction features to characterize alternative variants of wood-frame houses; (3) characterizing earthquake hazard and ground motions at representative sites in California; (4) developing cyclic loading protocols and conducting laboratory tests of cripple wall panels, wood-frame wall subassemblies, and sill anchorages to measure and document their response (strength and stiffness) under cyclic loading; and (5) computer modeling, simulations, and the development of loss models as informed by a workshop with claims adjustors. This report is a product of Working Group 4 (WG4): Testing, whose central focus was to experimentally investigate the seismic performance of retrofitted and existing cripple walls. This report focuses on stucco, or “wet,” exterior finishes. Paralleled by a large-component test program conducted at the University of California, Berkeley (UC Berkeley) [Cobeen et al. 2020], the present study involves two of multiple phases of small-component tests conducted at the University of California San Diego (UC San Diego). Details representative of era-specific construction, specifically the most vulnerable pre-1960s construction, are the predominant focus of the present effort. Parameters examined are cripple wall height, finish style, gravity load, boundary conditions, anchorage, and deterioration. This report addresses the third phase of testing, which consisted of eight specimens, as well as half of the fourth phase of testing, which consisted of six specimens, three of which are discussed here. Although conducted in different phases, their results are combined here to co-locate observations regarding the behavior of the second phase of wet (stucco)-finished specimens. The results of the first phase of wet-specimen tests were presented in Schiller et al. [2020(a)]. Experiments involved imposition of combined vertical loading and quasi-static reversed cyclic lateral load onto ten cripple walls, each 12 ft long and either 2 or 6 ft high. One cripple wall was tested with a monotonic loading protocol. All specimens in this report were constructed with the same boundary conditions on the top and corners of the walls and were tested with the same vertical load. Parameters addressed in this report include: wet exterior finishes (stucco over framing, stucco over horizontal lumber sheathing, and stucco over diagonal lumber sheathing), cripple wall height, loading protocol, anchorage condition, boundary condition at the bottom of the walls, and the retrofitted condition. Details of the test specimens and testing protocol, including instrumentation, as well as measured data and physical observations, are summarized in this report. Companion reports present phases of the tests considering, amongst other variables, the impacts of various boundary conditions, stucco (wet) and non-stucco (dry) finishes, vertical load, cripple wall height, and anchorage condition. Results from these experiments are intended to support advancement of numerical modeling tools, which ultimately will inform seismic loss models capable of quantifying the reduction of loss achieved by applying state-of-practice retrofit methods as identified in FEMA P-1100, Vulnerability-Based Seismic Assessment and Retrofit of One- and Two-Family Dwellings.
5

Schiller, Brandon, Tara Hutchinson, and Kelly Cobeen. Comparison of the Response of Small- and Large-Component Cripple Wall Specimens Tested under Simulated Seismic Loading (PEER-CEA Project). Pacific Earthquake Engineering Research Center, University of California, Berkeley, CA, November 2020. http://dx.doi.org/10.55461/iyca1674.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
This report is one of a series of reports documenting the methods and findings of a multi-year, multi-disciplinary project coordinated by the Pacific Earthquake Engineering Research Center (PEER) and funded by the California Earthquake Authority (CEA). The overall project is titled “Quantifying the Performance of Retrofit of Cripple Walls and Sill Anchorage in Single-Family Wood-Frame Buildings,” henceforth referred to as the “PEER–CEA Project.” The overall objective of the PEER–CEA Project is to provide scientifically based information (e.g., testing, analysis, and resulting loss models) that measures and assesses the effectiveness of seismic retrofit to reduce the risk of damage and associated losses (repair costs) of wood-frame houses with cripple wall and sill anchorage deficiencies, as well as retrofitted conditions that address those deficiencies. Tasks that support and inform the loss-modeling effort are: (1) collecting and summarizing existing information and results of previous research on the performance of wood-frame houses; (2) identifying construction features to characterize alternative variants of wood-frame houses; (3) characterizing earthquake hazard and ground motions at representative sites in California; (4) developing cyclic loading protocols and conducting laboratory tests of cripple wall panels, wood-frame wall subassemblies, and sill anchorages to measure and document their response (strength and stiffness) under cyclic loading; and (5) computer modeling, simulations, and the development of loss models as informed by a workshop with claims adjustors. This report is a product of Working Group 4: Testing, whose central focus was to experimentally investigate the seismic performance of retrofitted and existing cripple walls. Two testing programs were conducted: the University of California, Berkeley (UC Berkeley) focused on large-component tests, and the University of California San Diego (UC San Diego) focused on small-component tests. The primary objectives of the tests were to develop descriptions of the load-deflection behavior of components and connections for use by Working Group 5 in developing numerical models, and to collect descriptions of damage at varying levels of drift for use by Working Group 6 in developing fragility functions. This report considers two large-component cripple wall tests performed at UC Berkeley and several small-component tests performed at UC San Diego that resembled the testing details of the large-component tests. Experiments involved imposition of combined vertical loading and quasi-static reversed cyclic lateral load on cripple wall assemblies. The details of the tests are representative of era-specific construction, specifically the most vulnerable pre-1945 construction. All cripple walls tested were 2 ft high and finished with stucco over horizontal lumber sheathing. Specimens were tested in both the retrofitted and unretrofitted condition. The large-component tests were constructed as three-dimensional components (with a 20-ft × 4-ft floor plan) and included the cripple wall and a single-story superstructure above. The small-component tests were constructed as 12-ft-long two-dimensional components and included only the cripple wall. The pairing of small- and large-component tests was considered in order to determine the following: (1) how closely small-component specimen response could emulate the response of the large-component specimens; and (2) which boundary conditions in the small-component specimens led to the best match with the response of the large-component specimens. The answers to these questions are intended to help identify best practices for the future design of cripple walls in residential housing, with particular interest in: (1) supporting the realistic design of small-component specimens that can capture large-component specimen response; and (2) qualitatively determining where the small-component tests fall in the range of lower- to upper-bound estimates of strength and deformation capacity for the purposes of numerical modeling. Through these comparisons, the experiments will ultimately advance numerical modeling tools, which will in turn help generate seismic loss models capable of quantifying the reduction of loss achieved by applying state-of-practice retrofit methods as identified in FEMA P-1100, Vulnerability-Based Seismic Assessment and Retrofit of One- and Two-Family Dwellings. To this end, details of the test specimens, measured data and physical observations, and comparisons between the two test programs are summarized in this report.
6

Vargas-Herrera, Hernando, Juan José Ospina, Carlos Alfonso Huertas-Campos, Adolfo León Cobo-Serna, Edgar Caicedo-García, Juan Pablo Cote-Barón, Nicolás Martínez-Cortés, et al. Informe de Política Monetaria - Julio de 2021. Banco de la República de Colombia, August 2021. http://dx.doi.org/10.32468/inf-pol-mont-eng.tr3.-2021.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
1.1 Macroeconomic Summary. In the second quarter, the economy faced several shocks, mainly to supply and costs, most of which were not anticipated or proved more persistent than expected. Together, they interrupted the recovery in economic activity observed at the beginning of the year and pushed headline inflation above the target. Core inflation (excluding food and regulated items) rose but remained low and in line with the technical staff's expectations. In early April, a third wave of the pandemic began, more severe and prolonged than the previous one, with a high cost in human lives and some negative impact on the economic recovery. Between May and mid-June, road blockades and public-order disturbances had a strong negative effect on economic activity and inflation. The combined magnitude of these two shocks is estimated to have produced a decline in the level of gross domestic product (GDP) relative to the first quarter of the year. The blockades also caused a significant increase in food prices. Added to these shocks were the accumulated effects of global disruptions in some value chains and higher international freight costs, which since late 2020 have generated supply restrictions and cost increases. All of these factors, which mainly affected the consumer price index (CPI) for goods and food, explained most of the technical staff's forecast error and the rise of headline inflation above the 3% target. The increase in core inflation and in the prices of regulated items was in line with the technical staff's expectations and is explained mainly by the removal of several price-relief measures granted a year earlier.
To all of this is added a higher perception of sovereign risk and the upward pressure it implies for the cost of external financing and the exchange rate. Despite the strong negative shocks, expected economic growth for the first half of the year (9.1%) is significantly higher than projected in the April report (7.1%), a sign of a more dynamic economy that would recover faster than anticipated. Since late 2020, the various economic activity figures have shown higher-than-expected growth, suggesting that the negative effects on output of the recurring waves of contagion are becoming progressively weaker and shorter-lived. Nevertheless, the third wave of Covid-19 contagion, and to a greater extent the road blockades and public-order problems, would have produced a fall in GDP in the second quarter relative to the first. Even so, the April and May readings of the economic tracking indicator (ISE) came in above expectations, and new sectoral activity figures suggest that the pandemic's negative impact on output continues to moderate amid fewer mobility restrictions and faster progress in vaccination. Freight transport records (June) and unregulated energy demand (July), among other indicators, point to a significant recovery after the May blockades. With all of the above, annual GDP growth in the second quarter would have been around 17.3% (previously 15.8%), explained largely by a low base of comparison. For 2021 as a whole, the technical staff raised its growth projection from 6% to 7.5%. This forecast, which is surrounded by unusually high uncertainty, assumes no further public-order problems and that possible new waves of Covid-19 contagion will have no additional negative effects on economic activity.
Relative to the previous report's forecast, the recovery of external demand, the price levels of some commodities the country exports, and the dynamics of workers' remittances have been better than expected and would continue to drive the recovery of national income over the rest of the year. This would be reinforced by still-ample international liquidity, the acceleration of the vaccination process, and low interest rates, factors that would continue to favor economic activity. The better momentum of the first half, which led to an upward revision of the growth of all spending components, would continue going forward, and the economy would recover its 2019 production levels by the end of 2021, earlier than expected in April. The forecast continues to include short-term effects on aggregate demand from a tax reform of a magnitude similar to that projected by the Government. With all this, in the central scenario of this report, the growth forecast is 7.5% for 2021 and 3.1% for 2022. Even so, the level of economic activity would remain below its potential, and the improvement in these projections is surrounded by high uncertainty. In June, annual inflation (3.63%) rose more than expected owing to the behavior of the food group, while core inflation (1.87%) was similar to projections. Over the rest of the year, the higher level of the food CPI would persist and help keep inflation above the target. By the end of 2022, headline and core inflation would return to rates close to 3%, amid a slowdown in the food CPI and smaller excess productive capacity.
In recent months, higher international freight and agricultural-goods prices, together with greater beef exports and the cattle cycle, have exerted upward pressure on food prices, mainly processed foods. To these persistent forces were added the blockades of national roads and the public-order problems recorded in several cities in May and part of June, which translated into a severe supply restriction and an unexpected annual increase in the food CPI (8.52%). The regulated-items group (5.93%) also accelerated, owing to the low base of comparison in gasoline prices and the unwinding of part of the utility-rate relief granted in 2020. As projected, core inflation rebounded to 1.87%, driven by the reinstatement of indirect taxes on some goods and services that had been removed a year earlier, and by the upward pressure food prices exerted on meals outside the home, among other factors. Over the rest of the year, the increase in perishable-food prices is expected to reverse, provided there are no new prolonged blockades of national roads. The higher level of processed-food prices would persist and help keep inflation above the target at the end of the year. Core inflation would continue on an upward trend as excess productive capacity continues to close, and would register a transitory increase in March 2022, mainly due to the reinstatement of the consumption tax on meals outside the home. With all this, headline inflation is projected at 4.1% and 3.1%, and core inflation at 2.6% and 3.2%, for the end of 2021 and 2022, respectively.
The joint behavior of core CPI prices, together with continued upside surprises in economic activity, is interpreted by the technical staff as a signal of ample excess productive capacity in the economy. This excess capacity would persist over the next two years, at the end of which the output gap would close. Higher economic growth suggests a less negative output gap than estimated a quarter ago. However, the behavior of core inflation, especially in services, indicates that potential GDP has recovered surprisingly quickly and that excess capacity remains ample, with aggregate demand persistently affected. This interpretation finds support in the labor market, where unemployment remains high and the recovery of lost jobs has stalled. In addition, the increases in inflation are largely explained by supply and cost shocks and by the unwinding of some price relief granted a year earlier. The growth and inflation forecasts described are consistent with an output gap that closes faster and is less negative over the entire forecast horizon than in the April report. Nevertheless, uncertainty about excess capacity is very high and poses a risk to the forecast. The outlook for Colombia's fiscal accounts deteriorated: Standard & Poor's Global Ratings (S&P) and Fitch Ratings (Fitch) downgraded the country's credit rating, the blockades and public-order problems affected output, and the country faced a new wave of Covid-19 contagion more severe and prolonged than previous ones. All of this has been reflected in higher risk premia and a depreciation of the peso against the dollar, although it has occurred in a favorable environment for external income.
International prices of oil, coffee, and other commodities the country exports have risen, contributing to the recovery of the terms of trade and national income and mitigating upward pressure on risk premia and the exchange rate. In this report, the expected oil price was raised to USD 68 per barrel for 2021 (previously USD 61) and USD 66 per barrel for 2022 (previously USD 60). This higher path converges toward prices below those observed recently, as a result of a larger expected world oil supply, which would more than offset the increase in demand for this commodity. The recent price increase is therefore assumed to be transitory. The current macroeconomic scenario anticipates somewhat less favorable international financial conditions, despite the improvement in external income from stronger demand and higher prices for oil and other export products. Relative to the April report, external demand grew more than expected, and the projections for 2021 and 2022 were raised from 5.2% to 6.0% and from 3.4% to 3.5%, respectively. So far this year, economic activity figures show more dynamic external demand than expected. In the United States and China, the recovery of output has been faster than in the region's countries, where economic reactivation has been limited by Covid-19 outbreaks, constraints on vaccine supply, and limited fiscal space to confront the pandemic, among other factors. The strong momentum in external goods trade has occurred amid deteriorating value chains and significant increases in raw-material prices and freight costs.
In the United States, inflation surprised on the upside, its observed and expected values remain above target, and the economic growth projection was raised. As a result, the normalization of monetary policy in that country would begin earlier than projected. This report estimates that the first increase in the U.S. Federal Reserve's interest rate will occur in late 2022 (previously the first quarter of 2023). For Colombia, a higher risk premium is assumed than in the April report, and it is still expected to follow an upward trend given the country's accumulation of public and external debt. All of this would contribute to a higher cost of external financing over the forecast horizon. The expansionary stance of monetary policy continues to support favorable domestic financial conditions. In the second quarter, the interbank rate and the benchmark bank reference indicator (IBR) remained in line with the policy rate. Average deposit and lending rates remained historically low, despite some increases observed at the end of June. The local-currency loan portfolio halted its annual deceleration and, between March and June, household credit accelerated, mainly for home purchases. The recovery of commercial lending and of disbursements to that sector was significant, again reaching the high balance observed a year earlier, when firms required substantial liquidity to face the economic effects of the pandemic. Credit risk increased, provisions remain high, and some banks have removed part of their non-performing loans from their balance sheets. Nevertheless, the financial system's profits have recovered, and its liquidity and solvency levels remain above the regulatory minimum.
Starting with this report, a new methodology will be implemented to quantify and communicate the uncertainty surrounding the forecasts of the central macroeconomic scenario in an environment of active monetary policy. This methodology, known as predictive densities (PD), is explained in detail in Box 1. Starting from the balance of risks containing the main factors that, in the technical staff's judgment, could affect the economy over the forecast horizon, the PD methodology produces probability distributions for the forecasts of the main variables (e.g., growth and inflation). These distributions reflect the outcome of the possible shocks (to external variables, prices, and economic activity) that the economy could receive, and their transmission, taking into account the economic structure and the future monetary policy response. In this sense, they quantify the uncertainty around the forecast and its bias. The PD exercise shows a downward bias in economic growth and the output gap, and an upward bias in inflation. The balance of risks indicates that the trade-offs for monetary policy will be potentially more complex than previously contemplated. Regarding external financing conditions, the main risk is that they become somewhat less favorable, in a scenario in which the U.S. Federal Reserve raises its interest rate sooner, prompted by stronger-than-expected economic and employment growth in the United States generating significant pressure on that country's inflation. Added to this is uncertainty about Colombia's fiscal outlook and its effects on the risk premium and the cost of external financing.
For growth, most risks are to the downside, notably the effects of political and fiscal uncertainty on consumption and investment decisions, and the emergence of new waves of Covid-19 contagion and their impact on economic activity. For inflation, the risks incorporated include greater persistence of the shocks associated with value-chain disruptions, higher international raw-material and food prices, and a slower-than-expected recovery of the domestic agricultural chain affected by the recent road blockades. These risks would put upward pressure mainly on food and goods prices. The main downside risk included is a smaller rise in rents than in the central scenario, explained by weak demand and by greater supply in 2022 given the high volume of home sales observed this year. All told, economic growth shows a downward bias and, with 90% confidence, would lie between 6.1% and 9.1% for 2021 and between 0.5% and 4.1% for 2022. The output gap would have a downward bias, mainly in 2022. Inflation is biased upward and would lie between 3.7% and 4.9% in 2021, and between 2.2% and 4.7% in 2022, with 90% probability. 1.2 Monetary Policy Decision. At its June and July meetings, the Board of Directors of Banco de la República (JDBR) decided to hold the monetary policy rate unchanged at 1.75%.
