To view the other types of publications on this topic, follow the link: Resource bounds analysis.

Journal articles on the topic "Resource bounds analysis"


Consult the top 50 journal articles for your research on the topic "Resource bounds analysis".

Next to every work in the list of references there is an "Add to bibliography" option. Use it, and the bibliographic reference for the chosen work will be formatted automatically in the citation style you need (APA, MLA, Harvard, Chicago, Vancouver, etc.).

You can also download the full text of the scholarly publication in PDF format and read an online annotation of the work, provided the relevant parameters are available in the metadata.

Browse journal articles from a wide range of disciplines and compile your bibliography correctly.

1

Kahn, David M., and Jan Hoffmann. "Automatic amortized resource analysis with the Quantum physicist’s method". Proceedings of the ACM on Programming Languages 5, ICFP (August 22, 2021): 1–29. http://dx.doi.org/10.1145/3473581.

Annotation:
We present a novel method for working with the physicist's method of amortized resource analysis, which we call the quantum physicist's method. These principles allow for more precise analyses of resources that are not monotonically consumed, like stack. This method takes its name from its two major features, worldviews and resource tunneling, which behave analogously to quantum superposition and quantum tunneling. We use the quantum physicist's method to extend the Automatic Amortized Resource Analysis (AARA) type system, enabling the derivation of resource bounds based on tree depth. In doing so, we also introduce remainder contexts, which aid bookkeeping in linear type systems. We then evaluate this new type system's performance by bounding stack use of functions in the Set module of OCaml's standard library. Compared to state-of-the-art implementations of AARA, our new system derives tighter bounds with only moderate overhead.
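For orientation only, here is a minimal, hypothetical Python sketch of the classical physicist's method that the quantum variant described above generalizes. It is not code from the paper; the stack example and unit cost model are assumptions chosen purely for illustration.

```python
# Purely illustrative sketch (hypothetical, not from the paper above):
# the classical physicist's method of amortized analysis that AARA automates.
# Each data-structure state gets a potential; the amortized cost of an operation
# is its actual cost plus the change in potential, so the summed amortized costs
# bound the summed actual costs while the potential never drops below its start.

def potential(stack):
    # one unit of stored potential per element prepays for its later pop
    return len(stack)

def run(ops):
    stack, actual_total, amortized_total = [], 0, 0
    for op, arg in ops:
        before = potential(stack)
        if op == "push":
            stack.append(arg)
            actual = 1
        elif op == "multipop":          # pop up to arg elements, cost = pops done
            k = min(arg, len(stack))
            del stack[len(stack) - k:]
            actual = k
        else:
            raise ValueError(op)
        actual_total += actual
        amortized_total += actual + (potential(stack) - before)
    return actual_total, amortized_total

if __name__ == "__main__":
    actual, amortized = run([("push", 1), ("push", 2), ("push", 3), ("multipop", 2)])
    assert actual <= amortized          # potential started at 0 and never went negative
    print(actual, amortized)            # 5 6
```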
2

Navas, Jorge, Mario Méndez-Lojo, and Manuel V. Hermenegildo. "User-Definable Resource Usage Bounds Analysis for Java Bytecode". Electronic Notes in Theoretical Computer Science 253, no. 5 (December 2009): 65–82. http://dx.doi.org/10.1016/j.entcs.2009.11.015.

3

SERRANO, A., P. LOPEZ-GARCIA, and M. V. HERMENEGILDO. "Resource Usage Analysis of Logic Programs via Abstract Interpretation Using Sized Types". Theory and Practice of Logic Programming 14, no. 4-5 (July 2014): 739–54. http://dx.doi.org/10.1017/s147106841400057x.

Annotation:
We present a novel general resource analysis for logic programs based on sized types. Sized types are representations that incorporate structural (shape) information and allow expressing both lower and upper bounds on the size of a set of terms and their subterms at any position and depth. They also allow relating the sizes of terms and subterms occurring at different argument positions in logic predicates. Using these sized types, the resource analysis can infer both lower and upper bounds on the resources used by all the procedures in a program as functions on input term (and subterm) sizes, overcoming limitations of existing resource analyses and enhancing their precision. Our new resource analysis has been developed within the abstract interpretation framework, as an extension of the sized types abstract domain, and has been integrated into the Ciao preprocessor, CiaoPP. The abstract domain operations are integrated with the setting up and solving of recurrence equations for inferring both size and resource usage functions. We show that the analysis is an improvement over the previous resource analysis present in CiaoPP and compares well in power to state of the art systems.
4

ALBERT, ELVIRA, MIQUEL BOFILL, CRISTINA BORRALLERAS, ENRIQUE MARTIN-MARTIN, and ALBERT RUBIO. "Resource Analysis driven by (Conditional) Termination Proofs". Theory and Practice of Logic Programming 19, no. 5-6 (September 2019): 722–39. http://dx.doi.org/10.1017/s1471068419000152.

Annotation:
When programs feature a complex control flow, existing techniques for resource analysis produce cost relation systems (CRS) whose cost functions retain the complex flow of the program and, consequently, might not be solvable into closed-form upper bounds. This paper presents a novel approach to resource analysis that is driven by the result of a termination analysis. The fundamental idea is that the termination proof encapsulates the flows of the program which are relevant for the cost computation so that, by driving the generation of the CRS using the termination proof, we produce a linearly-bounded CRS (LB-CRS). A LB-CRS is composed of cost functions that are guaranteed to be locally bounded by linear ranking functions and thus greatly simplify the process of CRS solving. We have built a new resource analysis tool, named MaxCore, that is guided by the VeryMax termination analyzer and uses CoFloCo and PUBS as CRS solvers. Our experimental results on the set of benchmarks from the Complexity and Termination Competition 2019 for C Integer programs show that MaxCore outperforms all other resource analysis tools.
5

Wu, Zeyang, Kameng Nip, and Qie He. "A New Combinatorial Algorithm for Separable Convex Resource Allocation with Nested Bound Constraints". INFORMS Journal on Computing 33, no. 3 (July 2021): 1197–212. http://dx.doi.org/10.1287/ijoc.2020.1006.

Annotation:
The separable convex resource allocation problem with nested bound constraints aims to allocate B units of resources to n activities to minimize a separable convex cost function, with lower and upper bounds on the total amount of resources that can be consumed by nested subsets of activities. We develop a new combinatorial algorithm to solve this model exactly. Our algorithm is capable of solving instances with millions of activities in several minutes. The running time of our algorithm is at most 73% of the running time of the current best algorithm for benchmark instances with three classes of convex objectives. The efficiency of our algorithm derives from a combination of constraint relaxation and divide and conquer based on infeasibility information. In particular, nested bound constraints are relaxed first; if the solution obtained violates some bound constraints, we show that the problem can be divided into two subproblems of the same structure and smaller sizes according to the bound constraint with the largest violation. Summary of Contribution. The resource allocation problem is a collection of optimization models with a wide range of applications in production planning, logistics, portfolio management, telecommunications, statistical surveys, and machine learning. This paper studies the resource allocation model with prescribed lower and upper bounds on the total amount of resources consumed by nested subsets of activities. These nested bound constraints are motivated by storage limits, time-window requirements, and budget constraints in various applications. The model also appears as a subproblem in models for green logistics and machine learning, and it has to be solved repeatedly. The model belongs to the class of computationally challenging convex mixed-integer nonlinear programs. We develop a combinatorial algorithm to solve this model exactly. Our algorithm is faster than the algorithm that currently has the best theoretical complexity in the literature on an extensive set of test instances. The efficiency of our algorithm derives from the combination of an infeasibility-guided divide-and-conquer framework and a scaling-based greedy subroutine for resource allocation with submodular constraints. This paper also showcases the prevalent mismatch between the theoretical worst-case time complexity of an algorithm and its practical efficiency. We have offered some explanations of this mismatch through the perspectives of worst-case analysis, specially designed instances, and statistical metrics of numerical experiments. The implementation of our algorithm is available on an online repository.
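As a purely illustrative aside, and using the textbook greedy rather than the paper's own divide-and-conquer method, the sketch below shows the base problem behind this model (a single budget B, no nested constraints): because each cost function is convex, repeatedly assigning the next unit to the activity with the smallest marginal cost is optimal.

```python
# Illustrative toy only, not the algorithm of the paper above: the nested-bound
# model builds on classic separable convex resource allocation (allocate B
# integer units to n activities, minimizing sum_i f_i(x_i)).  With convex f_i,
# greedily giving each next unit to the activity whose marginal cost
# f_i(x_i + 1) - f_i(x_i) is smallest solves this base problem exactly.

import heapq

def greedy_allocate(costs, B):
    """costs: list of convex cost functions f_i on non-negative integers."""
    x = [0] * len(costs)
    heap = [(f(1) - f(0), i) for i, f in enumerate(costs)]   # next marginal costs
    heapq.heapify(heap)
    for _ in range(B):
        _, i = heapq.heappop(heap)
        x[i] += 1
        f = costs[i]
        heapq.heappush(heap, (f(x[i] + 1) - f(x[i]), i))
    return x

if __name__ == "__main__":
    # two quadratic activities: the units split so marginal costs stay balanced
    print(greedy_allocate([lambda v: v * v, lambda v: 2 * v * v], 5))   # [3, 2]
```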
6

Stefanov, Stefan M. "Solution of some convex separable resource allocation and production planning problems with bounds on the variables". Journal of Interdisciplinary Mathematics 13, no. 5 (October 2010): 541–69. http://dx.doi.org/10.1080/09720502.2010.10700719.

7

Albert, Elvira, Nikolaos Bezirgiannis, Frank de Boer, and Enrique Martin-Martin. "A Formal, Resource Consumption-Preserving Translation from Actors with Cooperative Scheduling to Haskell*". Fundamenta Informaticae 177, no. 3-4 (December 10, 2020): 203–34. http://dx.doi.org/10.3233/fi-2020-1988.

Annotation:
We present a formal translation of a resource-aware extension of the Abstract Behavioral Specification (ABS) language to the functional language Haskell. ABS is an actor-based language tailored to the modeling of distributed systems. It combines asynchronous method calls with a suspend and resume mode of execution of the method invocations. To cater for the resulting cooperative scheduling of the method invocations of an actor, the translation exploits for the compilation of ABS methods Haskell functions with continuations. The main result of this article is a correctness proof of the translation by means of a simulation relation between a formal semantics of the source language and a high-level operational semantics of the target language, i.e., a subset of Haskell. We further prove that the resource consumption of an ABS program extended with a cost model is preserved over this translation, as we establish an equivalence of the cost of executing the ABS program and its corresponding Haskell-translation. Concretely, the resources consumed by the original ABS program and those consumed by the Haskell program are the same, considering a cost model. Consequently, the resource bounds automatically inferred for ABS programs extended with a cost model, using resource analysis tools, are sound resource bounds also for the translated Haskell programs. Our experimental evaluation confirms the resource preservation over a set of benchmarks featuring different asymptotic costs.
8

Yoon, Man-Ki, Chang-Gun Lee, and Junghee Han. "Migrating from Per-Job Analysis to Per-Resource Analysis for Tighter Bounds of End-to-End Response Times". IEEE Transactions on Computers 59, no. 7 (July 2010): 933–42. http://dx.doi.org/10.1109/tc.2009.174.

9

Sajid, Mohammad, and Zahid Raza. "An Analytical Model for Resource Characterization and Parameter Estimation for DAG-Based Jobs for Homogeneous Systems". International Journal of Distributed Systems and Technologies 6, no. 1 (January 2015): 34–52. http://dx.doi.org/10.4018/ijdst.2015010103.

Annotation:
High Performance Computing (HPC) systems demand and consume a significant amount of resources (e.g. server, storage, electrical energy) resulting in high operational costs, reduced reliability, and sometimes leading to waste of scarce natural resources. On one hand, the most important issue for these systems is achieving high performance, while on the other hand, the rapidly increasing resource costs appeal to effectively predict the resource requirements to ensure efficient services in the most optimized manner. The resource requirement prediction for a job thus becomes important for both the service providers as well as the consumers for ensuring resource management and to negotiate Service Level Agreements (SLAs), respectively, in order to help make better job allocation decisions. Moreover, the resource requirement prediction can even lead to improved scheduling performance while reducing the resource waste. This work presents an analytical model estimating the required resources for the modular job execution. The analysis identifies the number of processors required and the maximum and minimum bounds on the turnaround time and energy consumed. Simulation study reveals that the scheduling algorithms integrated with the proposed analytical model helps in improving the average throughput and the average energy consumption of the system. As the work predicts the resource requirements, it can even play an important role in Service-Oriented Architectures (SOA) like Cloud computing or Grid computing.
10

Picano, Benedetta. "End-to-End Delay Bound for VR Services in 6G Terahertz Networks with Heterogeneous Traffic and Different Scheduling Policies". Mathematics 9, no. 14 (July 12, 2021): 1638. http://dx.doi.org/10.3390/math9141638.

Annotation:
The emerging sixth-generation networks have to provide effective support to a wide plethora of novel disruptive heterogeneous applications. This paper models the probabilistic end-to-end delay bound for the virtual reality services in the presence of heterogeneous traffic flows by resorting to the stochastic network calculus principles and exploiting the martingale envelopes. The paper presents the network performance analysis under the assumption of different scheduling policies, considering both the earliest deadline first and the first-in-first-out queue discipline. Furthermore, differently from previous literature, the probabilistic per-flow bounds have been formulated taking into account a number of traffic flows greater than two, which results in a theoretical analysis that is remarkably more complex than the case in which only two concurrent flows are considered. Finally, the validity of the theoretical bounds have been confirmed by the evident closeness between the analytical predictions and the actual simulation results considering, for the sake of argument, four concurrent traffic flows with heterogeneous quality-of-service constraints. That closeness exhibits the ability of the proposed analysis in fitting the actual behavior of the system, representing a suitable theoretical tool to support resource allocation strategies, without violating service constraints.
11

Bilò, Vittorio, Michele Flammini, Vasco Gallotti, and Cosimo Vinci. "On Multidimensional Congestion Games". Algorithms 13, no. 10 (October 15, 2020): 261. http://dx.doi.org/10.3390/a13100261.

Annotation:
We introduce multidimensional congestion games, that is, congestion games whose set of players is partitioned into d+1 clusters C0,C1,…,Cd. Players in C0 have full information about all the other participants in the game, while players in Ci, for any 1≤i≤d, have full information only about the members of C0∪Ci and are unaware of all the others. This model has at least two interesting applications: (i) it is a special case of graphical congestion games induced by an undirected social knowledge graph with independence number equal to d, and (ii) it represents scenarios in which players have a type and the level of competition they experience on a resource depends on their type and on the types of the other players using it. We focus on the case in which the cost function associated with each resource is affine and bound the price of anarchy and stability as a function of d with respect to two meaningful social cost functions and for both weighted and unweighted players. We also provide refined bounds for the special case of d=2 in presence of unweighted players.
12

PENG, SHAOLIN, GREGORY PARSONS, and ALEXANDER G. DEAN. "RESOURCE-FOCUSED TOOLCHAIN FOR RAPID PROTOTYPING OF EMBEDDED SYSTEMS". Journal of Circuits, Systems and Computers 21, no. 02 (April 2012): 1240003. http://dx.doi.org/10.1142/s0218126612400038.

Annotation:
This paper introduces the RaPTEX toolchain and its use for rapid prototyping and evaluation of embedded communication systems. This toolchain is unique for several reasons. First, by using static code analysis techniques, it is able to predict both the typical case and bounds for resource usage, such as computational, memory (both static and dynamic), and energy requirements. Second, it provides a graphical user interface with configurable software building blocks which allows easy creation and customization of protocol stacks. Third, it targets low-cost, low-energy hardware, allowing the creation of low-cost systems. We demonstrate the RaPTEX toolchain by evaluating different design options for an experimental ultrasonic communication system for biotelemetry in extremely shallow waters. The power, size, mass, and cost constraints of this application make it critical to pack as much processing into the available resources as possible. The RaPTEX toolchain analyzes resource use, enabling the system to safely operate closer to the edge of the resource envelope. The toolchain also helps users with the rapid prototyping of communication protocols by providing users with quick feedback on resource requirements. We demonstrate the use and output of the toolchain. We compare the accuracy of its predictions against measurements of the real hardware.
13

Hatton, Erin. "Work beyond the bounds: a boundary analysis of the fragmentation of work". Work, Employment and Society 29, no. 6 (May 13, 2015): 1007–18. http://dx.doi.org/10.1177/0950017014568141.

14

Hurtado-Lange, Daniela, and Siva Theja Maguluri. "Transform Methods for Heavy-Traffic Analysis". Stochastic Systems 10, no. 4 (December 2020): 275–309. http://dx.doi.org/10.1287/stsy.2019.0056.

Annotation:
The drift method was recently developed to study queuing systems in steady state. It was used successfully to obtain bounds on the moments of the scaled queue lengths that are asymptotically tight in heavy traffic and in a wide variety of systems, including generalized switches, input-queued switches, bandwidth-sharing networks, and so on. In this paper, we develop the use of transform techniques for heavy-traffic analysis, with a special focus on the use of moment-generating functions. This approach simplifies the proofs of the drift method and provides a new perspective on the drift method. We present a general framework and then use the moment-generating function method to obtain the stationary distribution of scaled queue lengths in heavy traffic in queuing systems that satisfy the complete resource pooling condition. In particular, we study load balancing systems and generalized switches under general settings.
15

LOPEZ-GARCIA, P., L. DARMAWAN, M. KLEMEN, U. LIQAT, F. BUENO, and M. V. HERMENEGILDO. "Interval-based resource usage verification by translation into Horn clauses and an application to energy consumption". Theory and Practice of Logic Programming 18, no. 2 (March 2018): 167–223. http://dx.doi.org/10.1017/s1471068418000042.

Annotation:
Many applications require conformance with specifications that constrain the use of resources, such as execution time, energy, bandwidth, etc. We present a configurable framework for static resource usage verification where specifications can include data size-dependent resource usage functions, expressing both lower and upper bounds. Ensuring conformance with respect to such specifications is an undecidable problem. Therefore, to statically check such specifications, our framework infers the same type of resource usage functions, which safely approximate the actual resource usage of the program, and compares them against the specification. We review how this framework supports several languages and compilation output formats by translating them to an intermediate representation based on Horn clauses and using the configurability of the framework to describe the resource semantics of the input language. We provide a detailed formalization and extend the framework so that both resource usage specification and analysis/verification output can include preconditions expressing intervals for the input data sizes for which assertions are intended to hold, proved, or disproved. Most importantly, we also extend the classes of functions that can be checked. We also report on and provide results from an implementation within the Ciao/CiaoPP framework, as well as on a practical tool built by instantiating this framework for the verification of energy consumption specifications for imperative/embedded programs. Finally, we show as an example how embedded software developers can use this tool, in particular, for determining values for program parameters that ensure meeting a given energy budget while minimizing the loss in quality of service.
16

Coester, Christian, Elias Koutsoupias, and Philip Lazos. "The Infinite Server Problem". ACM Transactions on Algorithms 17, no. 3 (August 2021): 1–23. http://dx.doi.org/10.1145/3456632.

Annotation:
We study a variant of the k -server problem, the infinite server problem, in which infinitely many servers reside initially at a particular point of the metric space and serve a sequence of requests. In the framework of competitive analysis, we show a surprisingly tight connection between this problem and the resource augmentation version of the k -server problem, also known as the (h,k) -server problem, in which an online algorithm with k servers competes against an offline algorithm with h servers. Specifically, we show that the infinite server problem has bounded competitive ratio if and only if the (h,k) -server problem has bounded competitive ratio for some k = O ( h ). We give a lower bound of 3.146 for the competitive ratio of the infinite server problem, which holds even for the line and some simple weighted stars. It implies the same lower bound for the (h,k) -server problem on the line, even when k/h → ∞, improving on the previous known bounds of 2 for the line and 2.4 for general metrics. For weighted trees and layered graphs, we obtain upper bounds, although they depend on the depth. Of particular interest is the infinite server problem on the line, which we show to be equivalent to the seemingly easier case in which all requests are in a fixed bounded interval. This is a special case of a more general reduction from arbitrary metric spaces to bounded subspaces. Unfortunately, classical approaches (double coverage and generalizations, work function algorithm, balancing algorithms) fail even for this special case.
17

Dowdell, Benjamin L., J. Tim Kwiatkowski, and Kurt J. Marfurt. "Seismic characterization of a Mississippi Lime resource play in Osage County, Oklahoma, USA". Interpretation 1, no. 2 (November 1, 2013): SB97–SB108. http://dx.doi.org/10.1190/int-2013-0026.1.

Annotation:
With the advent of horizontal drilling and hydraulic fracturing in the Midcontinent, USA, fields once thought to be exhausted are now experiencing renewed exploitation. However, traditional Midcontinent seismic analysis techniques no longer provide satisfactory reservoir characterization for these unconventional plays; new seismic analysis methods are needed to properly characterize these radically innovative play concepts. Time processing and filtering is applied to a raw 3D seismic data set from Osage County, Oklahoma, paying careful attention to velocity analysis, residual statics, and coherent noise filtering. The use of a robust prestack structure-oriented filter and spectral whitening greatly enhances the results. After prestack time migrating the data using a Kirchhoff algorithm, new velocities are picked. A final normal moveout correction is applied using the new velocities, followed by a final prestack structure-oriented filter and spectral whitening. Simultaneous prestack inversion uses the reprocessed and time-migrated seismic data as input, along with a well from within the bounds of the survey. With offsets out to 3048 m and a target depth of approximately 880 m, we can invert for density in addition to P- and S-impedance. Prestack inversion attributes are sensitive to lithology and porosity while surface seismic attributes such as coherence and curvature are sensitive to lateral changes in waveform and structure. We use these attributes in conjunction with interpreted horizontal image logs to identify zones of high porosity and high fracture density.
18

Mazzella, Matthew J., Dana Boyd Barr, Kurunthachalam Kannan, Chitra Amarasiriwardena, Syam S. Andra, and Chris Gennings. "Evaluating inter-study variability in phthalate and trace element analyses within the Children’s Health Exposure Analysis Resource (CHEAR) using multivariate control charts". Journal of Exposure Science & Environmental Epidemiology 31, no. 2 (February 18, 2021): 318–27. http://dx.doi.org/10.1038/s41370-021-00293-w.

Annotation:
Abstract Background The Children’s Health Exposure Analysis Resource (CHEAR) program allows researchers to expand their research goals by offering the assessment of environmental exposures in their previously collected biospecimens. Samples are analyzed in one of CHEAR’s network of six laboratory hubs with the ability to assess a wide array of environmental chemicals. The ability to assess inter-study variability is important for researchers who want to combine datasets across studies and laboratories. Objective Herein we establish a process of evaluating inter-study variability for a given analytic method. Methods Common quality control (QC) pools at two concentration levels (A and B) in urine were created within CHEAR for insertion into each batch of samples tested at a rate of three samples of each pool per 100 study samples. We assessed these QC pool results for seven phthalates analyzed for five CHEAR studies by three different lab hubs utilizing multivariate control charts to identify out-of-control runs or sets of samples associated with a given QC sample. We then tested the conditions that would lead to an out-of-control run by simulating outliers in an otherwise “in-control” set of 12 trace elements in blood QC samples (NIST SRM 955c). Results When phthalates were assessed within study, we identified a single out-of-control run for two of the five studies. Combining QC results across lab hubs, all of the runs from these two studies were now in-control, while multiple runs from two other studies were pushed out-of-control. In our simulation study we found that 3–6 analytes with outlier values (5xSD) within a run would push that run out of control in 65–83% of simulations, respectively. Significance We show how acceptable bounds of variability can be established for a given analytic method by evaluating QC materials across studies using multivariate control charts.
19

Yang, Xiuli, Yanhong Huang, Jianqi Shi, and Zongyu Cao. "A Performance Analysis Framework of Time-Triggered Ethernet Using Real-Time Calculus". Electronics 9, no. 7 (July 3, 2020): 1090. http://dx.doi.org/10.3390/electronics9071090.

Annotation:
With increasing demands of deterministic and real-time communication, network performance analysis is becoming an increasingly important research topic in safety-critical areas, such as aerospace, automotive electronics and so on. Time-triggered Ethernet (TTEthernet) is a novel hybrid network protocol based on the Ethernet standard; it is deterministic, synchronized and congestion-free. TTEthernet with a time-triggered mechanism meets the real-time and reliability requirements of safety-critical applications. Time-triggered (TT) messages perform strict periodic scheduling following the offline schedule tables. Different scheduling strategies have an effect on the performance of TTEthernet. In this paper, a performance analysis framework is designed to analyze the end-to-end delay, backlog bounds and resource utilization of network by real-time calculus. This method can be used as a base for the performance evaluation of TTEthernet scheduling. In addition, this study discusses the impacts of clock synchronization and traffic integration strategies on TT traffic in the network. Finally, a case study is presented to prove the feasibility of the performance analysis framework.
20

Hurtado-Lange, Daniela, and Siva Theja Maguluri. "Heavy-traffic Analysis of the Generalized Switch under Multidimensional State Space Collapse". ACM SIGMETRICS Performance Evaluation Review 48, no. 3 (March 5, 2021): 33–34. http://dx.doi.org/10.1145/3453953.3453959.

Annotation:
Stochastic Processing Networks that model wired and wireless networks, and other queueing systems, have been studied in the heavy-traffic limit under the so-called Complete Resource Pooling (CRP) condition. When the CRP condition is not satisfied, heavy-traffic results are known only in the special case of an input-queued switch and bandwidth-sharing network. In this paper, we consider a very general queueing system called the 'generalized switch' that includes wireless networks under fading, data center networks, input-queued switch, etc. The primary contribution of this paper is to present the exact value of the steady-state mean of certain linear combinations of queue lengths in the heavy-traffic limit under MaxWeight scheduling algorithm. We use the Drift method, and we also present a negative result that it is not possible to obtain the remaining linear combinations (and consequently all the individual mean queue lengths) using this method. We do this by presenting an alternate view of the Drift method in terms of an (under-determined) system of linear equations. Finally, we use this system of equations to obtain upper and lower bounds on all linear combinations of queue lengths.
21

Lee, Jong Wook, Sun Dong Chegal, and Seung Oh Lee. "A Review of Tank Model and Its Applicability to Various Korean Catchment Conditions". Water 12, no. 12 (December 21, 2020): 3588. http://dx.doi.org/10.3390/w12123588.

Annotation:
This paper reviews a conceptual rainfall-runoff model called Tank which has been widely used over the last 20 years in Korea as a part of a water resource modelling framework for assessing and developing long-term water resource polices. In order to examine the uncertainty of model predictions and the sensitivity of model’s parameters, Monte Carlos and Markov chain-based approaches are applied to five catchments of various Korean geographical and climatic conditions where the catchment sizes are ranged from 83 to 4786 km2. In addition, three optimization algorithms—dynamically dimensioned search (DDS), robust parameter estimation (ROPE), and shuffled complex evolution (SCE)—are selected to test whether the model parameters can be optimized consistently within a narrower range than the uncertainty bounds. From the uncertainty analysis, it is found that there is limited success in refining the priori distributions of the model parameters, indicating there is a high degree of equifinality for some parameters or at least there are large numbers of parameter combinations leading to good solutions within model’s uncertainty bounds. Out of the three optimization algorithms, SCE meets the criteria of the consistency best. It is also found that there are still some parameters that even the SCE method struggles to refine the priori distributions. It means that their contribution to model results is minimal and can take a value within a reasonable range. It suggests that the model may be reconceptualized to be parsimonious and to rationalize some parameters without affecting model’s capacity to replicate historical flow characteristics. Cross-validation indicates that sensitive parameters to catchment characteristics can be transferred when geophysical similarity exists between two catchments. Regionalization can be further improved by using a regression or geophysical similarity-based approach to transfer model parameters to ungauged catchments. It may be beneficial to categorize the model parameters depending on the level of their sensitivities, and a different approach to each category may be applied to regionalize the calibrated parameters.
22

Amjad, Maliha, Omer Chughtai, Muhammad Naeem, and Waleed Ejaz. "SWIPT-Assisted Energy Efficiency Optimization in 5G/B5G Cooperative IoT Network". Energies 14, no. 9 (April 27, 2021): 2515. http://dx.doi.org/10.3390/en14092515.

Annotation:
Resource use in point-to-point and point-to-multipoint communication emerges with the tremendous growth in wireless communication technologies. One of the technologies is wireless power transfer which may be used to provide sufficient resources for energy-constrained networks. With the implication of cooperative communication in 5G/B5G and the Internet of Things (IoT), simultaneous wireless information and power transfer (SWIPT)-assisted energy efficiency and appropriate resource use become challenging tasks. In this paper, multiple IoT-enabled devices are deployed to cooperate with the source node through intermediate/relay nodes powered by radio-frequency (RF) energy. The relay forwards the desired information generated by the source node to the IoT devices with the fusion of decode/amplify processes and charges itself at the same time through energy harvesting technology. In this regard, a problem with throughput, energy efficiency, and joint throughput with user admission maximization is formulated while assuring the useful, practical network constraints, which contemplate the upper/lower bounds of power transmitted by the source node, channel condition, and energy harvesting. The formulated problem is a mixed-integer non-linear problem (MINLP). To solve the formulated problem, the rate of individual IoT-enabled devices (b/s), number of selected IoT devices, and the sum-rate maximization are prosecuted for no-cooperation, cooperation with diversity, and cooperation without diversity. Moreover, a comparison of the outer approximation algorithm (OAA) and mesh adaptive direct search algorithm (MADS) for non-linear optimization with the exhaustive search algorithm is provided. The results with reference to the complexity of the algorithms have also been evaluated which show that 4.68×10−10 OAA and 7.81×10−11 MADS as a percent of ESA, respectively. Numerous simulations are carried out to exhibit the usefulness of the analysis to achieve the convergence to ε-optimal solution.
23

Dey, Sima Rani, and Mohammad Tareque. "External debt and growth: role of stable macroeconomic policies". Journal of Economics, Finance and Administrative Science 25, no. 50 (July 22, 2020): 185–204. http://dx.doi.org/10.1108/jefas-05-2019-0069.

Annotation:
Purpose This study aims to examine the impact of external debt on economic growth in Bangladesh within a broader macroeconomic scenario. Design/methodology/approach In the process of doing so, it assesses the empirical cointegration, long-run and short-run dynamics of the concerned variables for the period of 1980–2017 applying the autoregressive distributed lag (ARDL) bounds testing approach to cointegration. First, debt-gross domestic product linkage explores the impact of external debt impact on economic growth using a set of macro and country risk variables, and then this linkage is also analyzed along with a newly formed macroeconomic policy (MEP) variable using principal component analysis. Findings The study results reveal the negative impact of external debt on GDP growth, but the larger positive impact of MEP index indicates that this adverse effect of debt can be mitigated or even nullified by sound MEP and appropriate human resource policy. Originality/value The dynamic effects of different shocks (external debt and macro policy variable) on economic growth by vector autoregression impulse response function also confirm our ARDL findings.
24

Lucarelli, Giorgio, Benjamin Moseley, Nguyen Kim Thang, Abhinav Srivastav, and Denis Trystram. "Online Non-preemptive Scheduling on Unrelated Machines with Rejections". ACM Transactions on Parallel Computing 8, no. 2 (June 30, 2021): 1–22. http://dx.doi.org/10.1145/3460880.

Annotation:
When a computer system schedules jobs there is typically a significant cost associated with preempting a job during execution. This cost can be incurred from the expensive task of saving the memory’s state or from loading data into and out of memory. Thus, it is desirable to schedule jobs non-preemptively to avoid the costs of preemption. There is a need for non-preemptive system schedulers for desktops, servers, and data centers. Despite this need, there is a gap between theory and practice. Indeed, few non-preemptive online schedulers are known to have strong theoretical guarantees. This gap is likely due to strong lower bounds on any online algorithm for popular objectives. Indeed, typical worst-case analysis approaches, and even resource-augmented approaches such as speed augmentation, result in all algorithms having poor performance guarantees. This article considers online non-preemptive scheduling problems in the worst-case rejection model where the algorithm is allowed to reject a small fraction of jobs. By rejecting only a few jobs, this article shows that the strong lower bounds can be circumvented. This approach can be used to discover algorithmic scheduling policies with desirable worst-case guarantees. Specifically, the article presents algorithms for the following three objectives: minimizing the total flow-time, minimizing the total weighted flow-time plus energy where energy is a convex function, and minimizing the total energy under the deadline constraints. The algorithms for the first two problems have a small constant competitive ratio while rejecting only a constant fraction of jobs. For the last problem, we present a constant competitive ratio without rejection. Beyond specific results, the article asserts that alternative models beyond speed augmentation should be explored to aid in the discovery of good schedulers in the face of the requirement of being online and non-preemptive.
25

Kleiber, Pierre, Michael G. Hinton, and Yuji Uozumi. "Stock assessment of blue marlin (Makaira nigricans) in the Pacific using MULTIFAN-CL". Marine and Freshwater Research 54, no. 4 (2003): 349. http://dx.doi.org/10.1071/mf01246.

Annotation:
In the Pacific, blue marlin are an incidental catch of longline fisheries and an important resource for big game recreational fishing. Over the past two decades, blue marlin assessments by different techniques have yielded results ranging from an indication of declining stock to a state of sustained yield at approximately the maximum average level. Longline fishing practices have changed over the years since the 1950s in response to changes in principal target species and to gear developments. Despite increasingly sophisticated attempts to standardize fishing effort with changing fishing practices, the stock assessments to date are likely confounded to a greater or lesser degree by changes in catchability for blue marlin. Yet, only data from commercial longline fisheries targeting tuna provide sufficient spatial and temporal coverage to allow assessment of this resource. To re-assess the blue marlin stocks in the Pacific and also to assess the efficacy of a habitat-based standardization of longline effort, a collaborative analysis was conducted involving scientists at the National Research Institute of Far Seas Fisheries, Shimizu, Japan, the Inter-American Tropical Tuna Commission, La Jolla, California, and the NOAA Fisheries Honolulu Laboratory, Honolulu, Hawaii. Using MULTIFAN-CL as an assessment tool, there was considerable uncertainty in quantifying the fishing effort levels that would produce a maximum sustainable yield. However, it was found that, at worst, blue marlin in the Pacific are close to a fully exploited state, that is the population and the fishery are somewhere near the top of the yield curve. Furthermore, it was found that effort standardization using a habitat-based model allowed estimation of parameters within reasonable bounds and with reduced confidence intervals about those values.
26

Ali, Haris. "Reciprocity or negotiation in the psychological contract: a power perspective". Employee Relations: The International Journal 43, no. 5 (April 9, 2021): 1250–67. http://dx.doi.org/10.1108/er-09-2019-0367.

Annotation:
Purpose The psychological contract literature is generally based on the assumption of reciprocity between employee and employer. The emphasis on reciprocity, however, largely downplays the implications of power dynamics in the employment relationship. In order to bridge this gap, the current research investigates psychological contract from the lens of power particularly focusing on reciprocity. Design/methodology/approach In total, 43 semi-structured interviews are carried out with 37 employees and six managers of three call center companies in Pakistan. The technique of template analysis is used for data analysis. Findings In contrast to the assumption of reciprocity, the research findings highlight employees' perceived inability to reciprocate the employer's inducements on parity basis, because of their view of power asymmetry in the employment relationship. The results further suggest the high tendency among employees to attribute employer reciprocity largely to their managers. The findings also point toward divergence in the reciprocity perceptions of employees and managers in relation with the employers. Research limitations/implications The emphasis on call centers bounds the generality of results. Future research is needed to further explore the impact of power asymmetry on reciprocity in organizations of other industries. With significant implications for the employment relations, negotiated contracts consider the exchange between employee and employer as an obligation rather than a voluntary act of kindness, as emphasized in reciprocity. Originality/value This research contributes to knowledge by emphasizing the significance of negotiation rather than reciprocation in the psychological contract. The negotiation approach efficiently recognizes the implications of power asymmetry that remain generally under-researched in the psychological contract literature.
27

Hoffmann, Jan, Ankush Das, and Shu-Chun Weng. "Towards automatic resource bound analysis for OCaml". ACM SIGPLAN Notices 52, no. 1 (May 11, 2017): 359–73. http://dx.doi.org/10.1145/3093333.3009842.

28

Albinsson, Gunilla, and Kerstin Arnesson. "The managerial position in a Swedish municipal organization: Possibilities and limitations". Economic and Industrial Democracy 39, no. 3 (November 4, 2016): 500–535. http://dx.doi.org/10.1177/0143831x16639656.

Annotation:
The purpose of this article is to explore how a group of managers construct their reality, more specifically what it means to work as a manager in a municipal organization. The empirical data for the study were obtained from a Swedish medium-sized municipality and the study takes as its research approach grounded theory, as developed by Glaser and Strauss. Consequently, the empirical data formed the basis for the research, which takes a multi-methodical and theory-generating approach. The methods used in the study include the use of a questionnaire study, interviews in focus groups, observations, reflective work diaries, and the creation of feedback sessions. The result shows that the managers work in an organization where conflicting and competing value systems act together. These can be interpreted as environmental factors and external bounds on a structural societal level, which cannot be influenced. A point of analysis is that these factors and external bounds to a high degree permeate the manager’s workday and can therefore be seen as a plausible explanation for the boundless nature of the managerial task. For most of the managers of the study, this was expressed as uncertainty as to how to define and interpret goals and as to what the managerial role includes with regard to areas of responsibility. It is interesting to ask, however, whether these conditions are not characteristic of the role of managers and work life in general. The results also show that the substantive theory of the study was not judged to be valid for the municipal companies. These managers do not express as ambivalent an approach to competing value systems as the managers in other sections of the municipality do. Nor do they appear to question their professional knowledge, the work content or managership. Another empirical important finding is that the managers believe that the organizational conditions limit ability to carry out the manager task, but that, despite this, they indicate, paradoxically, that they like their work and the social work environment.
29

Westerberg, I. K., L. Gong, K. J. Beven, J. Seibert, A. Semedo, C. Y. Xu, and S. Halldin. "Regional water balance modelling using flow-duration curves with observational uncertainties". Hydrology and Earth System Sciences 18, no. 8 (August 14, 2014): 2993–3013. http://dx.doi.org/10.5194/hess-18-2993-2014.

Annotation:
Abstract. Robust and reliable water-resource mapping in ungauged basins requires estimation of the uncertainties in the hydrologic model, the regionalisation method, and the observational data. In this study we investigated the use of regionalised flow-duration curves (FDCs) for constraining model predictive uncertainty, while accounting for all these uncertainty sources. A water balance model was applied to 36 basins in Central America using regionally and globally available precipitation, climate and discharge data that were screened for inconsistencies. A rating-curve analysis for 35 Honduran discharge stations was used to estimate discharge uncertainty for the region, and the consistency of the model forcing and evaluation data was analysed using two different screening methods. FDCs with uncertainty bounds were calculated for each basin, accounting for both discharge uncertainty and, in many cases, uncertainty stemming from the use of short time series, potentially not representative for the modelling period. These uncertain FDCs were then used to regionalise a FDC for each basin, treating it as ungauged in a cross-evaluation, and this regionalised FDC was used to constrain the uncertainty in the model predictions for the basin. There was a clear relationship between the performance of the local model calibration and the degree of data set consistency – with many basins with inconsistent data lacking behavioural simulations (i.e. simulations within predefined limits around the observed FDC) and the basins with the highest data set consistency also having the highest simulation reliability. For the basins where the regionalisation of the FDCs worked best, the uncertainty bounds for the regionalised simulations were only slightly wider than those for a local model calibration. The predicted uncertainty was greater for basins where the result of the FDC regionalisation was more uncertain, but the regionalised simulations still had a high reliability compared to the locally calibrated simulations and often encompassed them. The regionalised FDCs were found to be useful on their own as a basic signature constraint; however, additional regionalised signatures could further constrain the uncertainty in the predictions and may increase the robustness to severe data inconsistencies, which are difficult to detect for ungauged basins.
30

Giesl, Jürgen, and Jan Hoffmann. "Preface: Special Issue on Automatic Resource Bound Analysis". Journal of Automated Reasoning 59, no. 1 (December 5, 2016): 1–2. http://dx.doi.org/10.1007/s10817-016-9399-8.

31

Kien Quang, Huynh, Mai Quoc Gia, Nguyen Hoang An, Vo Thi Thanh Ha, and Tran Van Hieu. "Cloning, expression, and purification of the M cell targeting peptide CPE16 derived from C-terminus of Clostridium perfringens enterotoxin and the binding evaluation with Claudin-r4". Science and Technology Development Journal - Natural Sciences 3, no. 1 (April 26, 2019): 38–45. http://dx.doi.org/10.32508/stdjns.v3i1.723.

Annotation:
Developing the oral vaccine that stimulates the mucosal immune system in order to prevent the gastro-intestinal infection is an indispensable demand nowadays. Targeting the M cells, which is a sampling antigen cell, is a highly efficient solution to prevent the dispersion of antigens. Many researches demonstrate that C-terminus Clostridium perfringens enterotoxin bounds to the Claudin- 4 receptor on the M cell surface. By using bioinformatics methods, the peptide CPE16 (16 amino acid of C-terminus of Clostridium perfringens enterotoxin) was predicted to have a high affinity to Claudin-4 receptor on M cells. In this present study, CPE16-GFP was produced as a resource to assess the binding ability to M cells. Recombinant plasmid pET22b-cpe16-gfp was constructed through cloning cpe16-gfp gene into pET22b by two restriction enzymes, NdeI and XhoI, respectively. The recombinant plasmid was transformed into E. coli BL21 (DE3) strain. The expression of protein CPE16-GFP was induced by 0.5 mM IPTG and confirmed by SDS-PAGE analysis and Western blot probed with anti-6xHis antibody. CPE16-GFP protein was expressed in soluble form. CPE16- GFP was purified by using immobilized-metal affinity chromatography with the purity up to 94.14 percent. Finally, CPE16 was tested for the binding ability to recombinant GST-claudin-R4 with the use of silicon nanowire (SiNW-FET). The result showed that CPE16 interacted with GST-claudin-R4 presented by the change of the current through nanowire, compared to its counterpart control GST.
32

Legault, Kelly Rankin, Tanya Beck, and Jason Engle. "INFLUENCE OF INLET / SHOAL COMPLEX ON ADJACENT SHORELINES VIA INLET SINK METHOD". Coastal Engineering Proceedings 1, no. 33 (October 25, 2012): 76. http://dx.doi.org/10.9753/icce.v33.sediment.76.

Annotation:
The region of influence of the inlet on the adjacent shoreline was determined via examination of the inlet’s net sink effect. The net sink effect, or volumetric impact, was computed by adding the volume (or rate) of net sand accumulation within the inlet's channels and shoals with the cumulative volumetric losses on adjacent shorelines to conserve sediment mass after accounting for the volumes either added to adjacent beaches or removed from the ebb shoal by means of nourishment and sediment mining. Volume change of the beaches and ebb shoal complex was computed within a geospatial framework consisting of Regional Mapping and Analysis Program (RMAP), ArcGIS and the Surface-water Modeling System (SMS). Inlet-adjacent cumulative volume changes were then examined to discern the minimum distance away from the inlet along which this volumetric impact was manifest. The alongshore influence of the inlet as determined by the inlet sink method for the 1999-2010 time period was found to be 7.4 miles to the north and 5.5 miles to the south. The inlet sink effect for St. Augustine Inlet is 278,000 cu yd/year, balanced by 99,000 cu yd/yr of erosion from the north beaches and 179,000 cu yd/yr of erosion from the south beaches. If managed properly, the inlet could serve as a valuable, long-term resource for the beaches of St. Johns County within the bounds of its sink effect.
33

Soltani, Ali, Akbar Alam Tabriz, Masoud Sanei, and Ismaeil Najafi Trojeni. "Evaluation of the suggestions system performance using robust DEA model: The case of National Iranian Gas Company". International Journal of Engineering Business Management 9 (January 1, 2017): 184797901769324. http://dx.doi.org/10.1177/1847979017693244.

Annotation:
The suggestions system is a part of total quality management to create individual and group spirit of partnership between staff and increase efficiency in the organization. Also, diagnosis and improvement process is one of the steps of the chain in the processes of suggestions system. In this study, an approach has been proposed to evaluate efficiency of organizations in performing suggestions system with these aims: (1) Reviewing all the elements in the successful implementation of the suggestions system and (2) providing an effective scientific approach to evaluate the organizations on implementing this system considering the uncertainty in the data. Methodology used in this study included the following techniques: (1) Factor analyzing to clarify the internal correlation between significant criteria and detect the major criteria and (2) using robust data envelopment analysis (RDEA) model to evaluate efficiency of organizations in performing suggestions system. The method is based on 3 inputs and 17 outputs in which some outputs are uncertain scores in form of intervals with uncertain bounds. This model has been solved for different Γs, and a value of weights and rankings for each Decision Making Unit (DMU) has been saved by using the obtained values. In the following a simulation has been used to compute the conformity of the rankings from the RDEA model with reality. Doing so shows that the maximum conformity occurs Γ = 6. Therefore, we can conclude that specific values of Γ can maximize conformity and thus more authentic final rankings for the DMUs in this interval of Γ may be expected.
34

WOODS, JOHN. "COGNITIVE ECONOMICS AND THE LOGIC OF ABDUCTION". Review of Symbolic Logic 5, no. 1 (January 25, 2012): 148–61. http://dx.doi.org/10.1017/s175502031100027x.

Annotation:
An agent-centered, goal-directed, resource-bound logic of human reasoning would do well to note that individual cognitive agency is typified by the comparative scantness of available cognitive resources—information, time, and computational capacity, to name just three. This motivates individual agents to set their cognitive agendas proportionately, that is, in ways that carry some prospect of success with the resources on which they are able to draw. It also puts a premium on cognitive strategies which make economical use of those resources. These latter I call scant-resource adjustment strategies, and they supply the context for an analysis of abduction. The analysis is Peircian in tone, especially in the emphasis it places on abduction’s ignorance-preserving character. My principal purpose here is to tie abduction’s scarce-resource adjustment capacity to its ignorance preservation.
35

Shirzadeh Chaleshtarti, A., S. Shadrokh, and Y. Fathi. "Branch and Bound Algorithms for Resource Constrained Project Scheduling Problem Subject to Nonrenewable Resources with Prescheduled Procurement". Mathematical Problems in Engineering 2014 (2014): 1–15. http://dx.doi.org/10.1155/2014/634649.

Der volle Inhalt der Quelle
Annotation:
Many real-life projects are subject to nonrenewable resources that do not exactly match the type defined in the project scheduling literature. The difference stems from the fact that, in those projects, contrary to the common assumption in the literature, nonrenewable resources are not available in full at the beginning of the project but are procured along the project horizon. In this paper, we study this different type of nonrenewable resource. To that end, we extend the resource constrained project scheduling problem (RCPSP) with this resource type (RCPSP-NR) and customize four basic branch and bound algorithms of the RCPSP for it, namely the precedence tree, extension alternatives, minimal delaying alternatives, and minimal forbidden sets. Several bounding and fathoming rules are introduced into the algorithms to shorten the enumeration process. We perform a comprehensive experimental analysis using the four customized algorithms and also the CPLEX solver.
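As an illustration of the enumeration idea, the sketch below is a much-simplified precedence-tree style branch and bound on a toy instance with a single nonrenewable resource procured over time. All data are hypothetical, renewable resources are omitted, and the paper's delaying-alternative and forbidden-set schemes are not reproduced.

```python
# Minimal depth-first branch-and-bound sketch for a toy project with a single
# nonrenewable resource procured over time (all data below is hypothetical).
durations   = {1: 3, 2: 2, 3: 4, 4: 2}            # activity durations
consumption = {1: 2, 2: 3, 3: 1, 4: 2}            # nonrenewable units consumed at start
preds       = {1: [], 2: [1], 3: [1], 4: [2, 3]}  # precedence relations
procurement = [(0, 4), (3, 2), (5, 3)]            # (time, units delivered)

def available(t):
    """Cumulative nonrenewable units delivered up to time t."""
    return sum(q for (tau, q) in procurement if tau <= t)

best = {"makespan": float("inf")}

def branch(start, used):
    # Bound: prune partial schedules that cannot beat the incumbent makespan.
    partial_makespan = max((start[j] + durations[j] for j in start), default=0)
    if partial_makespan >= best["makespan"]:
        return
    if len(start) == len(durations):
        best["makespan"] = partial_makespan
        best["schedule"] = dict(start)
        return
    # Branch: schedule any eligible activity at its earliest feasible start.
    for j in durations:
        if j in start or any(p not in start for p in preds[j]):
            continue
        t = max((start[p] + durations[p] for p in preds[j]), default=0)
        # Conservative feasibility check: everything consumed so far (plus j) must be
        # covered by the units procured up to the candidate start time t.
        while used + consumption[j] > available(t):
            t += 1
        branch({**start, j: t}, used + consumption[j])

branch({}, 0)
print(best)
```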
APA, Harvard, Vancouver, ISO und andere Zitierweisen
36

Amusa, Kafayat, und Mutiu Abimbola Oyinlola. „The effectiveness of government expenditure on economic growth in Botswana“. African Journal of Economic and Management Studies 10, Nr. 3 (02.09.2019): 368–84. http://dx.doi.org/10.1108/ajems-03-2018-0081.

Der volle Inhalt der Quelle
Annotation:
Purpose The purpose of this paper is to examine the relationship between government expenditure and economic growth in Botswana over the period 1985‒2016. The study employed the auto-regressive distributed lag (ARDL) bounds testing approach in investigating the nexus. The study makes the argument that the effectiveness of public spending should be assessed not only against the amount of the expenditure but also by the type of the expenditure. The empirical findings showed that aggregate expenditure has a negative short-run and positive long-run effect on economic growth. When expenditure is disaggregated, both forms of expenditures have a positive short-run effect on economic growth, whereas only a long-run positive impact of recurrent expenditure is observed. The study suggests the need to prioritize scarce resources in productive recurrent and development spending that enables increased productivity. Design/methodology/approach This study examined the effectiveness of government spending in Botswana, within an ARDL framework from 1985 to 2016. To achieve this, the analysis is carried out on both an aggregate and disaggregated level. Government spending is divided into recurrent and development expenditures. Findings This study examined the effectiveness of government spending in Botswana, within an ARDL framework from 1985 to 2016. To achieve this, the analysis hinged on both the aggregate and disaggregated levels. The results of the aggregate analysis suggest that total public expenditure has a negative impact on economic growth in the short run; however, its impact becomes positive over the long run. On disaggregating government spending, the results show that both recurrent and development expenditures have a significant positive short-run impact on growth; however, in the long run, the significant positive impact is only observed for recurrent expenditure. Practical implications The results provide evidence of the diverse effects of government expenditure in the country. In the period under investigation, 73 percent of total government expenditure in Botswana was recurrent in nature, whereas 23 percent was related to development. From the results, it can be observed that although the recurrent expenditure has contributed to increased growth and must be encouraged, it is also pertinent for the Botswana Government to endeavor to place more emphasis on productive development expenditure in order to enhance short- and long-term growth. Further, there is a need to strengthen the growth-enhancing structures and to prioritize the scarce economic resources toward productive spending and ensuring continued proper governance over such expenditures. Originality/value The study provides empirical evidence on the effectiveness of government spending in a small open, resource-reliant middle-income SSA economy and argues that the effectiveness of public spending must be assessed not only against the amount of the expenditure but also on the type or composition of the expenditure. The study contributes to the scant empirical literature on Botswana by employing the ARDL approach to cointegration technique in estimating the long- and short-run impact of government expenditure on economic growth between 1985 and 2016.
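For readers unfamiliar with the method, the ARDL bounds test boils down to an F-test on the lagged level terms of an unrestricted error-correction regression. The sketch below runs that regression on synthetic series with plain numpy; the critical lower and upper bounds are tabulated by Pesaran et al. and are not computed here.

```python
import numpy as np

# Bounds-test sketch on hypothetical series y (growth proxy) and x (expenditure proxy).
rng = np.random.default_rng(0)
T = 60
x = np.cumsum(rng.normal(size=T))
y = 0.5 * x + np.cumsum(rng.normal(scale=0.5, size=T))

dy, dx = np.diff(y), np.diff(x)
# Unrestricted error-correction regression:  dy_t = a + b1*y_{t-1} + b2*x_{t-1} + c*dx_t + e_t
Z_u = np.column_stack([np.ones(T - 1), y[:-1], x[:-1], dx])
Z_r = np.column_stack([np.ones(T - 1), dx])       # lagged levels restricted to zero

def ssr(Z, target):
    beta, *_ = np.linalg.lstsq(Z, target, rcond=None)
    resid = target - Z @ beta
    return resid @ resid

ssr_u, ssr_r = ssr(Z_u, dy), ssr(Z_r, dy)
q = 2                                             # number of restrictions (two lagged levels)
k = Z_u.shape[1]
F = ((ssr_r - ssr_u) / q) / (ssr_u / (len(dy) - k))
print(f"bounds-test F statistic: {F:.2f}")
# Compare F against the tabulated Pesaran et al. lower/upper critical bounds.
```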
APA, Harvard, Vancouver, ISO und andere Zitierweisen
37

Mehmood, Asif, Shaofeng Jia, Rashid Mahmood, Jiabao Yan und Moien Ahsan. „Non-Stationary Bayesian Modeling of Annual Maximum Floods in a Changing Environment and Implications for Flood Management in the Kabul River Basin, Pakistan“. Water 11, Nr. 6 (14.06.2019): 1246. http://dx.doi.org/10.3390/w11061246.

Der volle Inhalt der Quelle
Annotation:
Recent evidence of regional climate change associated with the intensification of human activities has led hydrologists to study flood regimes in a non-stationary context. This study utilized a Bayesian framework with informed priors on the shape parameter of a generalized extreme value (GEV) model to estimate design flood quantiles for "at site analysis" in a changing environment, and discussed its implications for flood management in the Kabul River basin (KRB), Pakistan. Initially, 29 study sites in the KRB were used to evaluate the annual maximum flood regime by applying the Mann–Kendall test. Stationary (without trend) and non-stationary (with trend) Bayesian models for flood frequency estimation were used, and their results were compared using the corresponding flood frequency curves (FFCs), along with their uncertainty bounds. The trend analysis revealed significant positive trends for 27.6% of the gauges and significant negative trends for 10% at the significance level of 0.05. In addition, 6.9% of the gauges showed significant positive trends at the significance level of 0.1, while the remaining stations displayed insignificant trends. The non-stationary Bayesian model was found to be reliable for study sites possessing a statistically significant trend at the significance level of 0.05, while the stationary Bayesian model overestimated or underestimated the flood hazard for these sites. It is therefore vital to consider non-stationarity for sustainable flood management under a changing environment in the KRB, which has a rich history of flooding. Furthermore, this study reports a regional shape parameter value of 0.26 for the KRB, which can be used as an informed prior on the shape parameter if the study site under consideration has the "flash" flood type. The synchronized appearance of significant increasing and decreasing trends at gauge stations very close to one another is worth attention. The present study, which considers non-stationarity in the flood regime, will provide a reference for hydrologists, water resource managers, planners, and decision makers.
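As a conceptual illustration (not the paper's Bayesian estimation), a non-stationary GEV can be fitted by letting the location parameter drift linearly with time and maximizing the likelihood. The sketch below does this on synthetic annual maxima with scipy; note that scipy's shape convention c corresponds to -ξ in the usual hydrological notation.

```python
import numpy as np
from scipy.stats import genextreme
from scipy.optimize import minimize

# Non-stationary GEV sketch: location drifts linearly with time, mu(t) = mu0 + mu1 * t.
# Synthetic annual-maximum series; the paper itself uses a Bayesian fit with informed priors.
rng = np.random.default_rng(1)
t = np.arange(40)
flows = genextreme.rvs(c=-0.1, loc=100 + 0.8 * t, scale=20, random_state=rng)

def neg_log_lik(theta):
    mu0, mu1, log_sigma, c = theta
    mu = mu0 + mu1 * t
    return -np.sum(genextreme.logpdf(flows, c=c, loc=mu, scale=np.exp(log_sigma)))

res = minimize(neg_log_lik,
               x0=[np.mean(flows), 0.0, np.log(np.std(flows)), -0.1],
               method="Nelder-Mead")
mu0, mu1, log_sigma, c = res.x
print(f"trend in location: {mu1:.2f} per year, shape (scipy convention c = -xi): {c:.2f}")
```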
APA, Harvard, Vancouver, ISO und andere Zitierweisen
38

Heinrich, Markus, und David Gross. „Robustness of Magic and Symmetries of the Stabiliser Polytope“. Quantum 3 (08.04.2019): 132. http://dx.doi.org/10.22331/q-2019-04-08-132.

Der volle Inhalt der Quelle
Annotation:
We give a new algorithm for computing the robustness of magic - a measure of the utility of quantum states as a computational resource. Our work is motivated by the magic state model of fault-tolerant quantum computation. In this model, all unitaries belong to the Clifford group. Non-Clifford operations are effected by injecting non-stabiliser states, which are referred to as magic states in this context. The robustness of magic measures the complexity of simulating such a circuit using a classical Monte Carlo algorithm. It is closely related to the degree of negativity that slows down Monte Carlo simulations through the infamous sign problem. Surprisingly, the robustness of magic is sub-multiplicative. This implies that the classical simulation overhead scales subexponentially with the number of injected magic states - better than a naive analysis would suggest. However, determining the robustness of n copies of a magic state is difficult, as its definition involves a convex optimisation problem in a 4^n-dimensional space. In this paper, we make use of inherent symmetries to reduce the problem to n dimensions. The total run-time of our algorithm, while still exponential in n, is super-polynomially faster than previously published methods. We provide a computer implementation and give the robustness of up to 10 copies of the most commonly used magic states. Guided by the exact results, we find a finite hierarchy of approximate solutions where each level can be evaluated in polynomial time and yields rigorous upper bounds to the robustness. Technically, we use symmetries of the stabiliser polytope to connect the robustness of magic to the geometry of a low-dimensional convex polytope generated by certain signed quantum weight enumerators. As a by-product, we characterised the automorphism group of the stabiliser polytope, and, more generally, of projections onto complex projective 3-designs.
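To make the underlying optimisation concrete: for a single qubit, the robustness of magic is a small l1-minimisation over the six stabiliser states, which the sketch below solves as a linear program. The paper's contribution concerns the n-copy case, where the symmetry reduction described above becomes essential; this is only the trivial n = 1 instance.

```python
import numpy as np
from scipy.optimize import linprog

# Bloch vectors of the six single-qubit stabiliser states (eigenstates of X, Y, Z).
stab = np.array([[ 1, 0, 0], [-1, 0, 0],
                 [ 0, 1, 0], [ 0, -1, 0],
                 [ 0, 0, 1], [ 0, 0, -1]], dtype=float)

def robustness(bloch):
    """min ||q||_1 subject to sum_i q_i = 1 and sum_i q_i * s_i = bloch."""
    k = len(stab)
    # Split q = u - v with u, v >= 0, so the objective sum(u) + sum(v) equals ||q||_1.
    c = np.ones(2 * k)
    A = np.vstack([np.ones(k), stab.T])   # normalisation row plus 3 Bloch components
    A_eq = np.hstack([A, -A])
    b_eq = np.concatenate([[1.0], bloch])
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * (2 * k))
    return res.fun

t_state = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)   # Bloch vector of the T (magic) state
print(f"robustness of magic, single T state: {robustness(t_state):.4f}")
```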
APA, Harvard, Vancouver, ISO und andere Zitierweisen
39

Cranmer, Holly, Tanja Podkonjak, Eugene Benson, Jonathon Dabora und Graham H. Jackson. „Delivering Systemic Anti-Cancer Treatment Via Another Route of Administration As an Option to Reduce Exposure to COVID-19 in Patients with Multiple Myeloma in the UK“. Blood 136, Supplement 1 (05.11.2020): 12–13. http://dx.doi.org/10.1182/blood-2020-142338.

Der volle Inhalt der Quelle
Annotation:
Introduction: Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) is a strain of coronavirus that causes a respiratory illness known as COVID-19. COVID-19 is a pandemic affecting many countries globally.(1) As of 23rd July 2020, there have been 297,146 lab-confirmed cases of COVID-19 in the UK and 45,554 people who tested positive for the virus have died.(2) Patients with multiple myeloma (MM) are at a higher risk of contracting the virus and experiencing more severe outcomes.(3-6) The higher risk is driven by a compromised immune system, the use of immunosuppressive agents and patient characteristics aligning with key risk factors - patients with MM are often elderly and have multiple co-morbidities.(7) In light of the COVID-19 outbreak, NHS England and NICE have issued guidance to modify usual care to reduce patient exposure to COVID-19. For patients with cancer, NICE recommend delivering systemic anti-cancer treatment in different and less immunosuppressive regimens, different locations (ideally at home) or via another (less invasive and/or less resource intensive) route of administration where possible.(8)(9) The objective of this analysis is to explore the impact of switching patients from intravenous (IV) treatments requiring hospital administrations to subcutaneous (SC) or oral alternatives which can be administered at home or in an outpatient setting which reduces the patient's potential exposure to COVID-19. Methods: A decision tree model was developed in Microsoft Excel® (Figure 1). Patients enter the model and are assigned a probability of being treated; those that are treated are then assigned a probability of IV, SC or oral-based therapy. Based on the route of administration, patients are assigned a probability of contracting COVID-19 and, for those patients that do contract the virus, a probability of death from the virus is estimated. The model compares the outcomes from two identical decision trees: one informed by the pre-COVID-19 treatment pathway and one informed by the post-COVID-19 pathway. Model inputs, including COVID-19 inputs (e.g., number of active and diseased COVID-19 cases among patients with MM), have been informed by the literature and clinical opinion. Costs reflected in the model include: treatment of COVID-19, treatment for MM and administration of MM treatments. Scenario analyses explore lower and upper bounds for key inputs. Results are presented from a UK perspective and a 1-year time horizon (from model entry) is considered. Results: Per the model, treating patients with oral therapies is shown to reduce the number of COVID-19 cases and the number of COVID-19 deaths in patients with MM compared with IV- and SC-delivered therapies. These outcomes translate into cost savings driven by costs avoided in treating COVID-19. There was a limited difference in the costs of treating the underlying MM despite the switch. However, there were additional cost savings demonstrated through avoiding expensive and resource intensive administration appointments associated with IV therapies, and to a lesser extent SC therapies. The use of oral therapies has also aided the increase in telemedicine for routine appointments - scenarios exploring this demonstrate further savings. These results are driven by the perceived risk attached to each of the different routes of administration - scenario analyses demonstrated that assuming even the lower bound risk (an assumed additional risk of 10%) for IV therapies vs. 
oral therapies, a significant number of COVID-19 cases and deaths were avoided and costs were reduced. Conclusions: Changes to the treatment pathway for patients with MM in light of the COVID-19 pandemic aim to reduce these patients' exposure to the virus. The model demonstrates that simply switching the route of administration can reduce the number of COVID-19 cases and deaths. This has important implications in avoiding severe outcomes, decreasing the spread of the virus and reducing the cost and resource use burden to the healthcare system. In addition, the model reflects potential efficiencies which may extend beyond the COVID-19 pandemic (e.g. telemedicine) to optimize clinical practice for patients with MM in the longer term. Disclosures Cranmer: Takeda: Current Employment. Podkonjak: Takeda: Current Employment. Benson: Takeda: Current Employment. Dabora: Takeda: Current Employment. Jackson: Merck Sharp and Dohme: Honoraria; Chugai: Honoraria; Amgen: Honoraria; Janssen: Honoraria; Celgene: Honoraria; Takeda: Honoraria; Roche: Honoraria.
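A minimal numerical sketch of such a decision-tree comparison is given below. Every probability and cost in it is a placeholder invented for illustration, not a model input from the study.

```python
# Decision-tree sketch of a pre- vs post-COVID-19 administration mix.  Every number
# below is a placeholder; the model's actual inputs come from literature and clinical opinion.
routes = ["IV", "SC", "oral"]
p_covid    = {"IV": 0.10, "SC": 0.07, "oral": 0.05}   # hypothetical infection risk by route
p_death    = 0.20                                     # hypothetical case-fatality proportion
cost_covid = 8_000                                    # hypothetical cost of treating one case
admin_cost = {"IV": 400, "SC": 150, "oral": 20}       # hypothetical per-course administration cost

def expected_outcomes(n_patients, route_mix):
    cases = sum(n_patients * route_mix[r] * p_covid[r] for r in routes)
    deaths = cases * p_death
    cost = sum(n_patients * route_mix[r] * (admin_cost[r] + p_covid[r] * cost_covid)
               for r in routes)
    return cases, deaths, cost

pre  = {"IV": 0.6, "SC": 0.3, "oral": 0.1}            # hypothetical pre-pandemic mix
post = {"IV": 0.2, "SC": 0.3, "oral": 0.5}            # hypothetical post-guidance mix
for label, mix in [("pre-COVID pathway", pre), ("post-COVID pathway", post)]:
    cases, deaths, cost = expected_outcomes(1_000, mix)
    print(f"{label}: {cases:.0f} cases, {deaths:.0f} deaths, cost {cost:,.0f}")
```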
APA, Harvard, Vancouver, ISO und andere Zitierweisen
40

Sinn, Moritz, Florian Zuleger und Helmut Veith. „Complexity and Resource Bound Analysis of Imperative Programs Using Difference Constraints“. Journal of Automated Reasoning 59, Nr. 1 (11.01.2017): 3–45. http://dx.doi.org/10.1007/s10817-016-9402-4.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
41

Valuev, Andrey. „ON CALCULATION OF LINEAR RESOURCE PLANNING MODELS FOR OPTIMAL PROJECT SCHEDULING“. Mathematical Modelling and Analysis 13, Nr. 2 (30.06.2008): 275–88. http://dx.doi.org/10.3846/1392-6292.2008.13.275-288.

Der volle Inhalt der Quelle
Annotation:
The author's recent papers have shown new opportunities arising from treating resource planning in project scheduling as an optimization problem for a hybrid system. This approach makes it possible to work out the optimal resource sharing in an iterative process of branch-and-bound type. The present paper concentrates on the most standard case of the problem, in which all relationships can be represented in linear form. Two exact finite methods are proposed. The first is obtained using the piecewise-linear form of the Bellman function; the second evolves from the decomposition approach for the dynamic linear programming problem.
APA, Harvard, Vancouver, ISO und andere Zitierweisen
42

Chan, Joseph Wun-Tat, Prudence W. H. Wong und Fencol C. C. Yung. „On Dynamic Bin Packing: An Improved Lower Bound and Resource Augmentation Analysis“. Algorithmica 53, Nr. 2 (01.04.2008): 172–206. http://dx.doi.org/10.1007/s00453-008-9185-z.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
43

Bonn, Thomas. „On the political sideline? The institutional isolation of donor organizations in Jordanian hydropolitics“. Water Policy 15, Nr. 5 (06.06.2013): 728–37. http://dx.doi.org/10.2166/wp.2013.007.

Der volle Inhalt der Quelle
Annotation:
Fresh water availability is very low in Jordan. Current water usage is unsustainable and structures of water resource governance are inadequate. Drawing on expert interviews and the analysis of media texts, this study shows that patterns and privileges of water consumption sustain specific political and social orders, aggravating Jordan's suboptimal water resource deployment. Many of these long-established modes of water distribution are not commensurate with new resource governance structures fostered by international development cooperation. This puts pressure on Jordan's political elite: the flow of foreign aid stabilizes the regime as does the preservation of existing privileges. It is argued here that maintaining two opposed but coexisting ‘resource realities’, i.e. governance structures and usage patterns of water resources, allows the regime to escape this dilemma. Donor organizations are thus bound to operate in an institutionally isolated sphere in Jordan with only a marginal ability to penetrate the relevant actor groups to trigger profound effects on either resource reality.
APA, Harvard, Vancouver, ISO und andere Zitierweisen
44

Schwartz, Laura E., Kathryn H. Howell, Lacy E. Jamison, Kristina M. Decker und Idia B. Thurston. „Examining Resource-Driven Resilience and Intimate Partner Violence in Women“. Partner Abuse 12, Nr. 2 (22.04.2021): 112–29. http://dx.doi.org/10.1891/pa-2020-0017.

Der volle Inhalt der Quelle
Annotation:
Resilience is gaining attention in trauma research, but how it is conceptualized across studies often differs. Further, limited empirical research has been conducted on group-level resilience factors in the context of intimate partner violence (IPV). The current study assessed resilience using two models (i.e., social-ecological and "bounce back") by investigating how resilience resource variables across the social ecology cluster together and relate to an individual's ability to bounce back after experiencing IPV. Latent profile analysis was used to generate profiles of individual (spirituality), social (social support, community cohesion), cultural (ethnic identity), and physical (use of public assistance) resources consistent with the social-ecological model of resilience. Differences among the latent profiles on overall resilience scores were investigated. Participants were 160 women (mean age = 34.7, 69% Black-identified, 75% with yearly household income less than $20,000) who experienced IPV in the past 6 months. Four resource profiles emerged: (a) generally high (GH); (b) low individual and cultural (LIC); (c) high physical (HP); and (d) low social (LS). The GH profile reported significantly higher resilience than the LIC profile. Findings suggest nuanced variations in resources among women experiencing adversity. These varied resource profiles relate to unique differences in resilience among women exposed to IPV. Based on these findings, interventions to address IPV may be most impactful if they promote stronger ethnic identity and increased spirituality. Future research should build on this work by utilizing more systems-level conceptualizations of resilience and including factors that capture not only physical resources, but also individual, social, and cultural resources.
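Latent profile analysis is commonly operationalised as a finite Gaussian mixture over the indicator variables. The sketch below fits candidate one- to six-profile solutions to synthetic resource scores with scikit-learn and keeps the BIC-preferred model; the study's actual indicators and estimation software may differ.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic stand-ins for the five resource indicators (spirituality, social support,
# community cohesion, ethnic identity, public-assistance use); 160 simulated respondents.
rng = np.random.default_rng(2)
resources = np.vstack([rng.normal(loc=m, scale=0.5, size=(40, 5))
                       for m in (1.0, 2.5, 4.0, 2.0)])

# Fit 1-6 profiles and select the solution with the lowest BIC.
fits = [GaussianMixture(n_components=k, covariance_type="diag", random_state=0).fit(resources)
        for k in range(1, 7)]
best = min(fits, key=lambda g: g.bic(resources))
print(f"BIC-preferred number of profiles: {best.n_components}")
print("profile means:\n", np.round(best.means_, 2))
```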
APA, Harvard, Vancouver, ISO und andere Zitierweisen
45

Song, Ting-Ting, Jie Zhang, Su-Juan Qin, Fei Gao und Qiao-Yan Wen. „Finite-key analysis for quantum key distribution with decoy states“. Quantum Information and Computation 11, Nr. 5&6 (Mai 2011): 374–99. http://dx.doi.org/10.26421/qic11.5-6-2.

Der volle Inhalt der Quelle
Annotation:
We analyze the security of finite-resource quantum key distribution with decoy states, and present a security bound for practical implementations by introducing deviations in the probability of sending a k-photon pulse and in the error rate of the quantum state. The bound is simulated under reasonable values of the observed parameters. Compared with previous works, the security bound is more stringent.
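The paper's decoy-state bound is not reproduced here; as a generic illustration of why finite resources matter, the sketch below widens an observed error rate by a Hoeffding-style statistical deviation before inserting it into a textbook BB84-style key-rate expression.

```python
import numpy as np

def h2(p):
    """Binary entropy."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def key_fraction(n_sifted, qber, eps=1e-10):
    # Hoeffding-style deviation added to the observed QBER to cover statistical fluctuations.
    mu = np.sqrt(np.log(2.0 / eps) / (2.0 * n_sifted))
    e_upper = min(qber + mu, 0.5)
    # Asymptotic BB84-style rate evaluated at the worst-case error rate.
    return max(1.0 - 2.0 * h2(e_upper), 0.0)

for n in (1e4, 1e6, 1e8):
    print(f"n = {n:.0e}: key fraction ~ {key_fraction(n, qber=0.02):.3f}")
```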
APA, Harvard, Vancouver, ISO und andere Zitierweisen
46

Shirzadeh Chaleshtari, Ali. „Resource Tardiness Weighted Cost Minimization in Project Scheduling“. Advances in Operations Research 2017 (2017): 1–11. http://dx.doi.org/10.1155/2017/1308704.

Der volle Inhalt der Quelle
Annotation:
In this paper, we study a project scheduling problem called the resource constrained project scheduling problem with minimization of total weighted resource tardiness penalty cost (RCPSP-TWRTPC). In this problem, the project is subject to renewable resources, each renewable resource is available for a limited number of time periods during the project life cycle, and keeping a resource for each extra period incurs a tardiness penalty cost. We introduce a branch and bound algorithm to solve the problem exactly and use several bounding, fathoming, and dominance rules in our algorithm to shorten the enumeration process. We point out parameters affecting the degree of difficulty of the RCPSP-TWRTPC, generate extensive sets of sample instances for the problem, and perform a comprehensive experimental analysis using the customized algorithm and also the CPLEX solver. We analyze the algorithm's behavior with respect to changes in instance difficulty and compare its performance with the CPLEX solver for different cases. The results demonstrate the algorithm's efficiency.
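The objective being minimised can be illustrated directly: given a schedule, each renewable resource's tardiness is the number of periods it is kept beyond its contracted availability, weighted by a penalty. The sketch below evaluates that cost on hypothetical data; it is not the branch and bound itself.

```python
# Objective sketch for RCPSP-TWRTPC: weighted penalty for keeping each renewable
# resource beyond its contracted availability window (all data below is hypothetical).
durations = {1: 3, 2: 2, 3: 4}
uses      = {1: {"crane"}, 2: {"crane", "crew"}, 3: {"crew"}}
start     = {1: 0, 2: 3, 3: 5}                 # some feasible schedule
contracted_until = {"crane": 4, "crew": 8}     # last period each resource is penalty-free
tardiness_weight = {"crane": 100, "crew": 40}  # penalty per extra period kept

def total_tardiness_cost(start):
    cost = 0
    for r in contracted_until:
        last_needed = max((start[j] + durations[j] for j in start if r in uses[j]), default=0)
        cost += tardiness_weight[r] * max(0, last_needed - contracted_until[r])
    return cost

# crane needed until 5 (1 late period, 100) + crew needed until 9 (1 late period, 40) = 140
print(total_tardiness_cost(start))
```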
APA, Harvard, Vancouver, ISO und andere Zitierweisen
47

Zhao, Ming. „The Study and Analysis of Resource Distribution on the Basis of Mathematic Programming Model“. Applied Mechanics and Materials 687-691 (November 2014): 4963–66. http://dx.doi.org/10.4028/www.scientific.net/amm.687-691.4963.

Der volle Inhalt der Quelle
Annotation:
The rational distribution of financial resources is studied over the course of their raising, allocation, and use. An input-occupancy-output programming model is constructed based on the input-occupancy-output technique and mathematical programming, and the distribution of financing resources such as loans, stocks, and bonds in China in 2009 is calculated and analyzed.
APA, Harvard, Vancouver, ISO und andere Zitierweisen
48

Ulfseth, Lena A., Staffan Josephsson und Sissel Alsaker. „Homeward bound“. Narrative Inquiry 26, Nr. 1 (05.12.2016): 22–38. http://dx.doi.org/10.1075/ni.26.1.02ulf.

Der volle Inhalt der Quelle
Annotation:
With a focus on enacted narratives, this ethnographic study addresses how people with mental illness communicate about returning home after a treatment stay at a psychiatric centre. Data were analysed based on Ricoeur's theory of narrative and action. Our analysis consisted of three analytic layers: the significant issue of discharge, identifying three stories of how being on the way home is enacted, and a further interpretation and discussion. The narrative analysis shows how significant issues of returning home are enacted among persons in everyday activities at one centre, and how an inherent ambiguity raised some challenges within the field of mental health. This study shows how conducting everyday activities enables people to use the available narrative resources to negotiate the self; hence they reflect on and create thoughts about the return home that are shared among persons at the centre.
APA, Harvard, Vancouver, ISO und andere Zitierweisen
49

Pericherla, Suryateja Satya. „Analysis of Host Resources Utilization by OpenStack in Ubuntu Environment“. Emerging Science Journal 4, Nr. 6 (01.12.2020): 466–92. http://dx.doi.org/10.28991/esj-2020-01246.

Der volle Inhalt der Quelle
Annotation:
Cloud computing is one of the frontier technologies, which over the last decade has gained a widespread commercial and educational user base. OpenStack is one of the popular open source cloud management platforms for establishing a private or public Infrastructure-as-a-Service (IaaS) cloud. Although OpenStack started with very few core modules, it now houses nearly 38 modules and is quite complex. Such a complex software bundle is bound to have an impact on the utilization of the host system's underlying hardware. The objective is to monitor the usage of system resources by OpenStack on commodity hardware. This paper analyzes the effect of OpenStack on the host machine's hardware. An extensive empirical evaluation has been done on different types of hardware, at different virtualization levels, and with different flavours of operating systems, comparing CPU utilization, memory consumption, disk I/O, network, and I/O requests. OpenStack was deployed using Devstack on a single node. The novel aspect of this work is monitoring the resource usage of OpenStack without creating virtual machines on commodity hardware. From the analysis of the data, it is observed that a standalone machine with the Ubuntu Server operating system is the least affected by OpenStack and thereby has more resources available for the computation of user workloads.
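A host-level sampler in the spirit of the measurements described (CPU, memory, disk I/O, network) can be put together with the psutil library; the sketch below is illustrative only and is not the instrumentation used in the study.

```python
import psutil

# Periodically sample host utilisation: CPU percentage, memory percentage,
# and the disk and network byte counters since the previous sample.
def sample(interval=1.0, samples=5):
    disk0, net0 = psutil.disk_io_counters(), psutil.net_io_counters()
    for _ in range(samples):
        cpu = psutil.cpu_percent(interval=interval)      # averaged over the interval
        mem = psutil.virtual_memory().percent
        disk1, net1 = psutil.disk_io_counters(), psutil.net_io_counters()
        print(f"cpu {cpu:5.1f}%  mem {mem:5.1f}%  "
              f"disk +{(disk1.write_bytes - disk0.write_bytes) / 1e6:7.2f} MB written  "
              f"net +{(net1.bytes_sent - net0.bytes_sent) / 1e6:7.2f} MB sent")
        disk0, net0 = disk1, net1

if __name__ == "__main__":
    sample()
```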
APA, Harvard, Vancouver, ISO und andere Zitierweisen
50

Chen, Xuanjin. „Antecedents of Technological Diversification: A Resource Dependence Logic“. Journal of Open Innovation: Technology, Market, and Complexity 5, Nr. 4 (07.10.2019): 80. http://dx.doi.org/10.3390/joitmc5040080.

Der volle Inhalt der Quelle
Annotation:
This paper extends resource dependence logic by investigating the antecedents of technological diversification and further identifies its boundary condition. We argue that this resource dependence logic is bound by state ownership through coalitions with firms, a less discussed component of interdependence. The empirical results, based on a panel data analysis of Chinese listed firms, suggest that environmental dynamism positively relates to technological diversification, while environmental munificence negatively relates to it. These relationships change when state ownership is considered. The theoretical implications for resource dependence theory and diversification research are discussed.
APA, Harvard, Vancouver, ISO und andere Zitierweisen