Journal articles on the topic 'Open source discrete event simulation (DES)'

To see the other types of publications on this topic, follow the link: Open source discrete event simulation (DES).

Create an accurate reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 50 journal articles for your research on the topic 'Open source discrete event simulation (DES).'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Palmer, Geraint I., Vincent A. Knight, Paul R. Harper, and Asyl L. Hawa. "Ciw: An open-source discrete event simulation library." Journal of Simulation 13, no. 1 (May 20, 2018): 68–82. http://dx.doi.org/10.1080/17477778.2018.1473909.

Full text
APA, Harvard, Vancouver, ISO, and other styles
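Ciw (entry 1) is a Python library for exactly this kind of model. As a rough illustration of the event-scheduling world view that such libraries implement, here is a minimal single-server (M/M/1) queue in plain standard-library Python; this is an illustrative sketch of the technique, not Ciw's API:

```python
import heapq
import random

def simulate_mm1(arrival_rate, service_rate, horizon, seed=0):
    """Count customers served by a single exponential server before `horizon`."""
    rng = random.Random(seed)
    events = [(rng.expovariate(arrival_rate), "arrival")]  # (time, kind) heap
    queue = 0   # customers waiting or in service
    served = 0
    while events:
        t, kind = heapq.heappop(events)
        if t > horizon:
            break
        if kind == "arrival":
            queue += 1
            if queue == 1:  # server was idle: start service immediately
                heapq.heappush(events, (t + rng.expovariate(service_rate), "departure"))
            # schedule the next arrival
            heapq.heappush(events, (t + rng.expovariate(arrival_rate), "arrival"))
        else:  # departure
            queue -= 1
            served += 1
            if queue > 0:  # next waiting customer enters service
                heapq.heappush(events, (t + rng.expovariate(service_rate), "departure"))
    return served

print(simulate_mm1(arrival_rate=1.0, service_rate=2.0, horizon=1000))
```

The future-event list is the heap; popping always yields the earliest pending event, which is the core loop every DES engine in this list shares.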
2

Peixoto, Túlio Almeida, João José de Assis Rangel, Ítalo de Oliveira Matias, Fábio Freitas da Silva, and Eder Reis Tavares. "Ururau: a free and open-source discrete event simulation software." Journal of Simulation 11, no. 4 (November 2017): 303–21. http://dx.doi.org/10.1057/s41273-016-0038-5.

3

Sőrés, Milán, and Attila Fodor. "Simulation of Electrical Grid with Omnet++ Open Source Discrete Event System Simulator." Hungarian Journal of Industry and Chemistry 44, no. 2 (December 1, 2016): 85–91. http://dx.doi.org/10.1515/hjic-2016-0010.

Abstract:
The simulation of electrical networks is very important before the development and servicing of electrical networks and grids can occur. There are software packages that can simulate the behaviour of electrical grids under different operating conditions, but these simulation environments cannot be used in a single cloud-based project because they are not GNU-licensed software products. In this paper, an integrated framework was proposed that models and simulates communication networks. The design and operation of the simulation environment are investigated and a model of electrical components is proposed. After simulation, the simulation results were compared to manually computed results.
4

Dagkakis, G., and C. Heavey. "A review of open source discrete event simulation software for operations research." Journal of Simulation 10, no. 3 (August 2016): 193–206. http://dx.doi.org/10.1057/jos.2015.9.

5

Rossetti, Manuel D. "Java Simulation Library (JSL): an open-source object-oriented library for discrete-event simulation in Java." International Journal of Simulation and Process Modelling 4, no. 1 (2008): 69. http://dx.doi.org/10.1504/ijspm.2008.020614.

6

Olaitan, Oladipupo, John Geraghty, Paul Young, Georgios Dagkakis, Cathal Heavey, Martin Bayer, Jerome Perrin, and Sebastien Robin. "Implementing ManPy, a Semantic-free Open-source Discrete Event Simulation Package, in a Job Shop." Procedia CIRP 25 (2014): 253–60. http://dx.doi.org/10.1016/j.procir.2014.10.036.

7

Dagkakis, Georgios, Ioannis Papagiannopoulos, and Cathal Heavey. "ManPy: an open-source software tool for building discrete event simulation models of manufacturing systems." Software: Practice and Experience 46, no. 7 (August 30, 2015): 955–81. http://dx.doi.org/10.1002/spe.2347.

8

Laurindo, Quézia Manuela Gonçalves, Túlio Almeida Peixoto, and João José de Assis Rangel. "Communication mechanism of the discrete event simulation and the mechanical project softwares for manufacturing systems." Journal of Computational Design and Engineering 6, no. 1 (March 9, 2018): 70–80. http://dx.doi.org/10.1016/j.jcde.2018.02.005.

Abstract:
This paper presents an integration mechanism for online communication between a discrete event simulation (DES) software and a system dynamics (SD) software. The integration between them allowed executing a hybrid and broader simulation, in which the complexity of the systems and their multi-faceted relationships may demand the combination of different simulation methods and the synergies between the techniques. The Ururau free and open-source software (FOSS) was applied to implement the DES model. In order to build the dynamic model, we used the 3D CAD mechanical design software Inventor®. We also employed the DES model in the test step of a real-time control system. The results of that mechanism implementation enabled the evaluation of different aspects of a typical manufacturing system. Furthermore, the integration between the control system and the DES model allowed validating the logic of the programmable logic controller (PLC). Highlights: a mechanism for online communication between DES and SD software; a free and open-source software (FOSS) applied to implement the DES model; evaluation of different aspects of a typical manufacturing system.
9

Belattar, Brahim, and Abdelhabib Bourouis. "Ascertaining Important Features of the JAPROSIM Simulation Library." International Journal of Mathematics and Computers in Simulation 16 (January 12, 2022): 29–36. http://dx.doi.org/10.46300/9102.2022.16.6.

Abstract:
This paper describes important features of JAPROSIM, a free and open source simulation library implemented in Java programming language. It provides a framework for building discrete event simulation models. The process interaction world view adopted by JAPROSIM is discussed. We present the architecture and major components of the simulation library. In order to ascertain important features of JAPROSIM, examples are given. Further motivations are discussed and suggestions for improving our work are given.
10

Lang, Sebastian, Tobias Reggelin, Marcel Müller, and Abdulrahman Nahhas. "Open-source discrete-event simulation software for applications in production and logistics: An alternative to commercial tools?" Procedia Computer Science 180 (2021): 978–87. http://dx.doi.org/10.1016/j.procs.2021.01.349.

11

Clay, E., A. Vataire, O. Cristeau, and S. Aballea. "PRM147 - OPEN-SOURCE MODELS IN HEALTH ECONOMICS: EXPERIENCE BASED ON A DISCRETE EVENT SIMULATION MODEL IN MAJOR DEPRESSIVE DISORDER." Value in Health 21 (October 2018): S381. http://dx.doi.org/10.1016/j.jval.2018.09.2267.

12

Clay, E., A. Vataire, O. Cristeau, and M. Toumi. "PRM149 - OPEN-SOURCE DISCRETE EVENT SIMULATION MODEL FOR MAJOR DEPRESSION DISORDER (MDD): AN UPDATE SUITABLE FOR NEW GENERATION TREATMENT?" Value in Health 21 (October 2018): S381–S382. http://dx.doi.org/10.1016/j.jval.2018.09.2269.

13

Jimenez, Belmonte, Garrido, Ruz, and Vazquez. "Software Tool for Acausal Physical Modelling and Simulation." Symmetry 11, no. 10 (September 24, 2019): 1199. http://dx.doi.org/10.3390/sym11101199.

Abstract:
Modelling and simulation are key tools for analysis and design of systems and processes from almost any scientific or engineering discipline. Models of complex systems are typically built on acausal Differential-Algebraic Equations (DAE) and discrete events using Object-Oriented Modelling (OOM) languages, and some of their key concepts can be explained as symmetries. To obtain a computer-executable version of the original model, several algorithms based on bipartite symmetric graphs must be applied for automatic equation generation, removing alias equations, computational causality assignment, equation sorting, discrete-event processing, or index reduction. In this paper, an open-source tool following the OOM paradigm and developed in MATLAB is introduced. It implements such algorithms while adding an educational perspective on how they work, since the step-by-step results obtained after processing the model equations can be shown. The tool also allows users to create models in its own OOM language and to simulate the final executable equation set. It was used by students in a modelling and simulation course of the Automatic Control and Industrial Electronics Engineering degree, showing a significant improvement in their understanding and learning of the abovementioned topics after their assessment.
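The causality-assignment step this abstract mentions is, at its core, a matching problem on the equation-variable bipartite graph: each equation must be paired with the one unknown it will be solved for. A minimal augmenting-path sketch of that idea (illustrative only, not the cited tool's code):

```python
def assign_causality(incidence):
    """incidence[eq] = set of unknowns appearing in that equation.
    Returns {equation: variable it computes}, or None if no complete
    assignment exists (a structurally singular system)."""
    match = {}  # variable -> equation currently computing it

    def try_assign(eq, seen):
        for var in incidence[eq]:
            if var in seen:
                continue
            seen.add(var)
            # Take the variable if it is free, or if its current equation
            # can be re-matched to some other unknown (augmenting path).
            if var not in match or try_assign(match[var], seen):
                match[var] = eq
                return True
        return False

    for eq in incidence:
        if not try_assign(eq, set()):
            return None
    return {eq: var for var, eq in match.items()}

# Example: e1: x + y = 0, e2: y = 3  ->  e2 computes y, so e1 must compute x.
print(assign_causality({"e1": {"x", "y"}, "e2": {"y"}}))
```

Production tools use more scalable matchings plus index reduction (e.g. Pantelides-style algorithms), but the bipartite structure is the same.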
14

May, Marvin Carl, Lars Kiefer, Andreas Kuhnle, and Gisela Lanza. "Ontology-Based Production Simulation with OntologySim." Applied Sciences 12, no. 3 (February 3, 2022): 1608. http://dx.doi.org/10.3390/app12031608.

Abstract:
Imagine the possibility to save a simulation at any time, modify or analyze it, and restart again with exactly the same state. The conceptualization and its concrete manifestation in the implementation OntologySim is demonstrated in this paper. The presented approach of a fully ontology-based simulation can solve current challenges in modeling and simulation in production science. Due to the individualization and customization of products and the resulting increase in complexity of production, a need for flexibly adaptable simulations arises. This need is exemplified in the trend towards Digital Twins and Digital Shadows. Their application to production systems, against the background of an ever increasing speed of change in such systems, is arduous. Moreover, missing understandability and human interpretability of current approaches hinders successful, goal-oriented applications. The OntologySim can help solve this challenge by providing the ability to generate truly cyber-physical systems, both interlocked with reality and providing a simulation framework. In a nutshell, this paper presents a discrete-event-based open-source simulation using multi-agency and ontology.
15

Xanthopoulos, A. S., and D. E. Koulouriotis. "A comparative study of different pull control strategies in multi-product manufacturing systems using discrete event simulation." Advances in Production Engineering & Management 16, no. 4 (December 18, 2021): 473–84. http://dx.doi.org/10.14743/apem2021.4.414.

Abstract:
Pull production control strategies coordinate manufacturing operations based on actual demand. Up to now, relevant publications mostly examine manufacturing systems that produce a single type of product. In this research, we examine the CONWIP, Base Stock, and CONWIP/Kanban Hybrid pull strategies in multi-product manufacturing systems. In a multi-product manufacturing system, several types of products are manufactured by utilizing the same resources. We develop queueing network models of multi-stage, multi-product manufacturing systems operating under the three aforementioned pull control strategies. Simulation models of the alternative production systems are implemented using an open-source software. A comparative evaluation of CONWIP, Base Stock and CONWIP/Kanban Hybrid in multi-product manufacturing is carried out in a series of simulation experiments with varying demand arrival rates, setup times and control parameters. The control strategies are compared based on average wait time of backordered demand, average finished products inventories, and average length of backorders queues. The Base Stock strategy excels when the manufacturing system is subjected to high demand arrival rates. The CONWIP strategy consistently produced the highest level of finished goods inventories. The CONWIP/Kanban Hybrid strategy is significantly affected by the workload that is imposed on the system.
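For readers unfamiliar with the strategies compared here, CONWIP caps total work-in-process with a fixed pool of cards: a job may enter the line only by taking a card, and the card returns when the job completes. A toy discrete-time sketch under invented demand and processing assumptions (and a simplification: the card is freed when the job leaves the line, not when a customer consumes it):

```python
import random
from collections import deque

def conwip(cards=5, periods=300, seed=2):
    """Return (finished_goods, backlog) after simulating `periods` steps."""
    rng = random.Random(seed)
    free_cards = cards
    line = deque()   # jobs in process, each holding its remaining work
    finished = 0     # finished goods inventory
    backlog = 0      # unmet demand waiting for finished goods
    for _ in range(periods):
        backlog += rng.randint(0, 1)        # demand arrival this period
        if free_cards > 0:                  # a job may enter only with a card
            free_cards -= 1
            line.append(rng.randint(1, 3))  # processing periods required
        if line:
            line[0] -= 1                    # work on the job at the head
            if line[0] == 0:
                line.popleft()
                free_cards += 1             # card returns to the line entry
                finished += 1
        shipped = min(finished, backlog)    # ship finished goods to demand
        finished -= shipped
        backlog -= shipped
    return finished, backlog

print(conwip())
```

The invariant the cards enforce is that `len(line)` never exceeds `cards`, which is precisely what distinguishes CONWIP from an uncapped push line.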
16

Altman, Micah, and Richard Landau. "Selecting Efficient and Reliable Preservation Strategies." International Journal of Digital Curation 15, no. 1 (September 29, 2020): 5. http://dx.doi.org/10.2218/ijdc.v15i1.727.

Abstract:
This article addresses the problem of formulating efficient and reliable operational preservation policies that ensure bit-level information integrity over long periods, and in the presence of a diverse range of real-world technical, legal, organizational, and economic threats. We develop a systematic, quantitative prediction framework that combines formal modeling, discrete-event-based simulation, and hierarchical modeling, and then use empirically calibrated sensitivity analysis to identify effective strategies. Specifically, the framework formally defines an objective function for preservation that maps a set of preservation policies and a risk profile to a set of preservation costs and an expected collection loss distribution. In this framework, a curator's objective is to select optimal policies that minimize expected loss subject to their budget constraint. To estimate preservation loss under different policy conditions, we develop a statistical hierarchical risk model that characterizes four sources of risk: the storage hardware; the physical environment; the curating institution; and the global environment. We then employ a general discrete event-based simulation framework to evaluate the expected loss and cost of employing varying preservation strategies under specific parameterizations of risk. The framework offers flexibility for the modeling of a wide range of preservation policies and threats. Since this framework is open source and easily deployed in a cloud computing environment, it can be used to produce analyses based on independent estimates of scenario-specific costs, reliability, and risks. We present results summarizing hundreds of thousands of simulations using this framework. This analysis points to a number of robust and broadly applicable preservation strategies, provides novel insights into specific preservation tactics, and provides evidence that challenges received wisdom.
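The objective the abstract describes — pick the policy minimizing expected loss subject to a budget — can be caricatured in a few lines. The failure probability, costs, and replica-count policy space below are invented illustrative numbers, not values from the article:

```python
import random

def expected_loss(copies, p_fail, trials=20000, seed=1):
    """Monte Carlo estimate of the chance the collection is lost in a period:
    loss occurs only if every independent replica fails."""
    rng = random.Random(seed)
    losses = sum(
        all(rng.random() < p_fail for _ in range(copies))
        for _ in range(trials)
    )
    return losses / trials

def best_policy(cost_per_copy, budget, p_fail):
    """Among affordable replica counts, pick the one minimizing expected loss."""
    affordable = range(1, budget // cost_per_copy + 1)
    return min(affordable, key=lambda n: expected_loss(n, p_fail))

print(best_policy(cost_per_copy=10, budget=50, p_fail=0.05))
```

The article's framework replaces the single `p_fail` here with a hierarchical risk model (hardware, physical environment, institution, global environment) and a richer policy space, but the selection logic has this shape.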
17

Bendechache, Malika, Sergej Svorobej, Patricia Takako Endo, Adrian Mihai, and Theo Lynn. "Simulating and Evaluating a Real-World ElasticSearch System Using the RECAP DES Simulator." Future Internet 13, no. 4 (March 24, 2021): 83. http://dx.doi.org/10.3390/fi13040083.

Abstract:
Simulation has become an indispensable technique for modelling and evaluating the performance of large-scale systems efficiently and at a relatively low cost. ElasticSearch (ES) is one of the most popular open source large-scale distributed data indexing systems worldwide. In this paper, we use the RECAP Discrete Event Simulator (DES) simulator, an extension of CloudSimPlus, to model and evaluate the performance of a real-world cloud-based ES deployment by an Irish small and medium-sized enterprise (SME), Opening.io. Following simulation experiments that explored how much query traffic the existing Opening.io architecture could cater for before performance degradation, a revised architecture was proposed, adding a new virtual machine in order to dissolve the bottleneck. The simulation results suggest that the proposed improved architecture can handle significantly larger query traffic (about 71% more) than the current architecture used by Opening.io. The results also suggest that the RECAP DES simulator is suitable for simulating ES systems and can help companies to understand their infrastructure bottlenecks under various traffic scenarios and inform optimisation and scalability decisions.
18

Babich, Fulvio, Massimiliano Comisso, Aljosa Dorni, Flavio Barisi, Marco Driusso, and Alessandro Manià. "Discrete-time simulation of smart antenna systems in Network Simulator-2 using MATLAB and Octave." SIMULATION 87, no. 11 (December 8, 2010): 932–46. http://dx.doi.org/10.1177/0037549710387762.

Abstract:
This paper presents two platforms that exploit the scalability properties of Network Simulator-2 for the discrete-event simulation of a telecommunication network, and the modeling capabilities of two development tools for the discrete-time implementation of adaptive antenna arrays at the physical layer. The two tools are the proprietary MATLAB and the open source Octave, both of which are used to implement the physical antenna system, the beamforming algorithm, the channel coding scheme, and the multipath and fading statistics. The adopted approach enables detailed modeling of the antenna radiation pattern generated by each network node, thus improving the accuracy of the signal-to-interference ratio estimated at the receiver. This study describes the methods that can be adopted to interface MATLAB and Octave with Network Simulator-2, and discusses the advantages and disadvantages that characterize the integration of the two tools with Network Simulator-2. The proposed numerical platforms, which can be interfaced with any wireless network supported by Network Simulator-2, are used to investigate the possibility of exploiting smart antenna systems in a wireless mesh network to enable the coexistence of multiple simultaneous communications.
19

Bean, Daniel M., Paul Taylor, and Richard J. B. Dobson. "A patient flow simulator for healthcare management education." BMJ Simulation and Technology Enhanced Learning 5, no. 1 (October 7, 2017): 46–48. http://dx.doi.org/10.1136/bmjstel-2017-000251.

Abstract:
Simulation and analysis of patient flow can contribute to the safe and efficient functioning of a healthcare system, yet it is rarely incorporated into routine healthcare management, partially due to the technical training required. This paper introduces a free and open source patient flow simulation software tool that enables training and experimentation with healthcare management decisions and their impact on patient flow. Users manage their simulated hospital with a simple web-based graphical interface. The model is a stochastic discrete event simulation in which patients are transferred between wards of a hospital according to their treatment needs. Entry to each ward is managed by queues, with different policies for queue management and patient prioritisation per ward. Users can manage a simulated hospital, distribute resources between wards and decide how those resources should be prioritised. Simulation results are immediately available for analysis in-browser, including performance against targets, patient flow networks and ward occupancy. The patient flow simulator, freely available at https://khp-informatics.github.io/patient-flow-simulator, is an interactive educational tool that allows healthcare students and professionals to learn important concepts of patient flow and healthcare management.
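The mechanics this abstract describes — wards with fixed bed capacities, per-ward admission queues, and probabilistic progression toward discharge — can be sketched in a few lines. Ward names, capacities, and rates below are invented for illustration and are not taken from the cited tool:

```python
import random
from collections import deque

def run(days=200, seed=3):
    """Toy two-ward patient-flow model; returns total discharges."""
    rng = random.Random(seed)
    capacity = {"acute": 4, "rehab": 3}
    occupied = {"acute": 0, "rehab": 0}
    queue = {"acute": deque(), "rehab": deque()}
    discharged = 0
    for day in range(days):
        # New emergency arrivals join the acute ward's queue.
        for _ in range(rng.randint(0, 2)):
            queue["acute"].append(day)
        # Each acute patient finishes with some probability, then needs rehab.
        for _ in range(occupied["acute"]):
            if rng.random() < 0.3:
                occupied["acute"] -= 1
                queue["rehab"].append(day)
        # Each rehab patient is discharged with some probability.
        for _ in range(occupied["rehab"]):
            if rng.random() < 0.25:
                occupied["rehab"] -= 1
                discharged += 1
        # Admit from each ward's queue while beds are free.
        for ward in ("acute", "rehab"):
            while queue[ward] and occupied[ward] < capacity[ward]:
                queue[ward].popleft()
                occupied[ward] += 1
    return discharged

print(run())
```

Redistributing beds between wards in such a model shifts where the queue builds up, which is the kind of management decision the cited simulator lets students experiment with.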
20

TRIPAKIS, STAVROS, CHRISTOS STERGIOU, CHRIS SHAVER, and EDWARD A. LEE. "A modular formal semantics for Ptolemy." Mathematical Structures in Computer Science 23, no. 4 (July 8, 2013): 834–81. http://dx.doi.org/10.1017/s0960129512000278.

Abstract:
Ptolemy is an open-source and extensible modelling and simulation framework. It offers heterogeneous modeling capabilities by allowing different models of computation, both untimed and timed, to be composed hierarchically in an arbitrary fashion. This paper proposes a formal semantics for Ptolemy that is modular in the sense that atomic actors and their compositions are treated in a unified way. In particular, all actors conform to an executable interface that contains four functions: fire (produce outputs given current state and inputs); postfire (update state instantaneously); deadline (how much time the actor is willing to let elapse); and time-update (update the state with the passage of time). Composite actors are obtained using composition operators that in Ptolemy are called directors. Different directors realise different models of computation. In this paper, we formally define the directors for the following models of computation: synchronous-reactive, discrete event, continuous time, process networks and modal models.
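The four-function executable interface described in this abstract can be mirrored in a small sketch. This is illustrative Python, not Ptolemy II's actual Java API; the clock actor and the tiny director loop are invented for the example:

```python
class Actor:
    """The paper's executable actor interface: four functions."""
    def fire(self, t, inputs):      # produce outputs from state and inputs
        raise NotImplementedError
    def postfire(self, t, inputs):  # commit an instantaneous state update
        raise NotImplementedError
    def deadline(self, t):          # how much time the actor lets elapse
        raise NotImplementedError
    def time_update(self, dt):      # advance state with the passage of time
        raise NotImplementedError

class PeriodicClock(Actor):
    """Emits a tick every `period` time units."""
    def __init__(self, period):
        self.period = period
        self.remaining = period
    def fire(self, t, inputs):
        return {"tick": self.remaining == 0}
    def postfire(self, t, inputs):
        if self.remaining == 0:
            self.remaining = self.period
    def deadline(self, t):
        return self.remaining
    def time_update(self, dt):
        self.remaining -= dt

# A director repeatedly asks for the nearest deadline, advances time, fires,
# then lets the actor commit its state change.
clock = PeriodicClock(period=2)
ticks = 0
t = 0.0
for _ in range(6):
    dt = clock.deadline(t)
    clock.time_update(dt)
    t += dt
    if clock.fire(t, {})["tick"]:
        ticks += 1
    clock.postfire(t, {})
print(t, ticks)
```

The separation of `fire` (pure output) from `postfire` (state commit) is what lets directors compose actors under different models of computation without re-firing side effects.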
21

Sokolovska, Z. M., O. А. Klepikova, N. V. Yatsenko, and A. V. Marchenko. "Management of Development of Projects of Product IT Companies on Simulation Platforms." Business Inform 5, no. 520 (2021): 108–23. http://dx.doi.org/10.32983/2222-4459-2021-5-108-123.

Abstract:
The article is aimed at disclosing the possibilities of using the simulation modeling apparatus in the management of the development of projects of product IT companies. Based on the analysis of a number of literary sources, an overview of research in the field of IT project management is carried out and the existing status of application of the simulation instrumentarium in the solution of applied tasks is described. The compliance of simulation technologies with the special requirements of the tasks inherent in the processes of software products development is substantiated. According to the life cycle of software creation, a brief characterization of the typical stages of project promotion is provided. A simulation model is proposed that reproduces the dynamics of the deployment of product projects of IT companies. The model is built using a hybrid approach – a combination of discrete-event and agent-based methodologies – on the software platform for multi-platform simulation AnyLogic. The discrete-event approach provides a high degree of detail of the processes being modeled. The agent-based approach allows to define the project as a dynamic unit with specific properties, which makes it possible to take into account the peculiarities of products ordered by consumers – clients of an IT firm. The model’s work is demonstrated by the results of standard and optimization simulation experiments implemented on conditional data of an average product IT company. These situations prove the possibility of using the model as a platform for making managerial decisions by the head of the product project development team. The model is presented as a simulator with the modular, open architecture and parametric adjustment for specific conditions of experiments. It may be recommended for product IT companies to practice management of the projects of software development.
22

Bock, Michael, Olaf Conrad, Andreas Günther, Ernst Gehrt, Rainer Baritz, and Jürgen Böhner. "SaLEM (v1.0) – the Soil and Landscape Evolution Model (SaLEM) for simulation of regolith depth in periglacial environments." Geoscientific Model Development 11, no. 4 (April 27, 2018): 1641–52. http://dx.doi.org/10.5194/gmd-11-1641-2018.

Abstract:
Abstract. We propose the implementation of the Soil and Landscape Evolution Model (SaLEM) for the spatiotemporal investigation of soil parent material evolution following a lithologically differentiated approach. Relevant parts of the established Geomorphic/Orogenic Landscape Evolution Model (GOLEM) have been adapted for an operational Geographical Information System (GIS) tool within the open-source software framework System for Automated Geoscientific Analyses (SAGA), thus taking advantage of SAGA's capabilities for geomorphometric analyses. The model is driven by palaeoclimatic data (temperature, precipitation) representative of periglacial areas in northern Germany over the last 50 000 years. The initial conditions have been determined for a test site by a digital terrain model and a geological model. Weathering, erosion and transport functions are calibrated using extrinsic (climatic) and intrinsic (lithologic) parameter data. First results indicate that our differentiated SaLEM approach shows some evidence for the spatiotemporal prediction of important soil parental material properties (particularly its depth). Future research will focus on the validation of the results against field data, and the influence of discrete events (mass movements, floods) on soil parent material formation has to be evaluated.
23

Modares, Jalil, Nicholas Mastronarde, and Karthik Dantu. "Simulating unmanned aerial vehicle swarms with the UB-ANC Emulator." International Journal of Micro Air Vehicles 11 (January 2019): 175682931983766. http://dx.doi.org/10.1177/1756829319837668.

Abstract:
Recent advances in multi-rotor vehicle control and miniaturization of hardware, sensing, and battery technologies have enabled cheap, practical design of micro air vehicles for civilian and hobby applications. In parallel, several applications are being envisioned that bring together a swarm of multiple networked micro air vehicles to accomplish large tasks in coordination. However, it is still very challenging to deploy multiple micro air vehicles concurrently. To address this challenge, we have developed an open software/hardware platform called the University at Buffalo’s Airborne Networking and Communications Testbed (UB-ANC), and an associated emulation framework called the UB-ANC Emulator. In this paper, we present the UB-ANC Emulator, which combines multi-micro air vehicle planning and control with high-fidelity network simulation, enables practitioners to design micro air vehicle swarm applications in software and provides seamless transition to deployment on actual hardware. We demonstrate the UB-ANC Emulator’s accuracy against experimental data collected in two mission scenarios: a simple mission with three networked micro air vehicles and a sophisticated coverage path planning mission with a single micro air vehicle. To accurately reflect the performance of a micro air vehicle swarm where communication links are subject to interference and packet losses, and protocols at the data link, network, and transport layers affect network throughput, latency, and reliability, we integrate the open-source discrete-event network simulator ns-3 into the UB-ANC Emulator. We demonstrate through node-to-node and end-to-end measurements how the UB-ANC Emulator can be used to simulate multiple networked micro air vehicles with accurate modeling of mobility, control, wireless channel characteristics, and network protocols defined in ns-3.
24

Xing, Fei, Yi Ping Yao, Zhi Wen Jiang, and Bing Wang. "Fine-Grained Parallel and Distributed Spatial Stochastic Simulation of Biological Reactions." Advanced Materials Research 345 (September 2011): 104–12. http://dx.doi.org/10.4028/www.scientific.net/amr.345.104.

Abstract:
To date, discrete event stochastic simulations of large scale biological reaction systems are extremely compute-intensive and time-consuming. Besides, it has been widely accepted that spatial factor plays a critical role in the dynamics of most biological reaction systems. The NSM (the Next Sub-Volume Method), a spatial variation of the Gillespie’s stochastic simulation algorithm (SSA), has been proposed for spatially stochastic simulation of those systems. While being able to explore high degree of parallelism in systems, NSM is inherently sequential, which still suffers from the problem of low simulation speed. Fine-grained parallel execution is an elegant way to speed up sequential simulations. Thus, based on the discrete event simulation framework JAMES II, we design and implement a PDES (Parallel Discrete Event Simulation) TW (time warp) simulator to enable the fine-grained parallel execution of spatial stochastic simulations of biological reaction systems using the ANSM (the Abstract NSM), a parallel variation of the NSM. The simulation results of classical Lotka-Volterra biological reaction system show that our time warp simulator obtains remarkable parallel speed-up against sequential execution of the NSM.I.IntroductionThe goal of Systems biology is to obtain system-level investigations of the structure and behavior of biological reaction systems by integrating biology with system theory, mathematics and computer science [1][3], since the isolated knowledge of parts can not explain the dynamics of a whole system. As the complement of “wet-lab” experiments, stochastic simulation, being called the “dry-computational” experiment, plays a more and more important role in computing systems biology [2]. 
Among many methods explored in systems biology, discrete event stochastic simulation is of greatly importance [4][5][6], since a great number of researches have present that stochasticity or “noise” have a crucial effect on the dynamics of small population biological reaction systems [4][7]. Furthermore, recent research shows that the stochasticity is not only important in biological reaction systems with small population but also in some moderate/large population systems [7].To date, Gillespie’s SSA [8] is widely considered to be the most accurate way to capture the dynamics of biological reaction systems instead of traditional mathematical method [5][9]. However, SSA-based stochastic simulation is confronted with two main challenges: Firstly, this type of simulation is extremely time-consuming, since when the types of species and the number of reactions in the biological system are large, SSA requires a huge amount of steps to sample these reactions; Secondly, the assumption that the systems are spatially homogeneous or well-stirred is hardly met in most real biological systems and spatial factors play a key role in the behaviors of most real biological systems [19][20][21][22][23][24]. The next sub-volume method (NSM) [18], presents us an elegant way to access the special problem via domain partition. To our disappointment, sequential stochastic simulation with the NSM is still very time-consuming, and additionally introduced diffusion among neighbor sub-volumes makes things worse. Whereas, the NSM explores a very high degree of parallelism among sub-volumes, and parallelization has been widely accepted as the most meaningful way to tackle the performance bottleneck of sequential simulations [26][27]. Thus, adapting parallel discrete event simulation (PDES) techniques to discrete event stochastic simulation would be particularly promising. 
Although there are a few attempts have been conducted [29][30][31], research in this filed is still in its infancy and many issues are in need of further discussion. The next section of the paper presents the background and related work in this domain. In section III, we give the details of design and implementation of model interfaces of LP paradigm and the time warp simulator based on the discrete event simulation framework JAMES II; the benchmark model and experiment results are shown in Section IV; in the last section, we conclude the paper with some future work.II. Background and Related WorkA. Parallel Discrete Event Simulation (PDES)The notion Logical Process (LP) is introduced to PDES as the abstract of the physical process [26], where a system consisting of many physical processes is usually modeled by a set of LP. LP is regarded as the smallest unit that can be executed in PDES and each LP holds a sub-partition of the whole system’s state variables as its private ones. When a LP processes an event, it can only modify the state variables of its own. If one LP needs to modify one of its neighbors’ state variables, it has to schedule an event to the target neighbor. That is to say event message exchanging is the only way that LPs interact with each other. Because of the data dependences or interactions among LPs, synchronization protocols have to be introduced to PDES to guarantee the so-called local causality constraint (LCC) [26]. By now, there are a larger number of synchronization algorithms have been proposed, e.g. the null-message [26], the time warp (TW) [32], breath time warp (BTW) [33] and etc. According to whether can events of LPs be processed optimistically, they are generally divided into two types: conservative algorithms and optimistic algorithms. However, Dematté and Mazza have theoretically pointed out the disadvantages of pure conservative parallel simulation for biochemical reaction systems [31]. B. 
B. NSM and ANSM

The NSM is a spatial variation of Gillespie's SSA that integrates the direct method (DM) [8] with the next reaction method (NRM) [25]. The NSM offers a good way to tackle the spatial aspect of biological systems by partitioning a spatially inhomogeneous system into many much smaller, approximately homogeneous sub-volumes, each of which can be simulated by the SSA separately. However, the NSM is inherently tied to sequential semantics, and all sub-volumes share one common data structure for events or messages. Thus, a direct parallelization of the NSM is confronted with the so-called boundary problem and with the high cost of synchronized access to the common data structure [29]. To obtain an efficient parallel simulation, a parallelization of the NSM must first free the NSM from its sequential semantics and then partition the shared data structure into many "parallel" ones. One such approach is the abstract next sub-volume method (ANSM) [30]. In the ANSM, each sub-volume is modeled as a logical process (LP) according to the LP paradigm of PDES, where each LP holds its own event queue and state variables (see Fig. 1). In addition, a so-called retraction mechanism was introduced in the ANSM (see Algorithm 1). Based on the ANSM, Wang et al. [30] experimentally tested the performance of several PDES algorithms on the YH-SUPE platform [27]. However, their platform is designed for general simulation applications and therefore sacrifices some performance by not taking the characteristics of biological reaction systems into account. Using ideas similar to the ANSM, Dematté and Mazza designed and realized an optimistic simulator. However, they processed events in a time-stepped manner, which loses a certain degree of precision compared with the discrete event manner, and it is very hard to transform a time-stepped simulation into a discrete event one.
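The shared data structure at issue can be seen in a sketch of the NSM's central loop (a hedged Python illustration with an invented toy system, not the NSM implementation): all sub-volumes compete in one priority queue of next-event times, which is exactly the structure the ANSM partitions into per-LP queues.

```python
import heapq
import math
import random

rng = random.Random(1)

def nsm_loop(subvolumes, a0, fire, t_end):
    """Core loop of the (sequential) next sub-volume method: ONE shared heap of
    (next_event_time, subvolume_id) orders the events of all sub-volumes."""
    queue = [(-math.log(rng.random()) / a0(s), i) for i, s in enumerate(subvolumes)]
    heapq.heapify(queue)
    t = 0.0
    while queue and queue[0][0] < t_end:
        t, i = heapq.heappop(queue)    # sub-volume with the earliest next event
        fire(subvolumes, i)            # fire a reaction or diffusion inside sub-volume i
        rate = a0(subvolumes[i])
        if rate > 0:                   # redraw sub-volume i's next event time
            heapq.heappush(queue, (t - math.log(rng.random()) / rate, i))
    return t, subvolumes

# Toy system: pure decay X -> 0 in each of 4 sub-volumes (diffusion omitted for brevity)
def fire(svs, i):
    svs[i]["X"] -= 1

t, svs = nsm_loop([{"X": 50} for _ in range(4)], lambda s: 0.5 * s["X"], fire, 100.0)
print(all(s["X"] >= 0 for s in svs))
```

Every iteration pops from and pushes to the single `queue`, so concurrent sub-volume execution would contend on it; the ANSM removes that contention by giving each LP its own queue.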
In addition, Jeschke et al. [29] designed and implemented a dynamic time-window simulator to execute the NSM in parallel in a grid computing environment; however, they focused mainly on the analysis of communication costs and on determining a better size for the time window.

Fig. 1: the variations from SSA to NSM and from NSM to ANSM

C. JAMES II

JAMES II is an open source discrete event simulation and experimentation framework developed at the University of Rostock in Germany. It focuses on high flexibility and scalability [11][13]. Based on its plug-in scheme [12], each function of JAMES II is defined as a specific plug-in type, and all plug-in types and plug-ins are declared in XML files [13]. Combined with the factory method pattern, JAMES II cleanly separates models from simulators, which makes it very flexible to add and to reuse both models and simulators. In addition, JAMES II supports various modeling formalisms, e.g. cellular automata, the discrete event system specification (DEVS), SpacePi and StochasticPi [14]. Moreover, a well-defined simulator selection mechanism is implemented in JAMES II, which can not only automatically choose a proper simulator according to the modeling formalism but can also pick out a specific simulator from a series of simulators supporting the same formalism according to user settings [15].

III. The Model Interface and Simulator

As mentioned in Section II (Part C), model and simulator are split into two separate parts. Thus, in this section, we introduce the design and implementation of the model interface of the LP paradigm and, more importantly, of the time warp simulator.

A. The Model Interface of the LP Paradigm

JAMES II provides abstract model interfaces for different modeling formalisms, based on which Wang et al. designed and implemented a model interface for the LP paradigm [16]. However, this interface does not scale well for parallel and distributed simulation of larger-scale systems.
In our implementation, we adapt the interface to parallel and distributed settings. First, a neighbor LP's reference is replaced by its name in the LP's neighbor queue, because it is improper, and even dangerous, for a local LP to hold references to other LPs in remote memory spaces. In addition, (pseudo-)random numbers play a crucial role in obtaining valid and meaningful results in stochastic simulations, yet finding a good random number generator (RNG) remains challenging [34]. Thus, in order to focus on our problems, we introduce one of the uniform RNGs of JAMES II into this model interface, where each LP holds a private RNG so that the random number streams of different LPs are stochastically independent.

B. The Time Warp Simulator

Based on the simulator interface provided by JAMES II, we design and implement the time warp simulator, which consists of a (master-)simulator and (LP-)simulators. The simulator works strictly in the master/worker(s) paradigm for fine-grained parallel and distributed stochastic simulations. Communication costs are crucial to the performance of such a fine-grained simulation. Based on the Java remote method invocation (RMI) mechanism, P2P (peer-to-peer) communication is implemented among all (master- and LP-)simulators, where each simulator holds proxies of the target simulators running on remote workers. One advantage of this communication approach is that the PDES code can be moved to various hardware environments, such as clusters, grids and distributed computing environments, with only minor modification; another is that the RMI mechanism is easy to realize and independent of any non-Java libraries. Because of the straggler event problem, states have to be saved in order to roll back events that were processed optimistically. Each time it is modified, the state is cloned into a queue via the Java clone mechanism.
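The copy state saving described here can be sketched as follows (an illustrative Python sketch of the general time warp technique, with invented class and method names, rather than the simulator's Java clone mechanism): each optimistically processed event pushes a snapshot, a straggler restores the latest snapshot older than its timestamp, and snapshots below GVT become fossils.

```python
import copy

class StateSaver:
    """Copy state saving for optimistic (time warp) execution of one LP."""

    def __init__(self):
        self.snapshots = []        # list of (timestamp, deep-copied state)

    def save(self, timestamp, state):
        # Clone the state each time it is about to be modified optimistically.
        self.snapshots.append((timestamp, copy.deepcopy(state)))

    def rollback(self, straggler_time):
        # A straggler event invalidates everything at or after its timestamp:
        # discard those snapshots and restore the latest remaining one.
        while self.snapshots and self.snapshots[-1][0] >= straggler_time:
            self.snapshots.pop()
        return copy.deepcopy(self.snapshots[-1][1]) if self.snapshots else None

    def fossil_collect(self, gvt):
        # Snapshots before GVT can no longer be rolled back to; reclaim them
        # (a real simulator keeps the newest pre-GVT snapshot, omitted for brevity).
        self.snapshots = [s for s in self.snapshots if s[0] >= gvt]

saver = StateSaver()
for t, x in [(0.0, 10), (1.0, 11), (2.0, 9)]:
    saver.save(t, {"X": x})
print(saver.rollback(1.5))   # {'X': 11}
```

The memory cost of keeping a full copy per event is what makes a good GVT scheme, and hence prompt fossil collection, so important in the following paragraph.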
The problem with this copy state saving approach is that it consumes a great deal of memory. However, the problem can be mitigated by a suitable GVT calculation mechanism. The GVT reduction scheme also has a significant impact on the performance of parallel simulators, since GVT marks the highest time boundary below which events can be committed, so that the memory of fossils (processed events and states) older than GVT can be reclaimed. GVT calculation is a knotty problem because of the notorious simultaneous reporting and transient message problems. For our problem, instead of implementing one of the GVT algorithms of references [26] and [28], another GVT algorithm, called Twice Notification (TN-GVT) (see Algorithm 2), is contributed to this already rich repository. This algorithm resembles the synchronous algorithm described in reference [26] (p. 114); however, the two are essentially different: our algorithm never stops the simulators from processing events during GVT reduction, whereas the algorithm in reference [26] blocks all simulators during GVT calculation. As for the transient message problem, it can be neglected in our implementation, because the RMI-based remote communication is synchronous, meaning that a simulator does not continue processing until the remote message has reached its destination. For the same reason, the high-cost message acknowledgement prevalent in many classical asynchronous GVT algorithms is no longer needed either, which benefits the overall performance of the time warp simulator.

IV. Benchmark Model and Experiment Results

A. The Lotka-Volterra Predator-Prey System

In our experiment, a spatial version of the Lotka-Volterra predator-prey system is used as the benchmark model (see Fig. 2).
We chose this system for two reasons: 1) it is a classical experimental model that has been used in many related studies [8][30][31], so it is credible and the simulation results are comparable; 2) it is simple, yet sufficient to test the issues we are interested in. The space of the predator-prey system is partitioned into a 2D N x N grid, where N denotes the edge size of the grid. Initially, the populations of grass, prey and predators are set to 1000 in each sub-volume (LP). In Fig. 2, r1, r2 and r3 stand for the reaction constants of reactions 1, 2 and 3, respectively. We use dGrass, dPrey and dPredator to denote the diffusion rates of grass, prey and predators, respectively. As in reference [8], we assume that the population of grass remains stable, and thus dGrass is set to zero.

R1: Grass + Prey -> 2 Prey (1)
R2: Predator + Prey -> 2 Predator (2)
R3: Predator -> NULL (3)
r1 = 0.01; r2 = 0.01; r3 = 10 (4)
dGrass = 0.0; dPrey = 2.5; dPredator = 5.0 (5)

Fig. 2: predator-prey system

B. Experiment Results

The simulation runs were executed on a Linux cluster with 40 computing nodes. Each computing node is equipped with two 64-bit 2.53 GHz Intel Xeon quad-core processors and 24 GB of RAM, and the nodes are interconnected via Gigabit Ethernet. The operating system is Kylin Server 3.5, with kernel 2.6.18. Experiments were conducted on benchmark models of different sizes to investigate the execution time and speedup of the time warp simulator. As shown in Fig. 3, the execution times of simulations on a single processor with 8 cores are compared. The results show that simulating larger-scale systems takes more wall clock time for the same simulation time, confirming that larger-scale systems lead to more events in the same time interval. More importantly, the blue line shows that sequential simulation performance declines very quickly as the model scale becomes large.
The bottleneck of the sequential simulator is the cost of accessing a long event queue to choose the next event. Moreover, the comparison between Group 1 and Group 2 in this experiment shows that a high diffusion rate greatly increases the simulation time in both sequential and parallel simulations. This is because the LP paradigm has to split a diffusion into two events (a diffusion-out and a diffusion-in event) for the two interacting LPs involved, and a high diffusion rate leads to a high proportion of diffusion events relative to reaction events. In the second step, shown in Fig. 4, the relationship between the speedup achieved by time warp for two different model sizes and the number of worker cores involved is demonstrated. The speedup is calculated against the sequential execution, using the NSM, of the spatial reaction-diffusion model with the same model size and parameters. Fig. 4 compares the speedup of time warp on a 64 x 64 grid and on a 100 x 100 grid. In the case of the 64 x 64 grid, when only one node is used, the lowest speedup (slightly greater than 1) is achieved with two cores, and the highest speedup (about 6) with 8 cores. The influence of the number of cores used in the parallel simulation is also investigated: in most cases, a larger number of cores brings considerable improvement in the performance of the parallel simulation. Comparing the two results in Fig. 4, the simulation of the larger model also achieves a better speedup. Combined with the timing tests (Fig. 3), we find that the sequential simulator's performance declines sharply when the model scale becomes very large, which correspondingly gives the time warp simulator a better speedup.

Fig. 3: Execution time (wall clock time) of sequential and time warp simulation with respect to different model sizes (N = 32, 64, 100 and 128) and model parameters, based on a single computing node with 8 cores.
The results are grouped by diffusion rate (Group 1: Sequential 1 and Time Warp 1, with dPrey = 2.5, dPredator = 5.0; Group 2: Sequential 2 and Time Warp 2, with dPrey = 0.25, dPredator = 0.5).

Fig. 4: Speedup of time warp with respect to the number of worker cores and the model size (N = 64 and 100). Worker cores are chosen from one computing node. Diffusion rates are dPrey = 2.5, dPredator = 5.0 and dGrass = 0.0.

V. Conclusion and Future Work

In this paper, a time warp simulator based on the discrete event simulation framework JAMES II is designed and implemented for fine-grained parallel and distributed discrete event spatial stochastic simulation of biological reaction systems. Several challenges have been overcome, such as state saving, rollback and, especially, GVT reduction in the parallel execution of simulations. The Lotka-Volterra predator-prey system is chosen as the benchmark model to test the performance of our time warp simulator, and the best experimental results show a speedup of about 6 over the sequential simulation. The domain this paper addresses is still in its infancy, and many interesting issues are worthy of further investigation; for example, there are many other excellent optimistic PDES synchronization algorithms (e.g. BTW), and as a next step we would like to add some of them to JAMES II. In addition, Gillespie approximation methods (tau-leaping [10], etc.) sacrifice some precision for higher simulation speed but still do not address the spatial aspect of biological reaction systems. The combination of the spatial element with approximation methods would be very interesting and promising; however, the parallel execution of tau-leap methods still has many obstacles to overcome.

Acknowledgment

This work is supported by the National Natural Science Foundation of China (NSF) Grant (No. 60773019) and the Ph.D. Programs Foundation of the Ministry of Education of China (No. 200899980004).
The authors would like to express their great gratitude to Dr. Jan Himmelspach and Dr. Roland Ewald at the University of Rostock, Germany, for their invaluable advice and kind help with JAMES II.

References
[1] H. Kitano, "Computational systems biology," Nature, vol. 420, no. 6912, pp. 206-210, November 2002.
[2] H. Kitano, "Systems biology: a brief overview," Science, vol. 295, no. 5560, pp. 1662-1664, March 2002.
[3] A. Aderem, "Systems biology: its practice and challenges," Cell, vol. 121, no. 4, pp. 511-513, May 2005. [Online]. Available: http://dx.doi.org/10.1016/j.cell.2005.04.020
[4] H. de Jong, "Modeling and simulation of genetic regulatory systems: a literature review," Journal of Computational Biology, vol. 9, no. 1, pp. 67-103, January 2002.
[5] C. W. Gardiner, Handbook of Stochastic Methods: for Physics, Chemistry and the Natural Sciences (Springer Series in Synergetics), 3rd ed. Springer, April 2004.
[6] D. T. Gillespie, "Simulation methods in systems biology," in Formal Methods for Computational Systems Biology, ser. Lecture Notes in Computer Science, M. Bernardo, P. Degano, and G. Zavattaro, Eds. Berlin, Heidelberg: Springer, 2008, vol. 5016, ch. 5, pp. 125-167.
[7] Y. Tao, Y. Jia, and G. T. Dewey, "Stochastic fluctuations in gene expression far from equilibrium: omega expansion and linear noise approximation," The Journal of Chemical Physics, vol. 122, no. 12, 2005.
[8] D. T. Gillespie, "Exact stochastic simulation of coupled chemical reactions," Journal of Physical Chemistry, vol. 81, no. 25, pp. 2340-2361, December 1977.
[9] D. T. Gillespie, "Stochastic simulation of chemical kinetics," Annual Review of Physical Chemistry, vol. 58, no. 1, pp. 35-55, 2007.
[10] D. T. Gillespie, "Approximate accelerated stochastic simulation of chemically reacting systems," The Journal of Chemical Physics, vol. 115, no. 4, pp. 1716-1733, 2001.
[11] J. Himmelspach, R. Ewald, and A. M. Uhrmacher, "A flexible and scalable experimentation layer," in WSC '08: Proceedings of the 40th Conference on Winter Simulation, 2008, pp. 827-835.
[12] J. Himmelspach and A. M. Uhrmacher, "Plug'n simulate," in 40th Annual Simulation Symposium (ANSS'07). Washington, DC, USA: IEEE, March 2007, pp. 137-143.
[13] R. Ewald, J. Himmelspach, M. Jeschke, S. Leye, and A. M. Uhrmacher, "Flexible experimentation in the modeling and simulation framework JAMES II - implications for computational systems biology," Briefings in Bioinformatics, vol. 11, no. 3, 2010.
[14] A. Uhrmacher, J. Himmelspach, M. Jeschke, M. John, S. Leye, C. Maus, M. Röhl, and R. Ewald, "One modelling formalism & simulator is not enough! A perspective for computational biology based on JAMES II," in Formal Methods in Systems Biology, ser. Lecture Notes in Computer Science, J. Fisher, Ed. Berlin, Heidelberg: Springer, 2008, vol. 5054, ch. 9, pp. 123-138. [Online]. Available: http://dx.doi.org/10.1007/978-3-540-68413-8_9
[15] R. Ewald, J. Himmelspach, and A. M. Uhrmacher, "An algorithm selection approach for simulation systems," in Proceedings of PADS, 2008, pp. 91-98.
[16] B. Wang, J. Himmelspach, R. Ewald, Y. Yao, and A. M. Uhrmacher, "Experimental analysis of logical process simulation algorithms in JAMES II," in Proceedings of the Winter Simulation Conference, M. D. Rossetti, R. R. Hill, B. Johansson, A. Dunkin, and R. G. Ingalls, Eds. IEEE, 2009, pp. 1167-1179.
[17] R. Ewald, J. Rössel, J. Himmelspach, and A. M. Uhrmacher, "A plug-in-based architecture for random number generation in simulation systems," in WSC '08: Proceedings of the 40th Conference on Winter Simulation, 2008, pp. 836-844.
[18] J. Elf and M. Ehrenberg, "Spontaneous separation of bi-stable biochemical systems into spatial domains of opposite phases," Systems Biology, vol. 1, no. 2, pp. 230-236, December 2004.
[19] K. Takahashi, S. Arjunan, and M. Tomita, "Space in systems biology of signaling pathways? Towards intracellular molecular crowding in silico," FEBS Letters, vol. 579, no. 8, pp. 1783-1788, March 2005.
[20] J. V. Rodriguez, J. A. Kaandorp, M. Dobrzynski, and J. G. Blom, "Spatial stochastic modelling of the phosphoenolpyruvate-dependent phosphotransferase (PTS) pathway in Escherichia coli," Bioinformatics, vol. 22, no. 15, pp. 1895-1901, August 2006.
[21] D. Ridgway, G. Broderick, and M. Ellison, "Accommodating space, time and randomness in network simulation," Current Opinion in Biotechnology, vol. 17, no. 5, pp. 493-498, October 2006.
[22] J. V. Rodriguez, J. A. Kaandorp, M. Dobrzynski, and J. G. Blom, "Spatial stochastic modelling of the phosphoenolpyruvate-dependent phosphotransferase (PTS) pathway in Escherichia coli," Bioinformatics, vol. 22, no. 15, pp. 1895-1901, August 2006.
[23] W. G. Wilson, A. M. Deroos, and E. Mccauley, "Spatial instabilities within the diffusive Lotka-Volterra system: individual-based simulation results," Theoretical Population Biology, vol. 43, no. 1, pp. 91-127, February 1993.
[24] K. Kruse and J. Elf, "Kinetics in spatially extended systems," in System Modeling in Cellular Biology: From Concepts to Nuts and Bolts, Z. Szallasi, J. Stelling, and V. Periwal, Eds. Cambridge, MA: MIT Press, 2006, pp. 177-198.
[25] M. A. Gibson and J. Bruck, "Efficient exact stochastic simulation of chemical systems with many species and many channels," The Journal of Physical Chemistry A, vol. 104, no. 9, pp. 1876-1889, March 2000.
[26] R. M. Fujimoto, Parallel and Distributed Simulation Systems (Wiley Series on Parallel and Distributed Computing). Wiley-Interscience, January 2000.
[27] Y. Yao and Y. Zhang, "Solution for analytic simulation based on parallel processing," Journal of System Simulation, vol. 20, no. 24, pp. 6617-6621, 2008.
[28] G. Chen and B. K. Szymanski, "DSIM: scaling time warp to 1,033 processors," in WSC '05: Proceedings of the 37th Conference on Winter Simulation, 2005, pp. 346-355.
[29] M. Jeschke, A. Park, R. Ewald, R. Fujimoto, and A. M. Uhrmacher, "Parallel and distributed spatial simulation of chemical reactions," in 2008 22nd Workshop on Principles of Advanced and Distributed Simulation. Washington, DC, USA: IEEE, June 2008, pp. 51-59.
[30] B. Wang, Y. Yao, Y. Zhao, B. Hou, and S. Peng, "Experimental analysis of optimistic synchronization algorithms for parallel simulation of reaction-diffusion systems," in International Workshop on High Performance Computational Systems Biology, pp. 91-100, October 2009.
[31] L. Dematté and T. Mazza, "On parallel stochastic simulation of diffusive systems," in Computational Methods in Systems Biology, M. Heiner and A. M. Uhrmacher, Eds. Berlin, Heidelberg: Springer, 2008, vol. 5307, ch. 16, pp. 191-210.
[32] D. R. Jefferson, "Virtual time," ACM Transactions on Programming Languages and Systems, vol. 7, no. 3, pp. 404-425, July 1985.
[33] J. S. Steinman, "Breathing time warp," SIGSIM Simulation Digest, vol. 23, no. 1, pp. 109-118, July 1993. [Online]. Available: http://dx.doi.org/10.1145/174134.158473
[34] S. K. Park and K. W. Miller, "Random number generators: good ones are hard to find," Communications of the ACM, vol. 31, no. 10, pp. 1192-1201, October 1988.
APA, Harvard, Vancouver, ISO, and other styles
25

Cheng, Tan, Hui Chen, and Qingsong Wei. "The Role of Roller Rotation Pattern in the Spreading Process of Polymer/Short-Fiber Composite Powder in Selective Laser Sintering." Polymers 14, no. 12 (June 9, 2022): 2345. http://dx.doi.org/10.3390/polym14122345.

Full text
Abstract:
In this study, for the first time, a forward-rotating roller is proposed for the spreading of CF/PA12 composite powder in the selective laser sintering (SLS) process. The mesoscopic kinetic mechanism of composite particle spreading is investigated by utilizing the "multi-spherical" element within the discrete element method (DEM). The commercial software EDEM and the open-source DEM particle simulation code LIGGGHTS-PUBLIC are used for the simulations in this work. It is found that the forward-rotating roller produces a stronger compaction of the powder pile than the conventional counter-rotating roller does, thus increasing the coordination number and mass flow rate of the particle flow, which significantly improves the powder bed quality. In addition, the forward-rotating pattern generates a braking friction force on the particles in the direction opposite to their spreading, which affects the particle dynamics and the deposition process. Therefore, appropriately increasing the roller rotation speed to make this force comparable to the roller dragging force could result in faster deposition of the composite particles to form a stable powder bed. This mechanism allows the forward-rotating roller to maintain good powder bed quality even at a high spreading speed, thus providing greater potential for the industry to improve the spreading efficiency of the SLS process.
APA, Harvard, Vancouver, ISO, and other styles
26

Khan, Athar Ajaz, and János Abonyi. "Simulation of Sustainable Manufacturing Solutions: Tools for Enabling Circular Economy." Sustainability 14, no. 15 (August 8, 2022): 9796. http://dx.doi.org/10.3390/su14159796.

Full text
Abstract:
At the current worrisome rate of global consumption, the linear economy model of producing goods, using them, and then disposing of them with no thought of the environmental, social, or economic consequences, is unsustainable and points to a deeply flawed manufacturing framework. Circular economy (CE) is presented as an alternative framework to address the management of emissions, scarcity of resources, and economic sustainability such that resources are kept 'in the loop'. In the context of manufacturing supply chains (SCs), the 6R's of rethink, refuse, reduce, reuse, repair, and recycle have been proposed in line with the achievement of targeted net-zero emissions. To bring that about, changes in the framework for assessing the sustainability of manufacturing SCs are indispensable. Verifiable and empirical model-based approaches such as modeling and simulation (M&S) techniques find pronounced use in realizing the ideal of CE. Simulation models find extensive use across various aspects of SCs, including analysis of impacts and support for optimal re-design and operation. Using the PRISMA framework to sift through published research gathered from SCOPUS, this review is based on 202 research papers spanning from 2015 to the present, and it provides an overview of the simulation tools being put to use in the context of sustainability in manufacturing SCs, highlighting the various aspects and contours of the collected research articles. This article focuses on the three major simulation techniques in the literature, namely, Discrete Event Simulation (DES), Agent-Based Simulation (ABS), and System Dynamics (SD). With regard to their application in manufacturing SCs, each modeling technique has its pros and cons, which are evident in terms of data requirements, model magnification, model resolution, and environment interaction, among others.
These limitations are remedied through the use of hybrids, wherein two or more modeling techniques are applied to obtain the desired results. The article also indicates various open-source software solutions that are being employed in research and industry. This article, in essence, has three objectives: first, to present to prospective researchers the current state of research, the concerns that have been raised in the field of sustainability modeling, and how they have been resolved; second, to serve as a comprehensive bibliography of peer-reviewed research published from 2015 to 2022; and, finally, to indicate the limitations of the techniques with regard to sustainability assessment. The article also indicates the necessity of a new M&S framework and its prerequisites.
APA, Harvard, Vancouver, ISO, and other styles
27

Manríquez, F., N. Morales, G. Pinilla, and I. Piñeyro. "Discrete event simulation to design open-pit mine production policy in the event of snowfall." International Journal of Mining, Reclamation and Environment 33, no. 8 (September 26, 2018): 572–88. http://dx.doi.org/10.1080/17480930.2018.1514963.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Davis, Sarah, Emma Simpson, Jean Hamilton, Marrissa Martyn-St James, Andrew Rawdin, Ruth Wong, Edward Goka, Neil Gittoes, and Peter Selby. "Denosumab, raloxifene, romosozumab and teriparatide to prevent osteoporotic fragility fractures: a systematic review and economic evaluation." Health Technology Assessment 24, no. 29 (June 2020): 1–314. http://dx.doi.org/10.3310/hta24290.

Full text
Abstract:
Background Fragility fractures are fractures that result from mechanical forces that would not ordinarily result in fracture. Objectives The objectives were to evaluate the clinical effectiveness, safety and cost-effectiveness of non-bisphosphonates {denosumab [Prolia®; Amgen Inc., Thousand Oaks, CA, USA], raloxifene [Evista®; Daiichi Sankyo Company, Ltd, Tokyo, Japan], romosozumab [Evenity®; Union Chimique Belge (UCB) S.A. (Brussels, Belgium) and Amgen Inc.] and teriparatide [Forsteo®; Eli Lilly and Company, Indianapolis, IN, USA]}, compared with each other, bisphosphonates or no treatment, for the prevention of fragility fracture. Data sources For the clinical effectiveness review, nine electronic databases (including MEDLINE, EMBASE and the World Health Organization International Clinical Trials Registry Platform) were searched up to July 2018. Review methods A systematic review and network meta-analysis of fracture and femoral neck bone mineral density were conducted. A review of published economic analyses was undertaken and a model previously used to evaluate bisphosphonates was adapted. Discrete event simulation was used to estimate lifetime costs and quality-adjusted life-years for a simulated cohort of patients with heterogeneous characteristics. This was done for each non-bisphosphonate treatment, a strategy of no treatment, and the five bisphosphonate treatments previously evaluated. The model was populated with effectiveness evidence from the systematic review and network meta-analysis. All other parameters were estimated from published sources. An NHS and Personal Social Services perspective was taken, and costs and benefits were discounted at 3.5% per annum. Fracture risk was estimated from patient characteristics using the QFracture® (QFracture-2012 open source revision 38, Clinrisk Ltd, Leeds, UK) and FRAX® (web version 3.9, University of Sheffield, Sheffield, UK) tools. 
The relationship between fracture risk and incremental net monetary benefit was estimated using non-parametric regression. A probabilistic sensitivity analysis and scenario analyses were used to assess uncertainty. Results Fifty-two randomised controlled trials of non-bisphosphonates were included in the clinical effectiveness systematic review and an additional 51 randomised controlled trials of bisphosphonates were included in the network meta-analysis. All treatments had beneficial effects compared with placebo for vertebral, non-vertebral and hip fractures, with hazard ratios varying from 0.23 to 0.94, depending on treatment and fracture type. The effects on vertebral fractures and the percentage change in bone mineral density were statistically significant for all treatments. The rate of serious adverse events varied across trials (0–33%), with most between-group differences not being statistically significant for comparisons with placebo/no active treatment, non-bisphosphonates or bisphosphonates. The incremental cost-effectiveness ratios were > £20,000 per quality-adjusted life-year for all non-bisphosphonate interventions compared with no treatment across the range of QFracture and FRAX scores expected in the population eligible for fracture risk assessment. The incremental cost-effectiveness ratio for denosumab may fall below £30,000 per quality-adjusted life-year at very high levels of risk or for high-risk patients with specific characteristics. Raloxifene was dominated by no treatment (resulted in fewer quality-adjusted life-years) in most risk categories. Limitations The incremental cost-effectiveness ratios are uncertain for very high-risk patients. Conclusions Non-bisphosphonates are effective in preventing fragility fractures, but the incremental cost-effectiveness ratios are generally greater than the commonly applied threshold of £20,000–30,000 per quality-adjusted life-year. Study registration This study is registered as PROSPERO CRD42018107651. 
Funding This project was funded by the National Institute for Health Research (NIHR) Health Technology Assessment programme and will be published in full in Health Technology Assessment; Vol. 24, No. 29. See the NIHR Journals Library website for further project information.
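The abstract above notes that the discrete event simulation accumulated lifetime costs and quality-adjusted life-years discounted at 3.5% per annum. The sketch below is a minimal, hypothetical Python illustration of that discounting step applied to DES output (the event times and QALY values are invented; this is not the authors' model):

```python
def discounted(value, time_years, rate=0.035):
    """Discount a cost or QALY accrued `time_years` into the future at
    3.5% per annum, the rate stated in the abstract."""
    return value / (1.0 + rate) ** time_years

# Hypothetical DES output: (event time in years, QALYs accrued at that event)
events = [(1.0, 0.9), (2.0, 0.85), (3.0, 0.8)]
total_qalys = sum(discounted(q, t) for t, q in events)
print(round(total_qalys, 3))
```

Summing discounted costs and QALYs per simulated patient, then averaging over the cohort, yields the inputs to the incremental cost-effectiveness ratios discussed above.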
APA, Harvard, Vancouver, ISO, and other styles
29

Mazurenko, O. I., and I. A. Rusinov. "DISCRETE-EVENT SIMULATION MODELLING OF SEA COAL TERMINAL WAREHOUSE USING ANYLOGIC." Vestnik Gosudarstvennogo universiteta morskogo i rechnogo flota imeni admirala S. O. Makarova 14, no. 2 (June 28, 2022): 181–98. http://dx.doi.org/10.21821/2309-5180-2022-14-2-181-198.

Full text
Abstract:
The issues of simulation modeling of complex transport nodes and systems are considered in the paper. It is noted that warehouses in seaports act as buffers and smooth out the difference between the incoming and outgoing cargo traffic of land and sea transport. When designing offshore coal terminals, choosing the size, layout and equipment of the warehouse is one of the cornerstones. The basic principles for the placement of coal piles, depending on the size and capacity of an open warehouse, the size of ship lots and the average shelf life, are provided in the paper. One of the main indicators that determine the correctness of the chosen size and layout of an open warehouse is the coefficient of utilization of the warehouse capacity, and formulas for its calculation are given. The principles of stacking coal in piles are then defined: there are four main laying methods - conical, chevron, layering and windowed. Next, the dependence of the coal storage density on the mass of the stack, and of the warehouse utilization rate on the average storage time of the cargo, is determined. It is concluded that the impact of stochastic events occurring in the terminal warehouse can be fully assessed only with the use of simulation modeling. Further, it is noted that mixing and homogenization operations can be carried out in open warehouses, and the differences and principles of these processes are given. Approaches to the selection of warehouse equipment and cyclical machines are identified, and the advantages and disadvantages of the various types of warehouse machines are indicated. Definitions of the technical and operational performance of the machines are given. The AnyLogic environment is chosen for modeling. Using its elements, a digital copy of the coal terminal warehouse serving the material flow (bulk cargo) is created.
Further, a detailed explanation of the principles and approach to creating a digital model is given, the internal and external links of the elements are clarified. In conclusion, information about the options for setting up and managing the model, and brief information about the results obtained are given.
APA, Harvard, Vancouver, ISO, and other styles
30

Zhao, Zhang, Ruixin Zhang, Jiandong Sun, and Shuaikang Lv. "Optimization of Overcast Stripping Technology Parameters Based on Discrete Event System Simulation." Advances in Civil Engineering 2022 (February 1, 2022): 1–12. http://dx.doi.org/10.1155/2022/7654893.

Full text
Abstract:
In the process of open-pit mining, the system parameters determine the economic benefit and production efficiency of the mine. Conventional optimization involves building a system model for the process parameters. However, complex large-scale systems such as open-pit mining are difficult to model, resulting in a failure to obtain effective solutions. This paper describes a system simulation method for the process parameters involved in open-pit mining. The nature and interaction of each component of the system are analyzed in detail, and the logical flow of each layer of the system is determined. Taking the basic operational linkages of the equipment as the system drivers, we obtained the operational information flow of the dragline. The barycentric circular projection method is used to simplify the control logic of the system, and a system storage state model is constructed to identify dynamic changes in the system and obtain the operation parameters of the dragline. A discrete event system is used for quantitative modeling, and the event step method is employed to advance the simulation process and obtain decision information. Finally, simulations are performed using various system parameters. The simulation results show that the maximum efficiency is achieved when the dragline height is approximately 13 m, giving a capacity of 4276.52 m³/h. Error analysis indicates that the modeling error is minimized using a simulation correction coefficient of α = 0.94.
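The event-step (next-event time-advance) scheme mentioned in the abstract is the core loop of any discrete event simulator: instead of ticking the clock in fixed increments, the simulation jumps directly to the next scheduled event. A generic Python sketch of that loop (an illustration only, not the authors' dragline model):

```python
import heapq

def run_event_step(events, horizon):
    """Next-event time advance: jump the clock to the earliest scheduled
    event instead of ticking in fixed increments."""
    heapq.heapify(events)            # list of (time, label) pairs
    log = []
    while events:
        clock, label = heapq.heappop(events)
        if clock > horizon:          # stop at the simulation horizon
            break
        log.append((clock, label))   # a real model would update state and
                                     # schedule follow-up events here
    return log

# Events may be scheduled in any order; they are processed by time stamp.
trace = run_event_step([(3.0, "dump"), (1.0, "load"), (2.0, "haul")], 10.0)
```

In a full model, processing an event would update the system state and schedule follow-up events, for example a "load" event scheduling the corresponding "haul".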
APA, Harvard, Vancouver, ISO, and other styles
31

Berger, Moritz, and Matthias Schmid. "Semiparametric regression for discrete time-to-event data." Statistical Modelling 18, no. 3-4 (January 17, 2018): 322–45. http://dx.doi.org/10.1177/1471082x17748084.

Full text
Abstract:
Time-to-event models are a popular tool to analyse data where the outcome variable is the time to the occurrence of a specific event of interest. Here, we focus on the analysis of time-to-event outcomes that are either intrinsically discrete or grouped versions of continuous event times. In the literature, there exists a variety of regression methods for such data. This tutorial provides an introduction to how these models can be applied using open source statistical software. In particular, we consider semiparametric extensions comprising the use of smooth nonlinear functions and tree-based methods. All methods are illustrated by data on the duration of unemployment of US citizens.
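Discrete-time hazard models of the kind this tutorial covers are commonly fitted by expanding each subject into one record per period at risk (the person-period format) and then applying binary regression to the event indicator. A sketch of that expansion step (a generic illustration, not code from the article):

```python
def to_person_period(subjects):
    """Expand (observed_time, event_indicator) pairs into the person-period
    format: one row per subject per period at risk, with y = 1 only in the
    period in which the event occurs (censored subjects never get y = 1)."""
    rows = []
    for sid, (time, event) in enumerate(subjects):
        for t in range(1, time + 1):
            rows.append({"id": sid, "period": t,
                         "y": 1 if (event and t == time) else 0})
    return rows

# Subject 0 experiences the event in period 3; subject 1 is censored after 2.
data = to_person_period([(3, True), (2, False)])
```

The resulting rows can then be passed to any logistic regression routine, with the period entered as a categorical or smooth term to recover the baseline hazard.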
APA, Harvard, Vancouver, ISO, and other styles
32

Aleisa, Esra, Mohammad Al-Ahmad, and Abdulla M. Taha. "Design and management of a sewage pit through discrete-event simulation." SIMULATION 87, no. 11 (March 7, 2011): 989–1001. http://dx.doi.org/10.1177/0037549711398262.

Full text
Abstract:
This paper reports two discrete-event simulation studies to model the activities of a residential waste treatment facility and prepare it to accept additional wastewaters through tanker trucks. The first simulation study models the wastewater treatment facility to ensure its ability to handle the planned added capacity arriving through the pit, while the second study simulates various managerial strategies to handle the traffic, testing, and unload procedures of tanker trucks arriving at the facility. The simulation models were statistically validated and the outcomes of the study were implemented in reality. The wastewater treatment facility extension suggested by this study was implemented and launched in mid 2008 to accept residential wastewater tanker trucks. This has saved the environment over 6,000 m³ daily from being dumped into the open unlined terrestrial landfills. Simulation proved to be an excellent tool in the facility planning effort, as it ensured smooth flow lines of tanker truck load discharge and the best utilization of facilities on site.
APA, Harvard, Vancouver, ISO, and other styles
33

Mostafa, Sherif, Nicholas Chileshe, and Tariq Abdelhamid. "Lean and agile integration within offsite construction using discrete event simulation." Construction Innovation 16, no. 4 (October 3, 2016): 483–525. http://dx.doi.org/10.1108/ci-09-2014-0043.

Full text
Abstract:
Purpose The purpose of this study is to systematically analyse and synthesise the existing research published on offsite manufacturing/construction. The study aims to highlight and associate the core elements for adopting the offsite concept in different construction contexts. This ultimately facilitates greater uptake of the offsite concept. Design/methodology/approach The research study was carried out through a systematic literature review (SLR). The SLR was conducted to identify and understand the existing themes in the offsite research landscape, evaluate contributions and compile knowledge, thereby identifying potential directions of future research. Major electronic databases were explored to gather literature on the offsite concept, lean and agile principles and simulation. A total of 62 related articles published between 1992 and 2015 have been included in this study. The relevant literature was systematically analysed and synthesised to present the emerging offsite themes. Findings The descriptive and thematic analyses presented in this paper have identified related offsite research studies that have contributed to setting a firm foundation of the offsite concept in different construction contexts. Each of the 62 articles was examined for achieving the aim and objectives of this study, the method of data collection and coverage of offsite themes. The results of the analyses revealed that the articles mostly provide information on the offsite concept and its definitions (53 per cent) and offsite barriers and/or drivers (27 per cent). However, limited attention has been paid to the integration of lean and agile principles (13 per cent) and simulation (7 per cent) within the offsite concept, which are therefore more open to research within the offsite concept. Research limitations/implications The literature review highlights the main themes and components of the offsite construction concept.
This forms a solid basis and motivation for researchers and practitioners to build on to enhance the uptake of the offsite concept in different contexts. This study also presents a research roadmap within the offsite concept, along with a recommendation for further research to be conducted using the research framework proposed in this study. The framework could lead to validation of using simulation to integrate lean and agile principles within the offsite concept. Originality/value This paper presents a systematic review of the literature related to offsite construction in different contexts. The emerging components, that is, offsite definitions, drivers and/or barriers, lean and agile principles and simulation have been highlighted and discussed thematically. A research framework that enables pursuit of the integration of lean and agile principles offsite through the lens of simulation has been proposed. The framework is expected to open up new opportunities on the effectiveness of offsite development in different contexts.
APA, Harvard, Vancouver, ISO, and other styles
34

Garcia Martinez, Ruben Febronio, Jose Abraham Valdivia Puga, Pedro Daniel Urbina Coronado, Axel Alejandro Gómez Ortigoza, Pedro Orta-Castañon, and Horacio Ahuett-Garza. "A flexible and open environment for discrete event simulations and smart manufacturing." International Journal on Interactive Design and Manufacturing (IJIDeM) 15, no. 4 (October 18, 2021): 509–24. http://dx.doi.org/10.1007/s12008-021-00778-w.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Courty, Laurent Guillaume, Adrián Pedrozo-Acuña, and Paul David Bates. "Itzï (version 17.1): an open-source, distributed GIS model for dynamic flood simulation." Geoscientific Model Development 10, no. 4 (May 2, 2017): 1835–47. http://dx.doi.org/10.5194/gmd-10-1835-2017.

Full text
Abstract:
Worldwide, floods are acknowledged as one of the most destructive hazards. In human-dominated environments, their negative impacts are ascribed not only to the increase in frequency and intensity of floods but also to a strong feedback between the hydrological cycle and anthropogenic development. In order to advance a more comprehensive understanding of this complex interaction, this paper presents the development of a new open-source tool named Itzï that enables the 2-D numerical modelling of rainfall–runoff processes and surface flows integrated with the open-source geographic information system (GIS) software known as GRASS. Therefore, it takes advantage of the ability given by GIS environments to handle datasets with variations in both temporal and spatial resolutions. Furthermore, the presented numerical tool can handle datasets from different sources with varied spatial resolutions, facilitating the preparation and management of input and forcing data. This ability reduces the preprocessing time usually required by other models. Itzï uses a simplified form of the shallow water equations, the damped partial inertia equation, for the resolution of surface flows, and the Green–Ampt model for the infiltration. The source code is now publicly available online, along with complete documentation. The numerical model is verified against three different test cases: firstly, a comparison with an analytic solution of the shallow water equations is introduced; secondly, a hypothetical flooding event in an urban area is implemented, where results are compared to those from an established model using a similar approach; and lastly, the reproduction of a real inundation event that occurred in the city of Kingston upon Hull, UK, in June 2007, is presented. The numerical approach proved its ability to reproduce the analytic and synthetic test cases. 
Moreover, simulation results of the real flood event showed its suitability for identifying areas affected by flooding, which were verified against those recorded after the event by local authorities.
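The Green–Ampt model used by Itzï for infiltration has a simple explicit form: the infiltration capacity f = Ks·(1 + ψ·Δθ/F) decays towards the saturated conductivity Ks as the cumulative infiltrated depth F grows. A toy explicit integration of that textbook formula (parameter values illustrative, not Itzï's implementation):

```python
def green_ampt_cumulative(Ks, psi, dtheta, dt, steps, F0=1e-6):
    """Explicitly integrate Green-Ampt infiltration: the capacity
    f = Ks * (1 + psi * dtheta / F) falls as cumulative depth F grows
    (Ks: saturated conductivity, psi: wetting-front suction head,
    dtheta: moisture deficit; units must be consistent)."""
    F, series = F0, []
    for _ in range(steps):
        f = Ks * (1.0 + psi * dtheta / F)   # current infiltration capacity
        F += f * dt                          # ponded conditions assumed
        series.append(F)
    return series

depths = green_ampt_cumulative(Ks=1e-5, psi=0.1, dtheta=0.3, dt=60.0, steps=5)
```

The cumulative depth grows monotonically while the per-step increments shrink, which is the characteristic Green–Ampt behaviour.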
APA, Harvard, Vancouver, ISO, and other styles
36

Salama, Abubakary. "Prediction of Haul Units’ Requirements for the Open Pit Mines Operating Towards Closure." Tanzania Journal of Engineering and Technology 38, no. 2 (December 31, 2019): 217–29. http://dx.doi.org/10.52339/tjet.v38i2.506.

Full text
Abstract:
The determination of haul units is one of the important steps in the equipment selection process. The process should aim at minimizing operational costs and creating an optimal match for the haulage system. When open pit mines are operating towards closure, reducing operational costs yields significant savings, which can be used to implement some of the closure plans. In this paper, the prediction of haul units for an open pit mine which operates towards the final pit limit was analysed using discrete event simulation and multiple linear regression. The result shows that changes in production rate have a significant impact on the number of haul units in operation. The obtained P-value suggests that there is a strong correlation between the haul units' requirement, production rates, and haul distances. This reveals that the prediction of haul units' requirements can be achieved using a developed regression model. It was observed that the approach of using discrete event simulation and multiple linear regression is useful in integrating fleet size decisions into a multi-period production schedule.
APA, Harvard, Vancouver, ISO, and other styles
37

Wang, Bin, and Corrado Fidelibus. "An Open-Source Code for Fluid Flow Simulations in Unconventional Fractured Reservoirs." Geosciences 11, no. 2 (February 22, 2021): 106. http://dx.doi.org/10.3390/geosciences11020106.

Full text
Abstract:
In this article, an open-source code for the simulation of fluid flow, including adsorption, transport, and indirect hydromechanical coupling in unconventional fractured reservoirs is described. The code leverages cutting-edge numerical modeling capabilities like automatic differentiation, stochastic fracture modeling, multicontinuum modeling, and discrete fracture models. In the fluid mass balance equation, specific physical mechanisms, unique to organic-rich source rocks, are included, like an adsorption isotherm, a dynamic permeability-correction function, and an Embedded Discrete Fracture Model (EDFM) with fracture-to-well connectivity. The code is validated against an industrial simulator and applied for a study of the performance of the Barnett shale reservoir, where adsorption, gas slippage, diffusion, indirect hydromechanical coupling, and propped fractures are considered. It is the first open-source code available to facilitate the modeling and production optimization of fractured shale-gas reservoirs. The modular design also facilitates rapid prototyping and demonstration of new models. This article also contains a quantitative analysis of the accuracy and limitations of EDFM for gas production simulation in unconventional fractured reservoirs.
APA, Harvard, Vancouver, ISO, and other styles
38

Fang, Luning, Ruochun Zhang, Colin Vanden Heuvel, Radu Serban, and Dan Negrut. "Chrono::GPU: An Open-Source Simulation Package for Granular Dynamics Using the Discrete Element Method." Processes 9, no. 10 (October 13, 2021): 1813. http://dx.doi.org/10.3390/pr9101813.

Full text
Abstract:
We report on an open-source, publicly available C++ software module called Chrono::GPU, which uses the Discrete Element Method (DEM) to simulate large granular systems on Graphics Processing Unit (GPU) cards. The solver supports the integration of granular material with geometries defined by triangle meshes, as well as co-simulation with the multi-physics simulation engine Chrono. Chrono::GPU adopts a smooth contact formulation and implements various common contact force models, such as the Hertzian model for normal force and the Mindlin friction force model, which takes into account the history of tangential displacement, rolling frictional torques, and cohesion. We report on the code structure and highlight its use of mixed data types for reducing the memory footprint and increasing simulation speed. We discuss several validation tests (wave propagation, rotating drum, direct shear test, crater test) that compare the simulation results against experimental data or results reported in the literature. In another benchmark test, we demonstrate linear scaling with a problem size up to the GPU memory capacity; specifically, for systems with 130 million DEM elements. The simulation infrastructure is demonstrated in conjunction with simulations of the NASA Curiosity rover, which is currently active on Mars.
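The Hertzian normal-force model mentioned in the abstract follows the classical contact-mechanics expression F = (4/3)·E*·sqrt(R*)·δ^(3/2), where E* and R* are the effective modulus and radius of the two contacting spheres. The sketch below illustrates that textbook formula only and is not taken from the Chrono::GPU sources:

```python
import math

def hertz_normal_force(delta, R1, R2, E1, E2, nu1, nu2):
    """Hertzian normal force between two elastic spheres with overlap delta:
    F = (4/3) * E_eff * sqrt(R_eff) * delta**1.5."""
    R_eff = 1.0 / (1.0 / R1 + 1.0 / R2)                    # effective radius
    E_eff = 1.0 / ((1 - nu1**2) / E1 + (1 - nu2**2) / E2)  # effective modulus
    return (4.0 / 3.0) * E_eff * math.sqrt(R_eff) * delta**1.5

# Two 1 cm steel-like spheres (E = 200 GPa, nu = 0.3) with 1 micrometre overlap.
F = hertz_normal_force(1e-6, 0.01, 0.01, 2e11, 2e11, 0.3, 0.3)
```

A quick sanity check of the nonlinearity: quadrupling the overlap multiplies the force by 4^1.5 = 8.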
APA, Harvard, Vancouver, ISO, and other styles
39

Davis, Sarah, Marrissa Martyn-St James, Jean Sanderson, John Stevens, Edward Goka, Andrew Rawdin, Susi Sadler, et al. "A systematic review and economic evaluation of bisphosphonates for the prevention of fragility fractures." Health Technology Assessment 20, no. 78 (October 2016): 1–406. http://dx.doi.org/10.3310/hta20780.

Full text
Abstract:
BackgroundFragility fractures are fractures that result from mechanical forces that would not ordinarily result in fracture.ObjectivesTo evaluate the clinical effectiveness and safety of bisphosphonates [alendronic acid (Fosamax®and Fosamax®Once Weekly, Merck Sharp & Dohme Ltd), risedronic acid (Actonel®and Actonel Once a Week®, Warner Chilcott UK Ltd), ibandronic acid (Bonviva®, Roche Products Ltd) and zoledronic acid (Aclasta®, Novartis Pharmaceuticals UK Ltd)] for the prevention of fragility fracture and to assess their cost-effectiveness at varying levels of fracture risk.Data sourcesFor the clinical effectiveness review, six electronic databases and two trial registries were searched: MEDLINE, EMBASE, The Cochrane Library, Cumulative Index to Nursing and Allied Health Literature, Web of Science and BIOSIS Previews, Clinicaltrials.gov and World Health Organization International Clinical Trials Registry Platform. Searches were limited by date from 2008 until September 2014.Review methodsA systematic review and network meta-analysis (NMA) of effectiveness studies were conducted. A review of published economic analyses was undertaken and a de novo health economic model was constructed. Discrete event simulation was used to estimate lifetime costs and quality-adjusted life-years (QALYs) for each bisphosphonate treatment strategy and a strategy of no treatment for a simulated cohort of patients with heterogeneous characteristics. The model was populated with effectiveness evidence from the systematic review and NMA. All other parameters were estimated from published sources. A NHS and Personal Social Services perspective was taken, and costs and benefits were discounted at 3.5% per annum. Fracture risk was estimated from patient characteristics using the QFracture®(QFracture-2012 open source revision 38, Clinrisk Ltd, Leeds, UK) and FRAX®(web version 3.9, University of Sheffield, Sheffield, UK) tools. 
The relationship between fracture risk and incremental net benefit (INB) was estimated using non-parametric regression. Probabilistic sensitivity analysis (PSA) and scenario analyses were used to assess uncertainty.ResultsForty-six randomised controlled trials (RCTs) were included in the clinical effectiveness systematic review, with 27 RCTs providing data for the fracture NMA and 35 RCTs providing data for the femoral neck bone mineral density (BMD) NMA. All treatments had beneficial effects on fractures versus placebo, with hazard ratios varying from 0.41 to 0.92 depending on treatment and fracture type. The effects on vertebral fractures and percentage change in BMD were statistically significant for all treatments. There was no evidence of a difference in effect on fractures between bisphosphonates. A statistically significant difference in the incidence of influenza-like symptoms was identified from the RCTs for zoledronic acid compared with placebo. Reviews of observational studies suggest that upper gastrointestinal symptoms are frequently reported in the first month of oral bisphosphonate treatment, but pooled analyses of placebo-controlled trials found no statistically significant difference. A strategy of no treatment was estimated to have the maximum INB for patients with a 10-year QFracture risk under 1.5%, whereas oral bisphosphonates provided maximum INB at higher levels of risk. However, the PSA suggested that there is considerable uncertainty regarding whether or not no treatment is the optimal strategy until the QFracture score is around 5.5%. In the model using FRAX, the mean INBs were positive for all oral bisphosphonate treatments across all risk categories. 
Intravenous bisphosphonates were estimated to have lower INBs than oral bisphosphonates across all levels of fracture risk when estimated using either QFracture or FRAX.LimitationsWe assumed that all treatment strategies are viable alternatives across the whole population.ConclusionsBisphosphonates are effective in preventing fragility fractures. However, the benefit-to-risk ratio in the lowest-risk patients may be debatable given the low absolute QALY gains and the potential for adverse events. We plan to extend the analysis to include non-bisphosphonate therapies.Study registrationThis study is registered as PROSPERO CRD42013006883.FundingThe National Institute for Health Research Health Technology Assessment programme.
APA, Harvard, Vancouver, ISO, and other styles
40

Valdes, Fabio. "MFPMiner: Mining Meaningful Frequent Patterns from Spatio-textual Trajectories." ACM Transactions on Spatial Algorithms and Systems 8, no. 1 (March 31, 2022): 1–30. http://dx.doi.org/10.1145/3498728.

Full text
Abstract:
In the second decade of this century, technical progress has led to a worldwide proliferation of devices for tracking the movement behavior of a person, a vehicle, or another kind of entity. One of the consequences of this development is a massive and still growing amount of movement and movement-related data recorded by cellphones, automobiles, vessels, aircraft, and further GPS-enabled entities. As a result, the requirements for managing and analyzing movement records also increase, serving commercial, administrative, or private purposes. Since the development of hardware components cannot keep pace with the data growth, exploring methods of analyzing such trajectory datasets has become a very active and influential research field. For many application scenarios, besides the spatial trajectory of an entity, it is desirable to take additional semantic information into consideration. These descriptions also change with time and may represent, e.g., the course of streets passed by a bus, the sequence of region names traversed by an aircraft, or the points of interest in proximity of the positions of a taxi. Such data may be directly recorded by a sensor (such as the altitude of an aircraft) or computed from the spatial trajectory combined with some underlying information (for example, street names). It is often helpful or even necessary to focus on such semantic information for efficient analyses, as changes usually occur less frequently than it is the case for the spatial trajectory, where data points usually arrive in very close temporal distances. However, any kind of querying requires a deep semantic knowledge of the dataset at hand, particularly for retrieving the set of trajectories that match a certain mobility pattern, that is, a sequence of temporal, spatial, and semantic specifications. 
In this article, we introduce a framework named MFPMiner for retrieving all mobility patterns fulfilling a user-specified frequency threshold from a spatio-textual trajectory dataset. The resulting patterns and their relative frequency can be regarded as a knowledge base of the considered data. They may be directly visualized or applied for a pattern matching query yielding the set of matching trajectories. We demonstrate the functionality of our approach in an application scenario and provide an experimental evaluation of its performance on real and synthetic datasets by comparing it to three competitive methods. The framework has been fully implemented in a DBMS environment and is freely available open source software.
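Frequency-threshold pattern mining of the kind MFPMiner performs can be illustrated, in drastically simplified form, by counting order-preserving label subsequences that occur in at least a minimum number of trajectories. A toy sketch with hypothetical labels (not the MFPMiner algorithm, which additionally handles temporal and spatial specifications):

```python
from itertools import combinations

def frequent_subsequences(trajectories, min_support, max_len=2):
    """Count order-preserving label subsequences (up to max_len labels)
    and keep those appearing in at least min_support trajectories."""
    counts = {}
    for traj in trajectories:
        seen = set()                            # count a pattern once per trajectory
        for n in range(1, max_len + 1):
            seen.update(combinations(traj, n))  # positional order is preserved
        for pat in seen:
            counts[pat] = counts.get(pat, 0) + 1
    return {p: c for p, c in counts.items() if c >= min_support}

patterns = frequent_subsequences(
    [["home", "bridge", "port"], ["home", "port"], ["bridge", "port"]], 2)
```

Here ("home", "port") survives the support threshold of 2 even though the two labels are not adjacent in the first trajectory.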
APA, Harvard, Vancouver, ISO, and other styles
41

Glukhova, S. A., and M. A. Yurkin. "Scattering of generalized Bessel beams simulated with the discrete dipole approximation." Journal of Physics: Conference Series 2015, no. 1 (November 1, 2021): 012046. http://dx.doi.org/10.1088/1742-6596/2015/1/012046.

Full text
Abstract:
We consider the simulation of scattering of high-order vector Bessel beams within the framework of the discrete dipole approximation (DDA). For this purpose, a new general classification of all existing Bessel beam types was developed based on the superposition of transverse Hertz vector potentials. Next, we implemented these beams in the ADDA code, an open-source parallel implementation of the DDA. The code enables easy and efficient simulation of Bessel beam scattering by arbitrarily shaped particles. Moreover, these results pave the way for future research on Bessel beam scattering near a substrate and on optical forces.
APA, Harvard, Vancouver, ISO, and other styles
42

Baniata, Hamza, and Attila Kertesz. "FoBSim: an extensible open-source simulation tool for integrated fog-blockchain systems." PeerJ Computer Science 7 (April 16, 2021): e431. http://dx.doi.org/10.7717/peerj-cs.431.

Full text
Abstract:
A lot of hard work and years of research are still needed for developing successful Blockchain (BC) applications. Although it is not yet standardized, BC technology has proven to be an enhancement factor for security, decentralization, and reliability, leading to its successful implementation in the cryptocurrency industry. Fog computing (FC) is one of the recently emerged paradigms that needs to be improved to serve Internet of Things (IoT) environments of the future. With hundreds of projects, ideas, and systems already proposed, there is great R&D potential for integrating BC and FC technologies. Examples of organizations contributing to the R&D of these two technologies, and their integration, include Linux, IBM, Google, Microsoft, and others. To validate an integrated Fog-Blockchain protocol or method implementation before the deployment phase, a suitable and accurate simulation environment is needed. Such validation should save researchers and companies adopting this integration a great deal of cost and effort. Currently available simulation environments facilitate Fog simulation, or BC simulation, but not both. In this paper, we introduce a Fog-Blockchain simulator, namely FoBSim, with the main goal of easing the experimentation and validation of integrated Fog-Blockchain approaches. According to our proposed workflow of simulation, we implement different Consensus Algorithms (CA), different deployment options of the BC in the FC architecture, and different functionalities of the BC in the simulation. Furthermore, technical details and algorithms on the simulated integration are provided. We validate FoBSim by describing the technologies used within FoBSim, highlighting FoBSim's novelty compared to the state-of-the-art, discussing the event validity in FoBSim, and providing a clear walk-through validation.
Finally, we simulate case studies, then present and analyze the obtained results, where deploying the BC network in the fog layer shows enhanced efficiency in terms of total run time and total storage cost.
APA, Harvard, Vancouver, ISO, and other styles
43

Stankevich, Elena, Igor Tananko, and Michele Pagano. "Optimization of Open Queuing Networks with Batch Services." Mathematics 10, no. 16 (August 22, 2022): 3027. http://dx.doi.org/10.3390/math10163027.

Full text
Abstract:
In this paper, open queuing networks with Poisson arrivals and single-server infinite buffer queues are considered. Unlike traditional queuing models, customers are served (with exponential service time) in batches, so that the nodes are non-work-conserving. The main contribution of this work is the design of an efficient algorithm to find the batch sizes which minimize the average response time of the network. As preliminary steps at the basis of the proposed algorithm, an analytical expression of the average sojourn time in each node is derived, and it is shown that this function, depending on the batch size, has a single minimum. The goodness of the proposed algorithm and analytical formula were verified through a discrete-event simulation for an open network with a non-tree structure.
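The trade-off the authors optimize, namely that customers must wait for a full batch to form before exponential service can start, is easy to reproduce with a short discrete-event simulation. A toy single-node sketch with exact batches of size K (illustrative only, not the paper's algorithm or analytical formula):

```python
import random

def sim_batch_queue(lam, mu, K, n_customers, seed=1):
    """Toy simulation of a single server that serves customers in exact
    batches of K: service (exponential, rate mu) starts only when the
    server is free AND K customers have arrived (Poisson, rate lam)."""
    rng = random.Random(seed)
    t, arrivals = 0.0, []
    for _ in range(n_customers):
        t += rng.expovariate(lam)            # Poisson arrival stream
        arrivals.append(t)
    server_free, resp = 0.0, []
    for b in range(n_customers // K):        # consecutive batches of K
        batch = arrivals[b * K:(b + 1) * K]
        start = max(server_free, batch[-1])  # wait for server and full batch
        server_free = start + rng.expovariate(mu)
        resp.extend(server_free - a for a in batch)
    return sum(resp) / len(resp)             # average response time

avg_response = sim_batch_queue(lam=1.0, mu=2.0, K=3, n_customers=300)
```

Sweeping K in such a simulation exhibits the single-minimum behaviour of the average response time that the paper establishes analytically.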
APA, Harvard, Vancouver, ISO, and other styles
44

Li, Jingxia, Aijun Zhou, Yongfeng Liao, Zixin Zhao, Xiaoli Mao, and Shuoxin Zhang. "Forest Ecological Diversity Change Prediction Discrete Dynamic Model." Discrete Dynamics in Nature and Society 2022 (January 19, 2022): 1–11. http://dx.doi.org/10.1155/2022/4869363.

Full text
Abstract:
The forest ecosystem is the terrestrial ecosystem type with the greatest species diversity in the world. The protection and sustainable use of forest ecological diversity is of great significance to forest protection and sustainable management. In order to maintain biodiversity, the traditional concept of forest resources management must be changed. In recent years, the mining and application of big data have become the frontier of international ecological diversity and macroecology research. By establishing the discrete dynamic model of forest ecological diversity change, this study finds a method that can keep the number of trees stable in a short time and ensure the sustainable development of forest resources. This article constructs a forest information system and an open-source WebGIS (web geographic information system) scheme for the spatial information field, supporting OGC (Open Geospatial Consortium) standards. At the same time, the grey theory GM(1,1) model is used to predict the development trend of forest resources in China. The results show that under the promotion of policies and research projects, China's forestry development has achieved simultaneous growth of forest area and stock volume, and the forest coverage rate has reached a new level.
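The GM(1,1) grey model used here for forecasting has a compact closed form: accumulate the series, estimate the development coefficient a and grey input b by least squares on the whitened equation x0(k) + a·z(k) = b, and forecast from the fitted exponential. A from-scratch sketch (generic, not the authors' code):

```python
import math

def gm11_forecast(x0, steps):
    """Fit a GM(1,1) grey model to a positive series and forecast ahead.
    x1 is the accumulated series, z its mean 'background' values; a and b
    solve x0(k) = -a*z(k) + b in the least-squares sense."""
    n = len(x0)
    x1 = [sum(x0[:k + 1]) for k in range(n)]            # accumulated series
    z = [(x1[k] + x1[k - 1]) / 2.0 for k in range(1, n)]
    y, m = x0[1:], n - 1
    szz, sz = sum(v * v for v in z), sum(z)
    szy, sy = sum(v * w for v, w in zip(z, y)), sum(y)
    det = sz * sz - m * szz                 # normal-equation determinant
    a = (m * szy - sz * sy) / det           # development coefficient
    b = (sz * szy - szz * sy) / det         # grey input
    c = b / a
    x1_hat = lambda k: (x0[0] - c) * math.exp(-a * k) + c
    # differencing the fitted accumulated series recovers x0 forecasts
    return [x1_hat(n + s) - x1_hat(n + s - 1) for s in range(steps)]

# On a near-exponential series the one-step forecast should be close.
series = [2.0 * 1.05 ** k for k in range(6)]
pred = gm11_forecast(series, 1)[0]
```

GM(1,1) is exact for series whose accumulation lies on an exponential, which is why it works well for short, smoothly growing series such as forest area statistics.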
APA, Harvard, Vancouver, ISO, and other styles
45

Gururaja Rao, C., V. Nagabhushana Rao, and C. Krishna Das. "Simulation studies on multi-mode heat transfer from an open cavity with a flush-mounted discrete heat source." Heat and Mass Transfer 44, no. 6 (July 14, 2007): 727–37. http://dx.doi.org/10.1007/s00231-007-0301-4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Denil, Joachim, Paul De Meulenaere, Serge Demeyer, and Hans Vangheluwe. "DEVS for AUTOSAR-based system deployment modeling and simulation." SIMULATION 93, no. 6 (February 6, 2017): 489–513. http://dx.doi.org/10.1177/0037549716684552.

Full text
Abstract:
AUTOSAR (AUTomotive Open System ARchitecture) is an open and standardized automotive software architecture, developed by automobile manufacturers, suppliers, and tool developers. Its design is a direct consequence of the increasingly important role played by software in vehicles. As design choices during the software deployment phase have a large impact on the behavior of the system, designers need to explore various trade-offs. Examples of such design choices are the mapping of software components to processors, the priorities of tasks and messages, and buffer allocation. In this paper, we evaluate the appropriateness of DEVS, the Discrete-Event System specification, for modeling and subsequent performance evaluation of AUTOSAR-based systems. Moreover, a DEVS simulation model is constructed for AUTOSAR-based electronic control units connected by a communication bus. To aid developers in evaluating a deployment solution, the simulation model is extended with co-simulation with a plant and environment model, evaluation at different levels of detail, and fault injection. Finally, we examine how the simulation model supports the relationship between the supplier and the original equipment manufacturer in the automotive industry. We demonstrate and validate our work by means of a power window case study.
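An atomic DEVS model is defined by its external and internal transition functions, an output function, and a time-advance function. A minimal hypothetical example (a one-slot processor; the names are illustrative and not taken from the paper's AUTOSAR models):

```python
class AtomicDEVS:
    """Minimal atomic DEVS model: a one-slot processor that holds a job for
    a fixed processing time, then outputs it at its internal event."""
    def __init__(self, processing_time):
        self.processing_time = processing_time
        self.job = None
    def time_advance(self):          # sigma: time until next internal event
        return self.processing_time if self.job is not None else float("inf")
    def ext_transition(self, job):   # delta_ext: react to an input event
        if self.job is None:         # ignore inputs while busy (one slot)
            self.job = job
    def output(self):                # lambda: emitted just before delta_int
        return self.job
    def int_transition(self):        # delta_int: return to the idle phase
        self.job = None

proc = AtomicDEVS(2.0)
```

A coordinator drives such models by repeatedly picking the component with the smallest time advance, collecting its output, and applying the internal and external transitions, which is the structure a DEVS simulation of ECUs on a bus builds on.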
APA, Harvard, Vancouver, ISO, and other styles
47

Yurkin, M. A., and A. E. Moskalensky. "Open-source implementation of the discrete-dipole approximation for a scatterer in an absorbing host medium." Journal of Physics: Conference Series 2015, no. 1 (November 1, 2021): 012167. http://dx.doi.org/10.1088/1742-6596/2015/1/012167.

Full text
Abstract:
Theoretical description of light scattering by single particles is a well-developed field, but most of it applies to particles located in vacuum or a non-absorbing host medium. Although the case of an absorbing host medium has also been discussed in the literature, a complete description and unambiguous definition of scattering quantities are still lacking. The situation is similar for simulation methods: some computer codes exist, but the choice is very limited compared to the case of vacuum. Here we describe the extension of the popular open-source code ADDA to support an absorbing host medium. It is based on the discrete dipole approximation and is, thus, applicable to particles with arbitrary shape and internal structure. We performed test simulations for spheres and compared them with results from the Lorenz-Mie theory. Moreover, we developed a unified description of the energy budget for scattering by a particle in a weakly absorbing host medium, relating all existing local (expressed as volume integrals over scatterer volume) and far-field scattering quantities.
APA, Harvard, Vancouver, ISO, and other styles
48

Marin, Juan Alexander, Cristhian Camilo Mosquera, and Yony Fernando Ceballos. "Proposal of Improvement for a Textile Finishing Company in the Medellin city Through of Discrete Simulation." Scientia et Technica 26, no. 1 (March 30, 2021): 21–27. http://dx.doi.org/10.22517/23447214.24540.

Full text
Abstract:
The textile industry in Colombia is a source of employment for more than 200,000 people, and more than 50% of this production takes place in Medellin. Modeling and improving textile processes allows this economic sector to be internationally competitive. In this paper, we describe the use of discrete event simulation in a textile finishing company through the results of four scenarios, which demonstrate the potential of discrete simulation in production environments and its value in modeling part of reality without the need to experiment with the real system. The method used in this paper is summarized in three major stages: the first is the simulation methodology, the second is the data supporting the simulation, and the final stage is an analysis of the results, comparing the four scenarios. The simulation was statistically validated and verified against the real behavior of the company, and it was executed using software tools such as EasyFit®, Microsoft Excel®, and Simul8®.
APA, Harvard, Vancouver, ISO, and other styles
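Simul8® is a commercial, GUI-driven package, but the scenario comparisons described above rest on standard queueing logic. As a rough illustration (the single-station model, arrival rate, and service rates below are invented for this sketch and are not taken from the paper), a FIFO single-server finishing station can be simulated with the Lindley recurrence in plain Python:

```python
import random

def simulate(n_jobs, mean_interarrival, mean_service, seed=42):
    """Average time in system for a FIFO single-server station.
    Each job starts service when both it and the server are ready."""
    rng = random.Random(seed)
    t_arrival = 0.0
    server_free_at = 0.0
    total_in_system = 0.0
    for _ in range(n_jobs):
        t_arrival += rng.expovariate(1.0 / mean_interarrival)
        service = rng.expovariate(1.0 / mean_service)
        start = max(t_arrival, server_free_at)   # wait if the server is busy
        server_free_at = start + service
        total_in_system += server_free_at - t_arrival
    return total_in_system / n_jobs

# Compare two hypothetical scenarios: baseline vs. a faster finishing step.
baseline = simulate(5000, mean_interarrival=1.0, mean_service=0.8)
improved = simulate(5000, mean_interarrival=1.0, mean_service=0.5)
```

Running both scenarios with a common seed makes the comparison a paired experiment, the same style of what-if analysis the paper performs across its four scenarios; with these made-up rates the averages track the M/M/1 prediction 1/(μ − λ).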
49

Gao, Hui, Da Wei Zhang, Bin Liu, and Long Chen Duan. "Surface Lunar Soil Excavation Simulation Based on Three-Dimensional Discrete Element Method." Applied Mechanics and Materials 376 (August 2013): 366–70. http://dx.doi.org/10.4028/www.scientific.net/amm.376.366.

Full text
Abstract:
One of the important objectives of lunar exploration is to obtain lunar soil samples. However, the sampling process is very different from that on Earth due to the special characteristics of the lunar soil and surface environment. To ensure that lunar exploration and sampling are successful, large numbers of ground experiments and computer simulations must be carried out. In this paper, surface lunar soil excavation is simulated with the three-dimensional discrete element method (DEM). The simulation is implemented with the open-source code LIGGGHTS, which models the lunar soil as spherical particles. The interaction between the excavation tool and the lunar soil is demonstrated, and the excavation force and torque are calculated in real time. Moreover, excavation under Earth and Moon conditions, corresponding to their different gravitational accelerations, is compared. This paper shows that the three-dimensional discrete element method can be used to simulate surface lunar soil excavation and can provide important reference results for actual operations.
APA, Harvard, Vancouver, ISO, and other styles
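DEM codes such as LIGGGHTS advance spherical particles under pairwise contact laws. As a hand-rolled illustration of the idea (this is not LIGGGHTS itself, and the stiffness, damping, and particle parameters are invented for the sketch), a linear spring-dashpot normal contact model is enough to show how a particle's resting state differs between Earth and lunar gravity:

```python
import math

G_EARTH = 9.81   # m/s^2
G_MOON = 1.62    # m/s^2, approximate lunar surface gravity

def normal_contact_force(overlap, rel_vel_n, k=1e4, c=5.0):
    """Linear spring-dashpot normal force between sphere and wall.
    overlap: geometric overlap (m); rel_vel_n: normal relative
    velocity (m/s), positive when approaching."""
    if overlap <= 0.0:
        return 0.0
    return k * overlap + c * rel_vel_n

def settle_particle(g, radius=0.01, density=1500.0, dt=1e-4, steps=20000):
    """Drop one spherical particle onto a flat floor, integrate with
    semi-implicit Euler until it rests, return the resting overlap."""
    mass = density * (4.0 / 3.0) * math.pi * radius ** 3
    z, v = radius + 0.05, 0.0          # start 5 cm above the floor
    for _ in range(steps):
        overlap = max(0.0, radius - z)
        f = normal_contact_force(overlap, -v) - mass * g
        v += (f / mass) * dt
        z += v * dt
    return max(0.0, radius - z)
```

At rest the spring balances the weight, so the resting overlap is m·g/k; repeating the run with `G_MOON` instead of `G_EARTH` shrinks it by the ratio of the two gravitational accelerations, the same kind of Earth-versus-Moon comparison the paper reports for excavation forces.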
50

Amoretti, Michele. "Modeling and Simulation of Network-on-Chip Systems with DEVS and DEUS." Scientific World Journal 2014 (2014): 1–9. http://dx.doi.org/10.1155/2014/982569.

Full text
Abstract:
Networks-on-chip (NoCs) provide enhanced performance, scalability, modularity, and design productivity compared with previous communication architectures for VLSI systems-on-chip (SoCs), such as buses and dedicated signal wires. Since the NoC design space is very large and high-dimensional, evaluation methodologies rely heavily on analytical modeling and simulation. Unfortunately, there is no standard modeling framework. In this paper we illustrate how to design and evaluate NoCs by integrating the Discrete Event System Specification (DEVS) modeling framework with the simulation environment DEUS. The advantage of such an approach is that both DEVS and DEUS support modularity: the former is a sound and complete modeling framework, while the latter is an open, general-purpose platform characterized by a steep learning curve and the ability to simulate any system at any level of detail.
APA, Harvard, Vancouver, ISO, and other styles
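DEVS specifies an atomic model by four functions: the internal transition, the external transition, the output function, and the time advance. As a minimal sketch of that structure (not the paper's DEUS integration; the processor model and the tiny root-simulator loop are invented for illustration), a busy/idle processor can be written as:

```python
INF = float("inf")

class Processor:
    """DEVS atomic model: accepts a job, stays busy for service_time,
    then emits the finished job. Extra jobs arriving while busy are
    dropped to keep the sketch short (a fuller model would queue them)."""
    def __init__(self, service_time):
        self.service_time = service_time
        self.job = None                    # state: current job or None
        self.sigma = INF                   # time remaining to next internal event

    def ta(self):                          # time advance
        return self.sigma

    def delta_int(self):                   # internal transition: job done
        self.job, self.sigma = None, INF

    def delta_ext(self, e, x):             # external transition after elapsed e
        if self.job is None:
            self.job, self.sigma = x, self.service_time
        else:
            self.sigma -= e                # stay busy; drop the extra job

    def output(self):                      # output function (lambda)
        return self.job

def run(model, arrivals, horizon):
    """Tiny root simulator: arrivals is a list of (time, job) pairs."""
    t, log = 0.0, []
    arrivals = sorted(arrivals)
    while True:
        t_int = t + model.ta()
        t_ext = arrivals[0][0] if arrivals else INF
        t_next = min(t_int, t_ext)
        if t_next > horizon:
            break
        if t_int <= t_ext:                 # internal event: emit output first
            log.append((t_int, model.output()))
            model.delta_int()
        else:                              # external event: pass elapsed time
            t_arr, job = arrivals.pop(0)
            model.delta_ext(t_arr - t, job)
        t = t_next
    return log
```

The `sigma` bookkeeping (decremented by the elapsed time `e` on external events) is what lets DEVS models compose correctly; coupled models and hierarchical simulators, as used in the NoC work above, build on exactly this interface.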
