Journal articles on the topic 'Workflow'

Consult the top 50 journal articles for your research on the topic 'Workflow.'

1

Song, Tianhong, Sven Köhler, Bertram Ludäscher, James Hanken, Maureen Kelly, David Lowery, James A. Macklin, Paul J. Morris, and Robert A. Morris. "Towards Automated Design, Analysis and Optimization of Declarative Curation Workflows." International Journal of Digital Curation 9, no. 2 (October 29, 2014): 111–22. http://dx.doi.org/10.2218/ijdc.v9i2.337.

Abstract:
Data curation is increasingly important. Our previous work on a Kepler curation package has demonstrated advantages that come from automating data curation pipelines by using workflow systems. However, manually designed curation workflows can be error-prone and inefficient due to a lack of user understanding of the workflow system, misuse of actors, or human error. Correcting problematic workflows is often very time-consuming. A more proactive workflow system can help users avoid such pitfalls. For example, static analysis before execution can be used to detect the potential problems in a workflow and help the user to improve workflow design. In this paper, we propose a declarative workflow approach that supports semi-automated workflow design, analysis and optimization. We show how the workflow design engine helps users to construct data curation workflows, how the workflow analysis engine detects different design problems of workflows and how workflows can be optimized by exploiting parallelism.
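As an illustration of the kind of pre-execution static analysis the abstract describes, the sketch below flags actors whose required inputs no upstream actor (or the data source) provides. The actor and field names are invented for illustration; this is not the paper's Kepler-based implementation.

```python
def undefined_inputs(actors, source_fields):
    """actors: ordered (name, inputs, outputs) triples; returns name -> missing fields."""
    available = set(source_fields)
    problems = {}
    for name, inputs, outputs in actors:
        missing = set(inputs) - available
        if missing:
            problems[name] = missing
        available |= set(outputs)   # downstream actors may consume these
    return problems

# hypothetical curation pipeline over specimen records
pipeline = [
    ("validate_date", {"eventDate"}, {"eventDate_ok"}),
    ("geo_check", {"latitude", "longitude"}, {"geo_ok"}),
]
print(undefined_inputs(pipeline, {"eventDate", "latitude"}))
# geo_check is flagged: 'longitude' is never provided upstream
```

A real workflow analysis engine would also check actor misuse and type mismatches, but even this one-pass check catches a class of design errors before any data is processed.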
2

Deng, Ning, Xiao Dong Zhu, Yuan Ning Liu, Yan Pu Li, and Ying Chen. "A Workflow Management Model Based on Workflow Node Property." Applied Mechanics and Materials 442 (October 2013): 450–57. http://dx.doi.org/10.4028/www.scientific.net/amm.442.450.

Abstract:
Workflow management systems are powerful tools and important supports for industries that involve series of complex workflows. Two of the main objectives of a workflow management system are (1) ensuring the correctness and integrity of workflow advancement, and (2) carrying the workflow forward automatically to the maximum extent possible. To meet these objectives, this paper proposes a workflow management method based on workflow node properties and presents a workflow management system model. Within this model, an automatic advance mode is proposed so that the workflow can be carried forward automatically.
3

Suetake, Hirotaka, Tomoya Tanjo, Manabu Ishii, Bruno P. Kinoshita, Takeshi Fujino, Tsuyoshi Hachiya, Yuichi Kodama, et al. "Sapporo: A workflow execution service that encourages the reuse of workflows in various languages in bioinformatics." F1000Research 11 (August 4, 2022): 889. http://dx.doi.org/10.12688/f1000research.122924.1.

Abstract:
The increased demand for efficient computation in data analysis encourages researchers in biomedical science to use workflow systems. Workflow systems, or so-called workflow languages, are used for the description and execution of a set of data analysis steps. Workflow systems increase the productivity of researchers, specifically in fields that use high-throughput DNA sequencing applications, where scalable computation is required. As systems have improved the portability of data analysis workflows, research communities are able to share workflows to reduce the cost of building ordinary analysis procedures. However, having multiple workflow systems in a research field has resulted in the distribution of efforts across different workflow system communities. As each workflow system has its unique characteristics, it is not feasible to learn every single system in order to use publicly shared workflows. Thus, we developed Sapporo, an application to provide a unified layer of workflow execution upon the differences of various workflow systems. Sapporo has two components: an application programming interface (API) that receives the request of a workflow run and a browser-based client for the API. The API follows the Workflow Execution Service API standard proposed by the Global Alliance for Genomics and Health. The current implementation supports the execution of workflows in four languages: Common Workflow Language, Workflow Description Language, Snakemake, and Nextflow. With its extensible and scalable design, Sapporo can support the research community in utilizing valuable resources for data analysis.
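To make the API concrete: Sapporo follows the GA4GH Workflow Execution Service (WES) standard, in which a run is submitted by POSTing form fields such as `workflow_url`, `workflow_type`, `workflow_type_version`, and `workflow_params` to the service's `/runs` endpoint. A minimal sketch of assembling such a request; the field names are from the WES specification, while the workflow URL and parameters are placeholders.

```python
import json

def build_wes_run_request(workflow_url, workflow_type, type_version, params):
    """Form fields for POST {base}/runs per the GA4GH WES API."""
    return {
        "workflow_url": workflow_url,
        "workflow_type": workflow_type,
        "workflow_type_version": type_version,
        "workflow_params": json.dumps(params),  # params are sent as a JSON string
    }

fields = build_wes_run_request(
    "https://example.org/trimming.cwl", "CWL", "v1.0", {"fastq": "reads.fq"})
print(fields["workflow_type"])  # CWL
```

The same request shape works regardless of whether the service executes CWL, WDL, Snakemake, or Nextflow, which is what lets a WES client stay agnostic to the underlying workflow engine.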
4

Liu, Yong Shan, Yan Qing Shen, and Tian Bao Hao. "Research on Reliability Modeling of Cross-Organizational Workflows Based on Hierarchical Colored Petri Nets." Advanced Materials Research 186 (January 2011): 505–9. http://dx.doi.org/10.4028/www.scientific.net/amr.186.505.

Abstract:
To reduce the complexity of cross-organizational workflow modeling and verification, a reliable modeling method for cross-organizational workflows based on hierarchical colored Petri nets is proposed. Building on formal definitions of global and local workflows, the paper develops reliability modeling constraints for cross-organizational workflows, from structure to logic. Under these constraints, a top-down cross-organizational workflow model is built. Substitution transitions whose input places are safe in the global workflow are refined by reliable local workflows to guarantee the reliability of the cross-organizational workflow model.
5

Lamprecht, Anna-Lena, Magnus Palmblad, Jon Ison, Veit Schwämmle, Mohammad Sadnan Al Manir, Ilkay Altintas, Christopher J. O. Baker, et al. "Perspectives on automated composition of workflows in the life sciences." F1000Research 10 (September 7, 2021): 897. http://dx.doi.org/10.12688/f1000research.54159.1.

Abstract:
Scientific data analyses often combine several computational tools in automated pipelines, or workflows. Thousands of such workflows have been used in the life sciences, though their composition has remained a cumbersome manual process due to a lack of standards for annotation, assembly, and implementation. Recent technological advances have returned the long-standing vision of automated workflow composition into focus. This article summarizes a recent Lorentz Center workshop dedicated to automated composition of workflows in the life sciences. We survey previous initiatives to automate the composition process, and discuss the current state of the art and future perspectives. We start by drawing the “big picture” of the scientific workflow development life cycle, before surveying and discussing current methods, technologies and practices for semantic domain modelling, automation in workflow development, and workflow assessment. Finally, we derive a roadmap of individual and community-based actions to work toward the vision of automated workflow development in the forthcoming years. A central outcome of the workshop is a general description of the workflow life cycle in six stages: 1) scientific question or hypothesis, 2) conceptual workflow, 3) abstract workflow, 4) concrete workflow, 5) production workflow, and 6) scientific results. The transitions between stages are facilitated by diverse tools and methods, usually incorporating domain knowledge in some form. Formal semantic domain modelling is hard and often a bottleneck for the application of semantic technologies. However, life science communities have made considerable progress here in recent years and are continuously improving, renewing interest in the application of semantic technologies for workflow exploration, composition and instantiation. 
Combined with systematic benchmarking with reference data and large-scale deployment of production-stage workflows, such technologies enable a more systematic process of workflow development than we know today. We believe that this can lead to more robust, reusable, and sustainable workflows in the future.
6

Oliva, Gustavo Ansaldi, Marco Aurélio Gerosa, Fabio Kon, Virginia Smith, and Dejan Milojicic. "A Static Change Impact Analysis Approach based on Metrics and Visualizations to Support the Evolution of Workflow Repositories." International Journal of Web Services Research 13, no. 2 (April 2016): 74–101. http://dx.doi.org/10.4018/ijwsr.2016040105.

Abstract:
In ever-changing business environments, organizations continuously refine their processes to benefit from and meet the constraints of new technology, new business rules, and new market requirements. Workflow management systems (WFMSs) support organizations in evolving their processes by providing them with technological mechanisms to design, enact, and monitor workflows. However, workflow repositories often grow and start to encompass a variety of interdependent workflows. Without appropriate tool support, keeping track of such interdependencies and staying aware of the impact of a change in a workflow schema becomes hard. Workflow designers are often blindsided by changes that end up inducing side- and ripple-effects. This poses threats to the reliability of the workflows and ultimately hampers the evolvability of the workflow repository as a whole. In this paper, the authors introduce a change impact analysis approach based on metrics and visualizations to support the evolution of workflow repositories. They implemented the approach and later integrated it as a module in the HP Operations Orchestration (HP OO) WFMS. The authors conducted an exploratory study in which they thoroughly analyzed the workflow repositories of 8 HP OO customers. They characterized the customer repositories from a change impact perspective and compared them against each other. The authors were able to spot the workflows with high change impact among thousands of workflows in each repository. They also found that while the out-of-the-box repository included in HP OO had 10 workflows with high change impact, customer repositories included 11 (+10%) to 35 (+250%) workflows with this same characteristic. This result indicates the extent to which customers should put additional effort into evolving their repositories. The authors' approach contributes to the body of knowledge on static workflow evolution and complements existing dynamic workflow evolution approaches. 
Their techniques also aim to help organizations build more flexible and reliable workflow repositories.
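As a toy illustration of change impact in a workflow repository (not HP OO's actual metrics), one simple measure is the number of workflows that transitively depend on a given workflow: a change to it can ripple to all of them. The repository and workflow names below are invented.

```python
def impact(repo, wf):
    """repo: workflow -> set of workflows it invokes. Returns the number of
    workflows that transitively depend on wf (its potential change impact)."""
    dependents = {w: set() for w in repo}          # invert the dependency edges
    for w, deps in repo.items():
        for d in deps:
            dependents.setdefault(d, set()).add(w)
    seen, stack = set(), [wf]
    while stack:                                    # walk the reverse edges
        for w in dependents.get(stack.pop(), ()):
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return len(seen)

repo = {"deploy": {"build"}, "test": {"build"},
        "release": {"deploy"}, "build": set()}
print(impact(repo, "build"))  # deploy, test, and release are all affected
```

Ranking workflows by this count immediately surfaces the high-impact ones that deserve extra care before being changed.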
7

BLIN, MARIE JOSÉ, JACQUES WAINER, and CLAUDIA BAUZER MEDEIROS. "A REUSE-ORIENTED WORKFLOW DEFINITION LANGUAGE." International Journal of Cooperative Information Systems 12, no. 01 (March 2003): 1–36. http://dx.doi.org/10.1142/s0218843003000553.

Abstract:
This paper presents a new formalism for workflow process definition, which combines research in programming languages and in database systems. This formalism is based on creating a library of workflow building blocks, which can be progressively combined and nested to construct complex workflows. Workflows are specified declaratively, using a simple high level language, which allows the dynamic definition of exception handling and events, as well as dynamically overriding workflow definition. This ensures a high degree of flexibility in data and control flow specification, as well as in reuse of workflow specifications to construct other workflows. The resulting workflow execution environment is well suited to supporting cooperative work.
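The building-block idea can be illustrated with a toy sketch (this is not the paper's definition language): atomic tasks and combinators such as `seq` and `par` nest to form larger workflows, and a finished workflow can itself be reused as a block inside another one.

```python
def task(name):
    return ("task", name)

def seq(*blocks):          # run blocks one after another
    return ("seq", blocks)

def par(*blocks):          # run blocks concurrently
    return ("par", blocks)

def tasks_in(block):
    """Flatten a nested workflow into its atomic task names."""
    kind, body = block
    if kind == "task":
        return [body]
    return [t for b in body for t in tasks_in(b)]

review = seq(task("submit"), par(task("review_a"), task("review_b")), task("decide"))
publish = seq(review, task("typeset"))   # reuse a whole workflow as a block
print(tasks_in(publish))
```

The reuse happens at the value level: `review` is an ordinary block, so composing it into `publish` requires no copying or special machinery.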
8

WANG, JIACUN, and DEMIN LI. "RESOURCE ORIENTED WORKFLOW NETS AND WORKFLOW RESOURCE REQUIREMENT ANALYSIS." International Journal of Software Engineering and Knowledge Engineering 23, no. 05 (June 2013): 677–93. http://dx.doi.org/10.1142/s0218194013400135.

Abstract:
Petri nets are a powerful formalism for modeling workflows. A workflow determines the flow of work according to a pre-defined business process. In many situations, business processes are constrained by scarce resources. The lack of resources can cause contention, the need for some tasks to wait for others to complete, which slows down the accomplishment of larger goals. In our previous work, a resource-constrained workflow model was introduced and a resource requirement analysis approach was developed for emergency response workflows, in which support of on-the-fly workflow change is critical [14]. In this paper, we propose a Petri net based approach for resource requirement analysis, which can be used for more general purposes. The concept of resource-oriented workflow nets (ROWN) is introduced and the transition firing rules of ROWN are presented. Resource requirement analysis for general workflows can be performed through reachability analysis. An efficient resource analysis algorithm is developed for a class of well-structured workflows, in which a task execution, once started, is guaranteed to finish successfully. For a task that may fail in the middle of execution, an equivalent non-failing task model in terms of resource consumption is developed.
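The flavor of Petri-net-style transition firing with resource places can be sketched as follows (a generic toy model, not the paper's ROWN formalism): a transition fires only when every input place, including a shared resource place, holds enough tokens, so the resource place naturally limits concurrency.

```python
def enabled(marking, pre):
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Consume tokens from the pre-places, produce tokens in the post-places."""
    assert enabled(marking, pre), "transition not enabled"
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# a task acquiring and later releasing one shared resource token
m0 = {"start": 1, "res": 1}
t_acquire = ({"start": 1, "res": 1}, {"running": 1})
m1 = fire(m0, *t_acquire)
print(m1)  # {'start': 0, 'res': 0, 'running': 1}
t_release = ({"running": 1}, {"done": 1, "res": 1})
print(fire(m1, *t_release))  # the resource token is returned
```

Exploring all markings reachable through such firings is exactly the reachability analysis the abstract mentions for deriving resource requirements.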
9

Zulfiqar, Mahnoor, Michael R. Crusoe, Birgitta König-Ries, Christoph Steinbeck, Kristian Peters, and Luiz Gadelha. "Implementation of FAIR Practices in Computational Metabolomics Workflows—A Case Study." Metabolites 14, no. 2 (February 10, 2024): 118. http://dx.doi.org/10.3390/metabo14020118.

Abstract:
Scientific workflows facilitate the automation of data analysis tasks by integrating various software and tools executed in a particular order. To enable transparency and reusability in workflows, it is essential to implement the FAIR principles. Here, we describe our experiences implementing the FAIR principles for metabolomics workflows using the Metabolome Annotation Workflow (MAW) as a case study. MAW is specified using the Common Workflow Language (CWL), allowing for the subsequent execution of the workflow on different workflow engines. MAW is registered using a CWL description on WorkflowHub. During the submission process on WorkflowHub, a CWL description is used for packaging MAW using the Workflow RO-Crate profile, which includes metadata in Bioschemas. Researchers can use this narrative discussion as a guideline to commence using FAIR practices for their bioinformatics or cheminformatics workflows while incorporating necessary amendments specific to their research area.
10

Willoughby, Cerys, and Jeremy G. Frey. "Documentation and Visualisation of Workflows for Effective Communication, Collaboration and Publication @ Source." International Journal of Digital Curation 12, no. 1 (September 16, 2017): 72–87. http://dx.doi.org/10.2218/ijdc.v12i1.532.

Abstract:
Workflows processing data from research activities and driving in silico experiments are becoming an increasingly important method for conducting scientific research. Workflows have the advantage that not only can they be automated and used to process data repeatedly, but they can also be reused – in part or whole – enabling them to be evolved for use in new experiments. A number of studies have investigated strategies for storing and sharing workflows for the benefit of reuse. These have revealed that simply storing workflows in repositories without additional context does not enable workflows to be successfully reused. These studies have investigated what additional resources are needed to facilitate users of workflows and in particular to add provenance traces and to make workflows and their resources machine-readable. These additions also include adding metadata for curation, annotations for comprehension, and including data sets to provide additional context to the workflow. Ultimately though, these mechanisms still rely on researchers having access to the software to view and run the workflows. We argue that there are situations where researchers may want to understand a workflow that goes beyond what provenance traces provide and without having to run the workflow directly; there are many situations in which it can be difficult or impossible to run the original workflow. To that end, we have investigated the creation of an interactive workflow visualization that captures the flow chart element of the workflow with additional context including annotations, descriptions, parameters, metadata and input, intermediate, and results data that can be added to the record of a workflow experiment to enhance both curation and add value to enable reuse. 
We have created interactive workflow visualisations for the popular workflow creation tool KNIME, which does not provide users with an in-built function to extract provenance information that can otherwise only be viewed through the tool itself. Making use of the strengths of KNIME for adding documentation and user-defined metadata we can extract and create a visualisation and curation package that encourages and enhances curation@source, facilitating effective communication, collaboration, and reuse of workflows.
11

Jackson, Michael, Kostas Kavoussanakis, and Edward W. J. Wallace. "Using prototyping to choose a bioinformatics workflow management system." PLOS Computational Biology 17, no. 2 (February 25, 2021): e1008622. http://dx.doi.org/10.1371/journal.pcbi.1008622.

Abstract:
Workflow management systems represent, manage, and execute multistep computational analyses and offer many benefits to bioinformaticians. They provide a common language for describing analysis workflows, contributing to reproducibility and to building libraries of reusable components. They can support both incremental build and re-entrancy—the ability to selectively re-execute parts of a workflow in the presence of additional inputs or changes in configuration and to resume execution from where a workflow previously stopped. Many workflow management systems enhance portability by supporting the use of containers, high-performance computing (HPC) systems, and clouds. Most importantly, workflow management systems allow bioinformaticians to delegate how their workflows are run to the workflow management system and its developers. This frees the bioinformaticians to focus on what these workflows should do, on their data analyses, and on their science. RiboViz is a package to extract biological insight from ribosome profiling data to help advance understanding of protein synthesis. At the heart of RiboViz is an analysis workflow, implemented in a Python script. To conform to best practices for scientific computing which recommend the use of build tools to automate workflows and to reuse code instead of rewriting it, the authors reimplemented this workflow within a workflow management system. To select a workflow management system, a rapid survey of available systems was undertaken, and candidates were shortlisted: Snakemake, cwltool, Toil, and Nextflow. Each candidate was evaluated by quickly prototyping a subset of the RiboViz workflow, and Nextflow was chosen. The selection process took 10 person-days, a small cost for the assurance that Nextflow satisfied the authors’ requirements. 
The use of prototyping can offer a low-cost way of making a more informed selection of software to use within projects, rather than relying solely upon reviews and recommendations by others.
12

Kaur, Avinash, Pooja Gupta, and Manpreet Singh. "Hybrid Balanced Task Clustering Algorithm for Scientific Workflows in Cloud Computing." Scalable Computing: Practice and Experience 20, no. 2 (May 2, 2019): 237–58. http://dx.doi.org/10.12694/scpe.v20i2.1515.

Abstract:
A scientific workflow is a composition of both coarse-grained and fine-grained computational tasks with varying execution requirements. Scientific workflows involve large-scale data transfer, so efficient techniques are required to reduce the makespan of the workflow. Task clustering is an efficient technique used in such a scenario: multiple tasks with short execution times are combined into a single cluster to be executed on one resource. This reduces scheduling overheads in scientific workflows and thus improves performance. However, available task clustering methods cluster tasks horizontally, without considering the structure of tasks in a workflow. We propose a hybrid balanced task clustering algorithm that uses the impact factor of workflow tasks along with the structure of the workflow. In this technique, tasks can be considered for clustering either vertically or horizontally based on the value of the impact factor. This minimizes system overheads and the makespan for execution of a workflow. A simulation-based evaluation performed on real workflows shows that the proposed algorithm is efficient in recommending clusters, with an improvement of 5-10% in workflow makespan depending on the type of workflow used.
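A minimal sketch of plain horizontal task clustering (illustrative only; the paper's hybrid algorithm additionally clusters vertically, guided by an impact factor): tasks at one workflow level are greedily grouped so that each cluster's total runtime stays under a threshold, cutting per-task scheduling overhead.

```python
def cluster_level(tasks, max_runtime):
    """tasks: list of (name, runtime) pairs at one horizontal workflow level.
    Greedily packs short tasks together, capping each cluster's total runtime."""
    clusters, current, total = [], [], 0.0
    for name, rt in sorted(tasks, key=lambda t: t[1]):  # shortest tasks first
        if current and total + rt > max_runtime:
            clusters.append(current)
            current, total = [], 0.0
        current.append(name)
        total += rt
    if current:
        clusters.append(current)
    return clusters

level = [("t1", 2), ("t2", 3), ("t3", 1), ("t4", 4)]
print(cluster_level(level, max_runtime=5))  # [['t3', 't1'], ['t2'], ['t4']]
```

Balancing cluster runtimes matters because the slowest cluster at a level gates the start of the next level.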
13

Jia, Nan, and Xiao Dong Fu. "A Method to Calculate the Process Similarity of the Manufacturing System Based on Tree Edit Distance." Advanced Materials Research 548 (July 2012): 699–703. http://dx.doi.org/10.4028/www.scientific.net/amr.548.699.

Abstract:
For various applications in today's workflow systems, such as process discovery or clustering, it is necessary to measure the distance between two workflow models. In this paper, we propose a method to calculate the distance between structured workflows based on tree edit distance. First, we transform workflows into structure trees, and then calculate the edit distance between the structure trees. Three properties of this workflow distance are proved: reflexivity, symmetry, and the triangle inequality. These properties allow the distance measure to be used as a quantitative tool in effective workflow model management activities. We illustrate the methodology with a case study that demonstrates its features.
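For intuition, a much-simplified top-down tree distance (not the full tree edit distance the paper builds on) can be computed by comparing root labels and aligning child lists with an edit-distance DP whose substitution cost is the recursive distance. Trees here are `(label, [children])` tuples; the structure-tree encoding of a workflow is assumed, not the paper's.

```python
def tree_size(t):
    return 1 + sum(tree_size(c) for c in t[1])

def dist(a, b):
    root = 0 if a[0] == b[0] else 1            # relabel cost at the root
    ca, cb = a[1], b[1]
    n, m = len(ca), len(cb)
    d = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = d[i - 1][0] + tree_size(ca[i - 1])   # delete whole subtree
    for j in range(1, m + 1):
        d[0][j] = d[0][j - 1] + tree_size(cb[j - 1])   # insert whole subtree
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d[i][j] = min(d[i - 1][j] + tree_size(ca[i - 1]),
                          d[i][j - 1] + tree_size(cb[j - 1]),
                          d[i - 1][j - 1] + dist(ca[i - 1], cb[j - 1]))
    return root + d[n][m]

seq1 = ("seq", [("A", []), ("B", []), ("C", [])])
seq2 = ("seq", [("A", []), ("X", []), ("C", [])])
print(dist(seq1, seq2))  # 1: relabel B -> X
print(dist(seq1, seq1))  # 0 (reflexivity)
```

Even this simplified variant exhibits the reflexivity and symmetry the paper proves for its measure.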
14

Singh, Gurmeet, Karan Vahi, Arun Ramakrishnan, Gaurang Mehta, Ewa Deelman, Henan Zhao, Rizos Sakellariou, et al. "Optimizing Workflow Data Footprint." Scientific Programming 15, no. 4 (2007): 249–68. http://dx.doi.org/10.1155/2007/701609.

Abstract:
In this paper we examine the issue of optimizing disk usage and scheduling large-scale scientific workflows onto distributed resources where the workflows are data-intensive, requiring large amounts of data storage, and the resources have limited storage resources. Our approach is two-fold: we minimize the amount of space a workflow requires during execution by removing data files at runtime when they are no longer needed and we demonstrate that workflows may have to be restructured to reduce the overall data footprint of the workflow. We show the results of our data management and workflow restructuring solutions using a Laser Interferometer Gravitational-Wave Observatory (LIGO) application and an astronomy application, Montage, running on a large-scale production grid, the Open Science Grid. We show that although reducing the data footprint of Montage by 48% can be achieved with dynamic data cleanup techniques, LIGO Scientific Collaboration workflows require additional restructuring to achieve a 56% reduction in data space usage. We also examine the cost of the workflow restructuring in terms of the application's runtime.
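The dynamic-cleanup idea can be sketched abstractly (a toy model that ignores scheduling and restructuring, with invented task and file names): after each task runs, delete any file that no remaining task still needs, and track the peak storage used.

```python
def simulate(order, inputs, outputs, sizes):
    """order: task execution order; inputs/outputs: task -> set of files;
    sizes: file -> size. Returns peak storage with eager cleanup."""
    remaining = {f: sum(f in inputs[t] for t in order) for f in sizes}
    live, used, peak = set(), 0, 0
    for t in order:
        for f in outputs[t]:               # task materialises its outputs
            live.add(f)
            used += sizes[f]
        peak = max(peak, used)
        for f in inputs[t]:                # consume, then delete dead files
            remaining[f] -= 1
            if remaining[f] == 0 and f in live:
                live.remove(f)
                used -= sizes[f]
    return peak

order = ["t1", "t2", "t3"]
inputs = {"t1": set(), "t2": {"a"}, "t3": {"b"}}
outputs = {"t1": {"a"}, "t2": {"b"}, "t3": {"c"}}
sizes = {"a": 10, "b": 5, "c": 5}
print(simulate(order, inputs, outputs, sizes))  # peak 15; keeping everything needs 20
```

On a chain like this, cleanup already helps; the paper's point is that for some workflows (like LIGO's) the DAG itself must also be restructured before cleanup can shrink the footprint enough.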
15

Nguyen, P., M. Hilario, and A. Kalousis. "Using Meta-mining to Support Data Mining Workflow Planning and Optimization." Journal of Artificial Intelligence Research 51 (November 29, 2014): 605–44. http://dx.doi.org/10.1613/jair.4377.

Abstract:
Knowledge Discovery in Databases is a complex process that involves many different data processing and learning operators. Today's Knowledge Discovery Support Systems can contain several hundred operators. A major challenge is to assist the user in designing workflows which are not only valid but, ideally, also optimize some performance measure associated with the user goal. In this paper we present such a system. The system relies on a meta-mining module which analyses past data mining experiments and extracts meta-mining models which associate dataset characteristics with workflow descriptors in view of workflow performance optimization. The meta-mining model is used within a data mining workflow planner, to guide the planner during the workflow planning. We learn the meta-mining models using a similarity learning approach, and extract the workflow descriptors by mining the workflows for generalized relational patterns, accounting also for domain knowledge provided by a data mining ontology. We evaluate the quality of the data mining workflows that the system produces on a collection of real world datasets coming from biology and show that it produces workflows that are significantly better than alternative methods that can only do workflow selection and not planning.
16

He, Pan, Jie Xu, Kai Gui Wu, and Jun Hao Wen. "A Dynamic Service Pool Size Configuration Mechanism for Service-Oriented Workflow." Advanced Materials Research 186 (January 2011): 499–504. http://dx.doi.org/10.4028/www.scientific.net/amr.186.499.

Abstract:
Service-oriented workflows are the fundamental structures in service-oriented applications, and changes in the workflow can cause dramatic changes in system reliability. Among the several ways to heal workflows during execution, resizing the service pools in the workflow is practical and easy to implement. In order to adjust quickly to workflow or environmental changes, this paper presents a dynamic service pool size configuration mechanism from the point of view of maintaining workflow reliability. An architecture-based reliability model is used to evaluate the overall reliability of a workflow with service pools, and an optimization method is proposed to find the combination of service pool sizes that minimizes the total pool size subject to the workflow reliability requirement. A case study illustrates the method, and experimental results show how to change service pool sizes to meet workflow reliability requirements.
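A toy version of this optimization, under assumptions that are ours rather than the paper's: a sequential workflow in which a pool of n replicas of a service with per-invocation reliability r succeeds with probability 1 - (1 - r)^n, and the pool that is the current reliability bottleneck is grown until the workflow-level target is met.

```python
def pool_rel(r, n):
    """Reliability of a pool of n independent replicas, each reliable with prob r."""
    return 1 - (1 - r) ** n

def min_pool_sizes(rels, target):
    """Greedily grow the weakest pool of a sequential workflow until the
    product of pool reliabilities reaches the target."""
    sizes = [1] * len(rels)

    def total():
        p = 1.0
        for r, n in zip(rels, sizes):
            p *= pool_rel(r, n)
        return p

    while total() < target:
        worst = min(range(len(rels)), key=lambda i: pool_rel(rels[i], sizes[i]))
        sizes[worst] += 1
    return sizes

print(min_pool_sizes([0.9, 0.95, 0.8], target=0.99))
```

Greedy growth of the bottleneck pool is a heuristic; the paper formulates the size combination as an explicit optimization problem over its architecture-based reliability model.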
17

Acuña, Ruben, Jacques Chomilier, and Zoé Lacroix. "Managing and Documenting Legacy Scientific Workflows." Journal of Integrative Bioinformatics 12, no. 3 (September 1, 2015): 65–87. http://dx.doi.org/10.1515/jib-2015-277.

Abstract:
Scientific legacy workflows are often developed over many years, poorly documented, and implemented with scripting languages. In the context of our cross-disciplinary projects we face the problem of maintaining such scientific workflows. This paper presents the Workflow Instrumentation for Structure Extraction (WISE) method, used to process several ad-hoc legacy workflows written in Python and automatically produce their structural workflow skeleton. Unlike many existing methods, WISE does not assume input workflows to be preprocessed in a known workflow formalism. It is also able to identify and analyze calls to external tools. We present the method and report its results on several scientific workflows.
18

Shao, Wei Ping, Chun Yan Wang, Yong Ping Hao, Peng Fei Zeng, and Xiao Lei Xu. "Ontology-Based Workflow Semantic Representation and Modeling Method." Advanced Materials Research 129-131 (August 2010): 50–54. http://dx.doi.org/10.4028/www.scientific.net/amr.129-131.50.

Abstract:
An ontology-based workflow (workflow-ontology) representation method was proposed, motivated by the analysis that a workflow model needs not only structural information but also semantic information. Workflow-ontology concepts were composed of the classes and subclasses of the workflow. The concepts' properties, including their values and characteristics, were redefined, and a workflow-ontology modeling method was then put forward based on these ontology expressions and definitions. Taking product examination and approval workflows as an example application, the corresponding workflow-ontology model (WFO) was built.
19

Abdul Aziz, Maslina, Jemal H. Abawajy, and Morshed Chowdhury. "Scheduling Workflow Applications with Makespan and Reliability Constraints." Indonesian Journal of Electrical Engineering and Computer Science 12, no. 2 (November 1, 2018): 482. http://dx.doi.org/10.11591/ijeecs.v12.i2.pp482-488.

Abstract:
In the last few years, workflows have become richer and more complex. A workflow scheduling management system needs to be robust and flexible, with multicriteria scheduling algorithms, and must satisfy Quality of Service (QoS) parameters. However, QoS parameters and workflow system objectives are often contradictory. In our analysis, we derive an efficient strategy to minimize the overall processing time for scheduling workflows modelled as Directed Acyclic Graphs (DAGs). We study the workflow scheduling problem with the goal of optimizing makespan and reliability. The proposed algorithm handles unsuccessful job execution or resource failure by dynamically rescheduling workflows to available resources. Based on the experimental results, our proposed Failure-Aware Workflow Scheduling (FAWS) algorithm can significantly improve the makespan and reliability by rescheduling failed tasks to unused resources. The effectiveness of the FAWS algorithm was validated through a simulation-driven analysis of the workflow application.
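The failure-aware rescheduling idea can be sketched as follows (the names and the round-robin preference are illustrative, not FAWS itself): when a task would fail on its assigned resource, it is rescheduled on another available resource instead of aborting the whole workflow.

```python
def run_with_rescheduling(tasks, resources, fails):
    """tasks: ordered task names; resources: resource names;
    fails: set of (task, resource) pairs that would fail on that pairing."""
    schedule = {}
    for i, t in enumerate(tasks):
        k = i % len(resources)
        candidates = resources[k:] + resources[:k]   # preferred resource first
        for r in candidates:                         # fall back on failure
            if (t, r) not in fails:
                schedule[t] = r
                break
        else:
            raise RuntimeError(f"no working resource for {t}")
    return schedule

sched = run_with_rescheduling(["t1", "t2", "t3"], ["r1", "r2"],
                              fails={("t2", "r2")})
print(sched)  # t2 is moved off its failing preferred resource r2 onto r1
```

A real scheduler would detect failures at runtime and weigh makespan when choosing the fallback resource; this sketch only shows the rescheduling fallback itself.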
20

VAN DER AALST, W. M. P. "THE APPLICATION OF PETRI NETS TO WORKFLOW MANAGEMENT." Journal of Circuits, Systems and Computers 08, no. 01 (February 1998): 21–66. http://dx.doi.org/10.1142/s0218126698000043.

Abstract:
Workflow management promises a new solution to an age-old problem: controlling, monitoring, optimizing and supporting business processes. What is new about workflow management is the explicit representation of the business process logic which allows for computerized support. This paper discusses the use of Petri nets in the context of workflow management. Petri nets are an established tool for modeling and analyzing processes. On the one hand, Petri nets can be used as a design language for the specification of complex workflows. On the other hand, Petri net theory provides for powerful analysis techniques which can be used to verify the correctness of workflow procedures. This paper introduces workflow management as an application domain for Petri nets, presents state-of-the-art results with respect to the verification of workflows, and highlights some Petri-net-based workflow tools.
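One classic Petri-net verification property for workflows is the "option to complete": from every reachable marking, the final marking must still be reachable. A brute-force sketch for small, bounded nets (arc weights fixed at 1 for simplicity; real tools use far more efficient analysis):

```python
def successors(marking, transitions):
    """Markings reachable in one firing. transitions: list of (pre, post) place sets."""
    out = []
    for pre, post in transitions:
        if all(marking.get(p, 0) >= 1 for p in pre):
            m = dict(marking)
            for p in pre:
                m[p] -= 1
            for p in post:
                m[p] = m.get(p, 0) + 1
            out.append(frozenset((k, v) for k, v in m.items() if v))
    return out

def option_to_complete(initial, final, transitions):
    """True iff every reachable marking can still reach the final marking."""
    seen, stack = set(), [frozenset(initial.items())]
    while stack:                                  # collect all reachable markings
        m = stack.pop()
        if m not in seen:
            seen.add(m)
            stack.extend(successors(dict(m), transitions))
    goal = frozenset(final.items())

    def reaches(m, visited):
        if m == goal:
            return True
        if m in visited:
            return False
        visited.add(m)
        return any(reaches(s, visited) for s in successors(dict(m), transitions))

    return all(reaches(m, set()) for m in seen)

good = [({"i"}, {"p"}), ({"p"}, {"o"})]          # i -> t1 -> p -> t2 -> o
print(option_to_complete({"i": 1}, {"o": 1}, good))                       # True
print(option_to_complete({"i": 1}, {"o": 1}, good + [({"i"}, {"d"})]))    # False: deadlock in d
```

This is one ingredient of the soundness notion for workflow nets; full soundness additionally requires proper completion and the absence of dead transitions.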
21

Lakhwani, Kamlesh, Gajanand Sharma, Ramandeep Sandhu, Naresh Kumar Nagwani, Sandeep Bhargava, Varsha Arya, and Ammar Almomani. "Adaptive and Convex Optimization-Inspired Workflow Scheduling for Cloud Environment." International Journal of Cloud Applications and Computing 13, no. 1 (June 21, 2023): 1–25. http://dx.doi.org/10.4018/ijcac.324809.

Full text
Abstract:
Scheduling large-scale and resource-intensive workflows in cloud infrastructure is one of the main challenges for cloud service providers (CSPs). Cloud infrastructure is more efficient when virtual machines and other resources work up to their full potential. The main factor that influences the quality of cloud services is the distribution of workflow on virtual machines (VMs). Scheduling tasks to VMs depends on the type of workflow and mechanism of resource allocation. Scientific workflows include large-scale data transfer and consume intensive resources of cloud infrastructures. Therefore, scheduling of tasks from scientific workflows on VMs requires efficient and optimized workflow scheduling techniques. This paper proposes an optimised workflow scheduling approach that aims to improve the utilization of cloud resources without increasing execution time and execution cost.
APA, Harvard, Vancouver, ISO, and other styles
22

Lu, Pingping, Gongxuan Zhang, Zhaomeng Zhu, Xiumin Zhou, Jin Sun, and Junlong Zhou. "A Review of Cost and Makespan-Aware Workflow Scheduling in Clouds." Journal of Circuits, Systems and Computers 28, no. 06 (June 12, 2019): 1930006. http://dx.doi.org/10.1142/s021812661930006x.

Full text
Abstract:
Scientific workflow is a common model to organize large scientific computations. It borrows the concept of workflow in business activities to manage the complicated processes in scientific computing automatically or semi-automatically. Workflow scheduling, which maps tasks in workflows to parallel computing resources, has been extensively studied over the years. In recent years, with the rise of cloud computing as a new large-scale distributed computing model, it is of great significance to study the workflow scheduling problem in the cloud. Compared with traditional distributed computing platforms, cloud platforms have unique characteristics such as the self-service resource management model and the pay-as-you-go billing model. Therefore, workflow scheduling in the cloud needs to be reconsidered. When scheduling workflows in clouds, the monetary cost and the makespan of the workflow executions concern both the cloud service providers (CSPs) and the customers. In this paper, we study a series of cost- and time-aware workflow scheduling algorithms in cloud environments, aiming to provide researchers with a choice of appropriate cloud workflow scheduling approaches for various scenarios. We conducted a broad review of different cloud workflow scheduling algorithms and categorized them based on their optimization objectives and constraints. We also discuss possible future research directions for cloud workflow scheduling.
APA, Harvard, Vancouver, ISO, and other styles
23

Dinçer, Sevde Gülizar, and Tuğrul Yazar. "A comparative analysis of the digital re-constructions of muqarnas systems: The case study of Sultanhanı muqarnas in Central Anatolia." International Journal of Architectural Computing 19, no. 3 (February 11, 2021): 360–85. http://dx.doi.org/10.1177/1478077121992487.

Full text
Abstract:
This paper presents a comparative case study on the digital modeling workflows of a particular muqarnas system. After the literature review and the definition of the context, several digital modeling workflows were described as element-based, tessellation-based and block-based workflows by using computer-aided design and parametric modeling software. As the case study of this research, these workflows were tested on a muqarnas design located at the Sultanhanı Caravanserai in Central Anatolia. The workflows were then compared according to three qualities: analytical, generative, and performative. The element-based workflow yields more analytical solutions for the study, whereas the tessellation-based workflow has more generative potential and the block-based workflow is more performative.
APA, Harvard, Vancouver, ISO, and other styles
24

Assuncao, Luis, Carlos Goncalves, and Jose C. Cunha. "Autonomic Workflow Activities." International Journal of Adaptive, Resilient and Autonomic Systems 5, no. 2 (April 2014): 57–82. http://dx.doi.org/10.4018/ijaras.2014040104.

Full text
Abstract:
Workflows have been successfully applied to express the decomposition of complex scientific applications. This has motivated many initiatives that have been developing scientific workflow tools. However, the existing tools still lack adequate support for important aspects, namely decoupling the enactment engine from the workflow task specification, decentralizing the control of workflow activities, and allowing their tasks to run autonomously in distributed infrastructures, for instance on Clouds. Furthermore, many workflow tools only support the execution of Directed Acyclic Graphs (DAGs) without the concept of iterations, in which activities are executed for millions of iterations over long periods of time, and without support for dynamic workflow reconfiguration after a certain iteration. We present the AWARD (Autonomic Workflow Activities Reconfigurable and Dynamic) model of computation, based on the Process Networks model, in which the workflow activities (AWAs) are autonomic processes with independent control that can run in parallel on distributed infrastructures, e.g., on Clouds. Each AWA executes a Task developed as a Java class that implements a generic interface, allowing end-users to code their applications without concern for low-level details. The data-driven coordination of AWA interactions is based on a shared tuple space that also supports dynamic workflow reconfiguration and monitoring of workflow execution. We describe how AWARD supports dynamic reconfiguration and discuss typical workflow reconfiguration scenarios. For evaluation, we describe experimental results of AWARD workflow executions in several application scenarios, mapped to a small dedicated cluster and the Amazon Elastic Compute Cloud (EC2).
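The shared tuple space that coordinates AWARD activities (implemented with Java tasks in the paper) can be sketched as a toy tuple space in which decoupled activities exchange tagged tuples:

```python
# Hypothetical tuple-space sketch: activities 'out' tuples under a tag and
# later 'take' them, decoupling producers from consumers in time and control.
# This is an illustration of the coordination style, not the AWARD code.
import queue

class TupleSpace:
    def __init__(self):
        self.spaces = {}

    def out(self, tag, value):
        """Deposit a tuple under the given tag."""
        self.spaces.setdefault(tag, queue.Queue()).put(value)

    def take(self, tag):
        """Withdraw (and remove) one tuple deposited under the tag."""
        return self.spaces[tag].get()

ts = TupleSpace()
ts.out("raw", [3, 1, 2])           # producer activity writes its output
data = ts.take("raw")              # consumer activity picks it up later
ts.out("sorted", sorted(data))
print(ts.take("sorted"))           # [1, 2, 3]
```

Because neither activity calls the other directly, either side can be replaced or reconfigured between iterations, which is the property the AWARD model exploits.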
APA, Harvard, Vancouver, ISO, and other styles
25

Zhang, Haoqi, Eric Horvitz, and David Parkes. "Automated Workflow Synthesis." Proceedings of the AAAI Conference on Artificial Intelligence 27, no. 1 (June 30, 2013): 1020–26. http://dx.doi.org/10.1609/aaai.v27i1.8681.

Full text
Abstract:
By coordinating efforts from humans and machines, human computation systems can solve problems that machines cannot tackle alone. A general challenge is to design efficient human computation algorithms or workflows with which to coordinate the work of the crowd. We introduce a method for automated workflow synthesis aimed at ideally harnessing human efforts by learning about the crowd's performance on tasks and synthesizing an optimal workflow for solving a problem. We present experimental results for human sorting tasks, which demonstrate the benefit of understanding and optimizing the structure of workflows based on observations. The results also demonstrate the benefits of using value of information to guide experiments for identifying efficient workflows with fewer experiments.
APA, Harvard, Vancouver, ISO, and other styles
26

Silva Junior, Daniel, Esther Pacitti, Aline Paes, and Daniel de Oliveira. "Provenance-and machine learning-based recommendation of parameter values in scientific workflows." PeerJ Computer Science 7 (July 5, 2021): e606. http://dx.doi.org/10.7717/peerj-cs.606.

Full text
Abstract:
Scientific Workflows (SWfs) have revolutionized how scientists in various domains of science conduct their experiments. The management of SWfs is performed by complex tools that provide support for workflow composition, monitoring, execution, capturing, and storage of the data generated during execution. In some cases, they also provide components to ease the visualization and analysis of the generated data. During the workflow's composition phase, programs must be selected to perform the activities defined in the workflow specification. These programs often require additional parameters that serve to adjust the program's behavior according to the experiment's goals. Consequently, workflows commonly have many parameters to be manually configured, in many cases encompassing more than one hundred. Choosing the wrong parameter values can crash workflow executions or produce undesired results. As the execution of data- and compute-intensive workflows is commonly performed in a high-performance computing environment (e.g., a cluster, a supercomputer, or a public cloud), an unsuccessful execution represents a waste of time and resources. In this article, we present FReeP (Feature Recommender from Preferences), a parameter value recommendation method designed to suggest values for workflow parameters, taking into account past user preferences. FReeP is based on Machine Learning techniques, particularly Preference Learning. FReeP is composed of three algorithms: two of them aim at recommending the value for one parameter at a time, and the third makes recommendations for n parameters at once. The experimental results obtained with provenance data from two widely used workflows showed FReeP's usefulness in recommending values for one parameter. Furthermore, the results indicate the potential of FReeP to recommend values for n parameters in scientific workflows.
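The provenance-based recommendation idea can be sketched with a toy recommender (hypothetical data and logic, not FReeP itself): suggest the target-parameter value that most often appeared in successful past runs agreeing with the user's already-fixed preferences.

```python
# Toy single-parameter recommender in the spirit of the approach described.
# The provenance records and parameter names below are made up.
from collections import Counter

def recommend(provenance, preferences, target):
    """Most frequent value of `target` among successful runs matching `preferences`."""
    matching = [
        run[target]
        for run in provenance
        if run["success"] and all(run[k] == v for k, v in preferences.items())
    ]
    if not matching:
        return None  # no comparable past run to learn from
    return Counter(matching).most_common(1)[0][0]

provenance = [
    {"aligner": "bwa", "threads": 8, "success": True},
    {"aligner": "bwa", "threads": 4, "success": True},
    {"aligner": "bwa", "threads": 8, "success": True},
    {"aligner": "bowtie", "threads": 16, "success": False},
]
print(recommend(provenance, {"aligner": "bwa"}, "threads"))  # 8
```

FReeP itself uses preference-learning models rather than frequency counting; the sketch only conveys the input/output shape of the problem.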
APA, Harvard, Vancouver, ISO, and other styles
27

Pfaff, Claas-Thido, Karin Nadrowski, Sophia Ratcliffe, Christian Wirth, and Helge Bruelheide. "Readable workflows need simple data." F1000Research 3 (May 14, 2014): 110. http://dx.doi.org/10.12688/f1000research.3940.1.

Full text
Abstract:
Sharing scientific analyses via workflows has great potential to improve the reproducibility of science as well as the communication of research results. This is particularly useful for trans-disciplinary research fields such as biodiversity-ecosystem functioning (BEF), where syntheses need to merge data ranging from genes to the biosphere. Here we argue that enabling simplicity at the very beginning of workflows, at the point of data description and merging, offers huge potential for reducing workflow complexity and fostering data and workflow reuse. We illustrate our points using a typical analysis in BEF research, the aggregation of carbon pools in a forest ecosystem. We introduce indicators for the complexity of workflow components, including data sources. We show that workflow complexity decreases exponentially during the course of the analysis and that simple text-based measures help to identify bottlenecks in a workflow and to group workflow components according to tasks. We thus suggest that focusing on simplifying the steps of data aggregation and imputation will greatly improve workflow readability and thus reproducibility. Providing feedback to data providers about the complexity of their datasets may help to produce better-focused data that can be used more easily in further studies. At the same time, providing feedback about the complexity of workflow components may help in exchanging shorter and simpler workflows for easier reuse. Additionally, identifying repetitive tasks informs software development in providing automated solutions. We discuss current initiatives in software and script development that implement quality control for simplicity and social tools for script valuation. Taken together, we argue that focusing on simplifying data sources and workflow components will improve and accelerate data and workflow reuse and simplify the reproducibility of data-driven science.
APA, Harvard, Vancouver, ISO, and other styles
28

Bahsi, Emir M., Emrah Ceyhan, and Tevfik Kosar. "Conditional Workflow Management: A Survey and Analysis." Scientific Programming 15, no. 4 (2007): 283–97. http://dx.doi.org/10.1155/2007/680291.

Full text
Abstract:
Workflows form an essential part of process execution, both on a single machine and in distributed environments. Although providing conditional structures is not mandatory for a workflow management system, support for conditional workflows is very important in terms of error handling, flexibility and robustness. Several of the existing workflow management systems already support conditional structures via the use of different constructs. In this paper, we study the most widely used workflow management systems and their support for conditional structures such as if, switch, and while. We compare the implementation of common conditional structures in each of these workflow management systems via case studies, and discuss the capabilities of each system.
APA, Harvard, Vancouver, ISO, and other styles
29

Xu, Hong Zhen, Bin Tang, Ying Gui, and Huai Ping Wang. "A Dynamic Workflow Management Model Based on Web Services." Key Engineering Materials 439-440 (June 2010): 599–604. http://dx.doi.org/10.4028/www.scientific.net/kem.439-440.599.

Full text
Abstract:
Workflow technology has emerged as one of those technologies designed to support modeling, designing and executing business processes. One of the major limitations of current workflow management systems is the lack of flexibility to support dynamic management of workflows. In this paper, we propose a dynamic workflow management model based on web services. We integrate web services and ontology technologies to support the dynamic specification, monitoring, analysis, design, configuration and execution of workflows. We explain the need for and functionality of the main modules and interfaces of the model, and introduce its application in a case study. An important feature of this model is its support for planning and adaptive workflow management.
APA, Harvard, Vancouver, ISO, and other styles
30

Wen, Yiping, Junjie Hou, Zhen Yuan, and Dong Zhou. "Heterogeneous Information Network-Based Scientific Workflow Recommendation for Complex Applications." Complexity 2020 (March 19, 2020): 1–16. http://dx.doi.org/10.1155/2020/4129063.

Full text
Abstract:
Scientific workflow is a valuable tool for various complicated large-scale data processing applications. In recent years, the rapidly growing number of scientific processes available necessitates the development of recommendation techniques to provide automatic support for modelling scientific workflows. In this paper, with the help of heterogeneous information networks (HINs) and tags of scientific workflows, we organize scientific workflows as a HIN and propose a novel scientific workflow similarity computation method based on metapaths. In addition, the density peak clustering (DPC) algorithm is introduced into the recommendation process and a scientific workflow recommendation approach named HDSWR is proposed. The effectiveness and efficiency of our approach are evaluated by extensive experiments with real-world scientific workflows.
APA, Harvard, Vancouver, ISO, and other styles
31

Chen, Wei Zeng. "The Operator Workflow Analysis and Skill Standards Determined for Multi-Variety and Small Batch Processing Post." Applied Mechanics and Materials 220-223 (November 2012): 112–16. http://dx.doi.org/10.4028/www.scientific.net/amm.220-223.112.

Full text
Abstract:
Traditional methods of workflow design and skill standards fail to meet the demands characteristic of multi-variety, small-batch processing. In this paper, operator workflow analysis is divided into three parts: a common content workflow, a characteristic content workflow, and a temporary content workflow. The common content workflow was optimized using the classic work-study method, while a scene team was chosen for the other two modules. A new analysis method for operator workflows was established, taking Crombag's skill definitions as its basis and the essential factors as its objects of analysis. The newly designed modular workflows divide the standard work contents of multi-variety, small-batch processing, making post work orderly and highly effective. Their rational and objective basis reduces subjective factors and provides support for operator training and recruitment.
APA, Harvard, Vancouver, ISO, and other styles
32

Wu, Na, Decheng Zuo, and Zhan Zhang. "Dynamic Fault-Tolerant Workflow Scheduling with Hybrid Spatial-Temporal Re-Execution in Clouds." Information 10, no. 5 (May 5, 2019): 169. http://dx.doi.org/10.3390/info10050169.

Full text
Abstract:
Improving reliability is one of the major concerns of scientific workflow scheduling in clouds. The ever-growing computational complexity and data size of workflows present challenges to fault-tolerant workflow scheduling. Therefore, it is essential to design a cost-effective fault-tolerant scheduling approach for large-scale workflows. In this paper, we propose a dynamic fault-tolerant workflow scheduling (DFTWS) approach with hybrid spatial and temporal re-execution schemes. First, DFTWS calculates the time attributes of tasks and identifies the critical path of the workflow in advance. Then, DFTWS assigns an appropriate virtual machine (VM) to each task according to the task urgency and budget quota in the phase of initial resource allocation. Finally, DFTWS performs online scheduling, making real-time fault-tolerant decisions based on failure type and task criticality throughout workflow execution. The proposed algorithm is evaluated on real-world workflows. Furthermore, the factors that affect the performance of DFTWS are analyzed. The experimental results demonstrate that DFTWS achieves a trade-off between the objectives of high reliability and low cost in cloud computing environments.
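The "identify the critical path in advance" step mentioned above can be sketched as a longest-path computation over the task DAG (illustrative task names and runtimes; not the DFTWS code):

```python
# Critical-path sketch: for each task, the earliest finish time is the
# maximum finish time of its predecessors plus its own runtime; walking
# back along the best predecessors yields the critical path.
def critical_path(deps, runtime):
    order, seen = [], set()
    def visit(t):
        if t not in seen:
            seen.add(t)
            for p in deps[t]:
                visit(p)
            order.append(t)
    for t in deps:
        visit(t)
    finish, best_pred = {}, {}
    for t in order:
        start, bp = 0.0, None
        for p in deps[t]:
            if finish[p] > start:
                start, bp = finish[p], p
        finish[t] = start + runtime[t]
        best_pred[t] = bp
    t = max(finish, key=finish.get)  # task finishing last
    path = []
    while t is not None:
        path.append(t)
        t = best_pred[t]
    return list(reversed(path)), max(finish.values())

# Diamond-shaped workflow: A -> (B, C) -> D
deps = {"A": set(), "B": {"A"}, "C": {"A"}, "D": {"B", "C"}}
runtime = {"A": 2.0, "B": 3.0, "C": 5.0, "D": 1.0}
path, length = critical_path(deps, runtime)
print(path, length)  # ['A', 'C', 'D'] 8.0
```

Tasks on this path get priority in resource allocation, since delaying any of them delays the whole workflow.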
APA, Harvard, Vancouver, ISO, and other styles
33

Belhajjame, Khalid. "On the Anonymization of Workflow Provenance without Compromising the Transparency of Lineage." Journal of Data and Information Quality 14, no. 1 (March 31, 2022): 1–27. http://dx.doi.org/10.1145/3460207.

Full text
Abstract:
Workflows have been adopted in several scientific fields as a tool for the specification and execution of scientific experiments. In addition to automating the execution of experiments, workflow systems often include capabilities to record provenance information, which contains, among other things, data records used and generated by the workflow as a whole but also by its component modules. It is widely recognized that provenance information can be useful for the interpretation, verification, and re-use of workflow results, justifying its sharing and publication among scientists. However, workflow execution in some branches of science can manipulate sensitive datasets that contain information about individuals. To address this problem, we investigate, in this article, the problem of anonymizing the provenance of workflows. In doing so, we consider a popular class of workflows in which component modules use and generate collections of data records as a result of their invocation, as opposed to a single data record. The solution we propose offers guarantees of confidentiality without compromising lineage information, which provides transparency as to the relationships between the data records used and generated by the workflow modules. We provide algorithmic solutions that show how the provenance of a single module and an entire workflow can be anonymized and present the results of experiments that we conducted for their evaluation.
APA, Harvard, Vancouver, ISO, and other styles
34

Held, Markus, Wolfgang Küchlin, and Wolfgang Blochinger. "MoBiFlow." International Journal of Service Science, Management, Engineering, and Technology 2, no. 4 (October 2011): 67–78. http://dx.doi.org/10.4018/ijssmet.2011100107.

Full text
Abstract:
Web-based problem solving environments provide sharing, execution and monitoring of scientific workflows. Where they depend on general-purpose workflow development systems, the workflow notations are likely far too powerful and complex, especially in the area of biology, where programming skills are rare. On the other hand, application-specific workflow systems may use special-purpose languages and execution engines, suffering from a lack of standards, portability, documentation, stability of investment, etc. In both cases, the need to support yet another application on the desktop places a burden on the system administration of a research lab. In previous research, the authors have developed the web-based workflow systems Calvin and Hobbes, which enable biologists and computer scientists to approach these problems in collaboration. Both systems use a server-centric, Web 2.0 based approach. Calvin is tailored to molecular biology applications, with a simple graphical workflow language and easy access to existing BioMoby web services. Calvin workflows are compiled to industry-standard BPEL workflows, which can be edited and refined in collaboration between researchers and computer scientists using the Hobbes tool. Together, Calvin and Hobbes form our workflow platform MoBiFlow, whose principles, design, and use cases are described in this paper.
APA, Harvard, Vancouver, ISO, and other styles
35

Represa, Jaime Garcia, Felix Larrinaga, Pal Varga, William Ochoa, Alain Perez, Dániel Kozma, and Jerker Delsing. "Investigation of Microservice-Based Workflow Management Solutions for Industrial Automation." Applied Sciences 13, no. 3 (January 31, 2023): 1835. http://dx.doi.org/10.3390/app13031835.

Full text
Abstract:
In an era ruled by data and information, engineers need new tools to cope with the increased complexity of industrial operations. New architectural models for industry enable open communication environments, where workflows can play a major role in providing flexible and dynamic interactions between systems. Workflows help engineers maintain precise control over their factory equipment and Information Technology (IT) services, from the initial design stages to plant operations. The current application of workflows departs from the classic business workflows that focus on office automation systems in favor of a manufacturing-oriented approach that involves direct interaction with cyber-physical systems (CPSs) on the shop floor. This paper identifies relevant industry-related challenges that hinder the adoption of workflow technology, which are classified within the context of a cohesive workflow lifecycle. The classification compares the various workflow management solutions and systems used to monitor and execute workflows. These solutions have been developed alongside the Eclipse Arrowhead framework, which provides a common infrastructure for designing systems according to the microservice architectural principles. This paper investigates and compares various solutions for workflow management and execution in light of the associated industrial requirements. Further, it compares various microservice-based approaches and their implementation. The objective is to support industrial stakeholders in their decision-making with regard to choosing among workflow management solutions.
APA, Harvard, Vancouver, ISO, and other styles
36

Shang, Shi Feng, Jing He Huo, and Zeng Zhang. "QBARM: A Queue Theory-Based Adaptive Resource Usage Model." Advanced Materials Research 756-759 (September 2013): 2523–27. http://dx.doi.org/10.4028/www.scientific.net/amr.756-759.2523.

Full text
Abstract:
Workflow is becoming an increasingly important tool for business operations, scientific research and engineering. Cloud computing provides an elastic, on-demand and highly cost-efficient resource allocation model for workflow executions. During workflow execution, the load changes from time to time and, therefore, it becomes an interesting topic to optimize the resource utilization of workflows in the cloud computing environment. In this paper, a workflow framework is proposed that can adaptively use cloud resources. In detail, after users specify the desired goal to achieve, the proposed workflow framework monitors the workflow execution and utilizes different pricing models to acquire cloud resources according to changes in the workflow load. In this way, the cost of workflow execution is reduced.
APA, Harvard, Vancouver, ISO, and other styles
37

McPhillips, Timothy, Tianhong Song, Tyler Kolisnik, Steve Aulenbach, Khalid Belhajjame, R. Kyle Bocinsky, Yang Cao, et al. "YesWorkflow: A User-Oriented, Language-Independent Tool for Recovering Workflow Information from Scripts." International Journal of Digital Curation 10, no. 1 (May 21, 2015): 298–313. http://dx.doi.org/10.2218/ijdc.v10i1.370.

Full text
Abstract:
Scientific workflow management systems offer features for composing complex computational pipelines from modular building blocks, executing the resulting automated workflows, and recording the provenance of data products resulting from workflow runs. Despite the advantages such features provide, many automated workflows continue to be implemented and executed outside of scientific workflow systems due to the convenience and familiarity of scripting languages (such as Perl, Python, R, and MATLAB), and to the high productivity many scientists experience when using these languages. YesWorkflow is a set of software tools that aim to provide such users of scripting languages with many of the benefits of scientific workflow systems. YesWorkflow requires neither the use of a workflow engine nor the overhead of adapting code to run effectively in such a system. Instead, YesWorkflow enables scientists to annotate existing scripts with special comments that reveal the computational modules and dataflows otherwise implicit in these scripts. YesWorkflow tools extract and analyze these comments, represent the scripts in terms of entities based on the typical scientific workflow model, and provide graphical renderings of this workflow-like view of the scripts. Future versions of YesWorkflow will also allow the prospective provenance of the data products of these scripts to be queried in ways similar to those available to users of scientific workflow systems.
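An annotated script in the style YesWorkflow describes might look as follows (the @begin/@in/@out/@end comment tags follow YesWorkflow's convention; the script body is a made-up example, and since the tool parses only the comments, the script runs unchanged):

```python
# @begin clean_temperatures
# @in raw_readings
# @out mean_celsius

def clean_temperatures(raw_readings):
    # @begin drop_invalid @in raw_readings @out valid
    # Keep only physically plausible readings (hypothetical threshold).
    valid = [r for r in raw_readings if -60.0 <= r <= 60.0]
    # @end drop_invalid

    # @begin average @in valid @out mean_celsius
    mean_celsius = sum(valid) / len(valid)
    # @end average
    return mean_celsius

# @end clean_temperatures

print(clean_temperatures([12.0, 999.0, 14.0]))  # 13.0
```

From such tags, YesWorkflow can render a dataflow graph with two modules (drop_invalid, average) connected by the `valid` data item, without ever executing the script.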
APA, Harvard, Vancouver, ISO, and other styles
38

SMANCHAT, Sucha, and Kanchana VIRIYAPANT. "Scheduling Dynamic Parallel Loop Workflow in Cloud Environment." Walailak Journal of Science and Technology (WJST) 15, no. 1 (August 4, 2016): 19–27. http://dx.doi.org/10.48048/wjst.2018.2267.

Full text
Abstract:
Scientific workflows have been employed to automate large-scale scientific experiments by leveraging computational power provided on demand by cloud computing platforms. Among these workflows, a parallel loop workflow is used for studying the effects of different input values of a scientific experiment. Because of its independent loop characteristic, a parallel loop workflow can be dynamically executed as parallel workflow instances to accelerate execution. Such execution precludes the workflow traversal used in existing works to calculate execution time and cost during scheduling in order to maintain time and cost constraints. In this paper, we propose a novel scheduling technique that is able to handle dynamic parallel loop workflow execution through a new method for evaluating execution progress, together with a workflow instance arrival control and a cloud resource adjustment mechanism. The proposed technique, which aims at maintaining a workflow deadline while reducing cost, is tested using three existing task scheduling heuristics as its task mapping strategies. The simulation results show that the proposed technique is practical and performs better when the time constraint is more relaxed. It also favors task scheduling heuristics that allow for a more accurate progress evaluation.
APA, Harvard, Vancouver, ISO, and other styles
39

WALKER, CORAL, DASHAN LU, and DAVID W. WALKER. "AUTOMATIC PORTAL GENERATION BASED ON WORKFLOW DESCRIPTION." Parallel Processing Letters 21, no. 02 (June 2011): 155–71. http://dx.doi.org/10.1142/s012962641100014x.

Full text
Abstract:
Distributed scientific and engineering computations on service-oriented architectures are often represented as data-driven workflows. Workflows are a convenient abstraction that allows users to compose applications in a visual programming environment, and execute them by means of a workflow execution engine. For a large class of scientific applications web-based portals can provide a user-friendly problem-solving environment that hides the complexities of executing workflow applications in a distributed environment. However, the creation and configuration of an application portal requires considerable expertise in portal technologies, which scientific end-users generally do not have. To address this problem this paper presents tools for automatically converting a workflow into a fully configured portal which can then be used to execute the workflow.
APA, Harvard, Vancouver, ISO, and other styles
40

Stromeyer, Sofia, Daniel Wiedemeier, Albert Mehl, and Andreas Ender. "Time Efficiency of Digitally and Conventionally Produced Single-Unit Restorations." Dentistry Journal 9, no. 6 (June 1, 2021): 62. http://dx.doi.org/10.3390/dj9060062.

Full text
Abstract:
The purpose of this in vitro study was to compare the time efficiency of digital chairside and labside workflows with a conventional workflow for single-unit restorations. Time efficiency in this specific sense was defined as the time that has to be spent in a dental office by a dental professional performing the relevant steps. A model with interchangeable teeth at position 36 was created. These teeth were prepared differently, corresponding to several clinical situations, in order to perform single-unit restorations. Different manufacturing techniques were used: for the digital workflows, CEREC Omnicam (CER) and Trios 3 (TN/TI) were used. The conventional workflow, using a dual-arch tray impression technique, served as the control group. For the labside workflow (_L) and the conventional impression procedure (CO), the time necessary for the impressions and temporary restorations was recorded and served as the operating time. The chairside workflow time was recorded both for the entire workflow (_C), including scan, design, milling and finishing the milled restoration, and for the actual working time (_CW), leaving out the chairside milling of the restoration. Labside workflow time ranged from 9 min 27 s (CER_L) to 12 min 41 s (TI_L). Entire chairside time ranged from 43 min 35 s (CER_C) to 58 min 43 s (TI_C). Pure chairside working time ranged from 15 min 21 s (CER_CW) to 23 min 17 s (TI_CW). Conventional workflow time was 10 min 39 s (CO) on average. The digital labside workflow and the conventional workflow require a similar amount of time; the digital chairside workflow is more time-consuming.
APA, Harvard, Vancouver, ISO, and other styles
41

Bukhari, S. Sabahat H., and Yunni Xia. "A Novel Completion-Time-Minimization Scheduling Approach of Scientific Workflows Over Heterogeneous Cloud Computing Systems." International Journal of Web Services Research 16, no. 4 (October 2019): 1–20. http://dx.doi.org/10.4018/ijwsr.2019100101.

Full text
Abstract:
The cloud computing paradigm provides an ideal platform for supporting large-scale scientific-workflow-based applications over the internet. However, the scheduling and execution of scientific workflows still face various challenges, such as cost and response time management, which aim at handling the acquisition delays of physical servers and minimizing the overall completion time of workflows. A careful investigation of existing methods shows that most approaches consider static performance of physical machines (PMs) and ignore the impact of resource acquisition delays in their scheduling models. In this article, the authors present a meta-heuristic-based method for scheduling scientific workflows aimed at reducing workflow completion time through appropriately managing the acquisition and transmission delays required for inter-PM communications. The authors also carry out extensive case studies based on real-world commercial clouds and multiple workflow templates. Experimental results clearly show that the proposed method outperforms state-of-the-art ones such as ICPCP, CEGA, and JIT-C in terms of workflow completion time.
42

Khan, Fakhri Alam, Sardar Hussain, Ivan Janciak, and Peter Brezany. "Towards Next Generation Provenance Systems for e-Science." International Journal of Information System Modeling and Design 2, no. 3 (July 2011): 24–48. http://dx.doi.org/10.4018/jismd.2011070102.

Abstract:
e-Science helps scientists automate scientific discovery processes and experiments, and promotes collaboration across organizational boundaries and disciplines. These experiments involve data discovery, knowledge discovery, integration, linking, and analysis through different software tools and activities. Scientific workflow is one technique through which such activities and processes can be interlinked, automated, and ultimately shared amongst collaborating scientists. Workflows are realized by the workflow enactment engine, which interprets the process definition and interacts with the workflow participants. Since workflows are typically executed on a shared and distributed infrastructure, the information on the workflow activities, the data processed, and the results generated (also known as provenance) needs to be recorded in order for runs to be reproduced and reused. A range of solutions and techniques have been suggested for provenance collection and analysis; however, these are predominantly dependent on the workflow enactment engine and the application domain. This paper includes a taxonomy of existing provenance techniques and a novel solution named VePS (the Vienna e-Science Provenance System) for e-Science provenance collection.
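The kind of record such systems collect can be sketched in a few lines. The following is an illustrative, engine-independent provenance log (an assumption for explanation, not the VePS API): each workflow activity is wrapped so that every call records the activity name, its inputs, its outputs, and a timestamp, which is the minimum needed to inspect or replay a run.

```python
# Illustrative sketch only (not the VePS design): a minimal provenance log
# that records each workflow activity together with its inputs and outputs.
import time

provenance = []  # append-only run log

def traced(fn):
    """Wrap a workflow activity so every invocation is recorded."""
    def wrapper(*args):
        result = fn(*args)
        provenance.append({
            "activity": fn.__name__,
            "inputs": args,
            "outputs": result,
            "timestamp": time.time(),
        })
        return result
    return wrapper

@traced
def normalize(values):
    """Example activity: scale a list of numbers so they sum to 1."""
    total = sum(values)
    return [v / total for v in values]
```

After calling `normalize([1, 1, 2])`, the log holds one entry naming the activity, its input tuple, and the output list, independent of any particular enactment engine.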
43

Cravo, Glória. "Workflow Modelling and Analysis Based on the Construction of Task Models." Scientific World Journal 2015 (2015): 1–7. http://dx.doi.org/10.1155/2015/481767.

Abstract:
In this paper we describe the structure of a workflow as a graph whose vertices represent tasks and whose arcs are associated with workflow transitions. An input/output logic operator is associated with each task, and a Boolean term with each transition present in the workflow. We further identify the structure of workflows and describe their dynamism through the construction of new task models. This construction is simple and intuitive, since it is based on the analysis of all tasks present in the workflow, which allows us to describe the dynamism of the workflow very easily; this intuitiveness is an important highlight of our work. We also introduce the concept of logical termination of workflows and provide conditions under which this property holds. Finally, we provide a counter-example showing that a conjecture presented in a previous article is false.
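The graph model the abstract describes can be sketched directly. The encoding below is an assumption for illustration, not the paper's formal construction: arcs are (source task, target task, Boolean term) triples, and given a truth assignment for the workflow's variables, only the arcs whose terms hold are fired.

```python
# Minimal structural sketch (assumed encoding, not the paper's model):
# a workflow as a directed graph whose vertices are tasks, with each arc
# carrying a Boolean term that gates the corresponding transition.

arcs = [
    ("check_order", "ship",    lambda env: env["in_stock"]),
    ("check_order", "reorder", lambda env: not env["in_stock"]),
    ("ship",        "invoice", lambda env: True),
]

def enabled_successors(task, env):
    """Tasks reachable from `task` via transitions whose Boolean term holds in `env`."""
    return [dst for src, dst, term in arcs if src == task and term(env)]
```

With `{"in_stock": True}` the only enabled transition out of `check_order` is the one to `ship`, which is the XOR-split behavior an input/output logic operator would impose at that vertex.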
44

HAI NGUYEN, THANH, ENRICO PONTELLI, and TRAN CAO SON. "Phylotastic: An Experiment in Creating, Manipulating, and Evolving Phylogenetic Biology Workflows Using Logic Programming." Theory and Practice of Logic Programming 18, no. 3-4 (July 2018): 656–72. http://dx.doi.org/10.1017/s1471068418000236.

Abstract:
Evolutionary biologists have long struggled with the challenge of developing analysis workflows in a flexible manner that facilitates the reuse of phylogenetic knowledge. An evolutionary biology workflow can be viewed as a plan which composes web services that can retrieve, manipulate, and produce phylogenetic trees. The Phylotastic project was launched two years ago as a collaboration between evolutionary biologists and computer scientists, with the goal of developing an open architecture to facilitate the creation of such analysis workflows. While the composition of web services is a problem that has been extensively explored in the literature, including within the logic programming domain, the incarnation of the problem in Phylotastic provides a number of additional challenges. Along with the need to integrate preferences and formal ontologies in the description of the desired workflow, evolutionary biologists tend to construct workflows in an incremental manner, successively refining the workflow by indicating desired changes (e.g., exclusion of certain services, modifications of the desired output). This leads to the need for successive iterations of incremental replanning, to develop a new workflow that integrates the requested changes while minimizing the changes to the original workflow. This paper illustrates how Phylotastic has addressed the challenges of creating and refining phylogenetic analysis workflows using logic programming technology and how such solutions have been used within the general framework of the Phylotastic project.
45

Hung, Ling-Hong, Jiaming Hu, Trevor Meiss, Alyssa Ingersoll, Wes Lloyd, Daniel Kristiyanto, Yuguang Xiong, Eric Sobie, and Ka Yee Yeung. "Building Containerized Workflows Using the BioDepot-Workflow-Builder." Cell Systems 9, no. 5 (November 2019): 508–14. http://dx.doi.org/10.1016/j.cels.2019.08.007.

46

Dankerl, Peter, Matthias Stefan May, Christian Canstein, Michael Uder, and Marc Saake. "Cutting Staff Radiation Exposure and Improving Freedom of Motion during CT Interventions: Comparison of a Novel Workflow Utilizing a Radiation Protection Cabin versus Two Conventional Workflows." Diagnostics 11, no. 6 (June 16, 2021): 1099. http://dx.doi.org/10.3390/diagnostics11061099.

Abstract:
This study aimed to evaluate the radiation exposure to the radiologist and the procedure time of prospectively matched CT interventions implementing three different workflows, with the radiologist (I) leaving the CT room during scanning; (II) wearing a lead apron and staying in the CT room; or (III) staying in the CT room in a prototype radiation protection cabin without a lead apron while utilizing a wireless remote control and a tablet. We prospectively evaluated the radiologist's radiation exposure (using an electronic personal dosimeter), the intervention time, and the success of CT interventions matched to the three workflows. We compared the interventional success, the patient's dose from the interventional scans in each workflow (total mAs and total DLP), the radiologist's personal dose (in µSv), and the intervention time. To perform workflow III, a prototype radiation protection cabin, with 3 mm lead-equivalent walls and a foot switch to operate the doors, was built in the CT examination room. Radiation exposure during maximum tube output at 120 kV, measured by the local admission officials inside the cabin, was at the same level as in the technician's control room (below 0.5 μSv/h and 1 mSv/y). Furthermore, to utilize the full potential of this novel workflow, a sterile-packed remote control (to move the CT table and trigger the radiation) and a sterile-packed tablet anchored on the CT table (to plan and navigate during the CT intervention) were operated by the radiologist. There were 18 interventions performed in workflow I, 16 in workflow II, and 27 in workflow III. There were no significant differences in the intervention time (workflow I: 23 min ± 12, workflow II: 20 min ± 8, workflow III: 21 min ± 10, p = 0.71) or the patient's dose (total DLP, p = 0.14). However, the personal dosimeter registered 0.17 ± 0.22 µSv for workflow II, while workflows I and III both documented 0 µSv, a significant difference (p < 0.001).
All workflows were performed completely and successfully in all cases. The new workflow has the potential to reduce interventional CT radiologists' radiation dose to zero while relieving them from working in a lead apron all day.
47

Hanoosh, Zaid. "A Survey on Multiple Workflow Scheduling Algorithms in Cloud Environment." Al-Furat Journal of Innovations in Electronics and Computer Engineering 3, no. 1 (April 20, 2024): 36–44. http://dx.doi.org/10.46649/fjiece.v3.1.4a.13.4.2024.

Abstract:
The workflow approach is a standard for representing processes and their execution. With the advent of e-science, more complex workflows with greater processing requirements have been created. New distributed systems, such as grid systems and computing clouds, allow users to access heterogeneous resources that are geographically distributed and to execute their workflow tasks on them. The simultaneous receipt and execution of several workflows is therefore to be expected, so any discussion of scheduling algorithms must consider how multiple workflows are executed on a shared resource set. Improving the execution of multiple workflows can accelerate the process of obtaining results when processes are sent to the cloud. In this paper, we first discuss the classification of multiple workflow scheduling algorithms, then briefly describe the scheduling algorithms for the cloud environment, and finally compare the surveyed algorithms with one another.
48

Blischak, John D., Peter Carbonetto, and Matthew Stephens. "Creating and sharing reproducible research code the workflowr way." F1000Research 8 (October 14, 2019): 1749. http://dx.doi.org/10.12688/f1000research.20843.1.

Abstract:
Making scientific analyses reproducible, well documented, and easily shareable is crucial to maximizing their impact and ensuring that others can build on them. However, accomplishing these goals is not easy, requiring careful attention to organization, workflow, and familiarity with tools that are not a regular part of every scientist's toolbox. We have developed an R package, workflowr, to help all scientists, regardless of background, overcome these challenges. Workflowr aims to instill a particular "workflow" (a sequence of steps to be repeated and integrated into research practice) that helps make projects more reproducible and accessible. This workflow integrates four key elements: (1) version control (via Git); (2) literate programming (via R Markdown); (3) automatic checks and safeguards that improve code reproducibility; and (4) sharing code and results via a browsable website. These features exploit powerful existing tools whose mastery would take considerable study, yet the workflowr interface is simple enough that novice users can quickly enjoy its many benefits. By simply following the workflowr "workflow", R users can create projects whose results, figures, and development history are easily accessible on a static website, conveniently shareable with collaborators by sending them a URL, and accompanied by source code and reproducibility safeguards. The workflowr R package is open source and available on CRAN, with full documentation and source code available at https://github.com/jdblischak/workflowr.
49

Khalfallah, Malik, and Parisa Ghodous. "CODVerif." International Journal of Systems and Service-Oriented Engineering 12, no. 1 (January 1, 2022): 1–23. http://dx.doi.org/10.4018/ijssoe.315582.

Abstract:
CODVerif is an approach that aims to continuously verify the data being inserted into a data store. CODVerif leverages the combination of ontology and workflow technologies to define workflows that are specific to the domain of "monitoring data insertion." These domain-specific workflows are constrained on two dimensions: (1) they use a set of workflow elements that are specific to the "monitoring data insertion" domain, and (2) the logic that these workflows support is predefined, relying on a set of common data insertion scenarios. Nevertheless, CODVerif is flexible enough to allow users to define continuous data verification workflows with more complex logic, thanks to workflow operators that can be applied to the domain-specific workflows. To illustrate the applicability of CODVerif, the authors deploy it in a customer relationship management (CRM) application and show how it supports users in verifying the data they populate in the CRM. They have also evaluated the CODVerif approach.
50

Lai, Yi Feng, Yan Ru Tan, Shu Xin Oh, Xianqi Wang, and Angelina Hui Min Tan. "Putting pharmacists in specialist outpatient clinics: a pilot approach to integrate services under one roof." Proceedings of Singapore Healthcare 27, no. 4 (February 22, 2018): 290–93. http://dx.doi.org/10.1177/2010105818760049.

Abstract:
The patient journey with multiple stopovers has long been recognized as suboptimal, and attempts to co-locate pharmacists with physicians have been explored in various healthcare systems to integrate processes and improve patient experience. This prospective study aims to evaluate and compare process efficiency between a decentralized prescription review workflow (intervention) and a conventional prescription filling workflow (control). Both workflows were concurrently assessed in selected specialist outpatient clinics. The outcome measure was end-to-end prescription processing time in the intervention and control workflows. A total of 1117 complete prescription time-motion records entered the analysis. There was a significant reduction in patients' waiting time of approximately 25% (803.6 ± 409.0 s vs 618.6 ± 468.3 s, p < 0.001). For patients undergoing the intervention workflow, instant collection of medication was achieved 96% of the time. However, no reduction in dispensing time was observed in the intervention arm compared with the control workflow. The findings may support further modification and implementation of the decentralized workflow in other healthcare institutions, in order to realize team-based, patient-centered care that ensures the timely supply of medications optimized for the purpose of treatment.