Academic literature on the topic 'Multiple-stage processes'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Multiple-stage processes.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Multiple-stage processes"

1

Meikrantz, D. H., S. B. Meikrantz, and L. L. Macaluso. "Annular Centrifugal Contactors for Multiple Stage Extraction Processes." Chemical Engineering Communications 188, no. 1 (October 2001): 115–27. http://dx.doi.org/10.1080/00986440108912900.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Jearkpaporn, D., C. M. Borror, G. C. Runger, and D. C. Montgomery. "Process monitoring for mean shifts for multiple stage processes." International Journal of Production Research 45, no. 23 (December 2007): 5547–70. http://dx.doi.org/10.1080/00207540701325371.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Zeng, Rong, Liang Huang, Jianjun Li, Hongwei Li, Hui Zhu, and Xiaoting Zhang. "Quantification of multiple softening processes occurring during multi-stage thermoforming of high-strength steel." International Journal of Plasticity 120 (September 2019): 64–87. http://dx.doi.org/10.1016/j.ijplas.2019.04.010.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Imaizumi, Mitsuhiro, and Mitsutaka Kimura. "Multiple-Stage Policies for a Server System with Illegal Access." International Journal of Reliability, Quality and Safety Engineering 25, no. 04 (June 6, 2018): 1850016. http://dx.doi.org/10.1142/s021853931850016x.

Full text
Abstract:
As Internet technology has developed, demands for improving the reliability and security of systems connected to the Internet have increased. Although various services are performed on the Internet, illegal access has become a problem in recent years. This paper formulates stochastic models for a system subject to illegal access. The server has an intrusion detection (IDS) function, and illegal access is checked in multiple stages consisting of a simple check, a detailed check and a dynamic check. We apply the theory of Markov renewal processes to a system with illegal access, and derive the mean time and the expected number of checks until a server system becomes faulty. Further, optimal policies that minimize the expected cost are discussed. Finally, numerical examples are given.
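The staged-checking idea summarized above can be illustrated with a minimal sketch. This is not the paper's Markov renewal model; the detection probabilities, check times, and costs below are hypothetical, and the sketch only shows how expected time, expected cost, and the expected number of checks per access follow from a simple sequential-check assumption.

```python
# A minimal sketch (not the paper's actual model): expected cost of a
# three-stage access check (simple -> detailed -> dynamic), assuming each
# stage detects an illegal access independently with a fixed probability.
# All probabilities, times, and costs below are hypothetical.

def staged_check(stages):
    """stages: list of (detect_prob, check_time, check_cost).
    Returns (p_detect_overall, expected_time, expected_cost, expected_checks)."""
    p_miss = 1.0
    e_time = e_cost = e_checks = 0.0
    for p, t, c in stages:
        # This stage is reached only if all previous stages missed.
        e_time += p_miss * t
        e_cost += p_miss * c
        e_checks += p_miss
        p_miss *= (1.0 - p)
    return 1.0 - p_miss, e_time, e_cost, e_checks

if __name__ == "__main__":
    stages = [(0.60, 0.1, 1.0),   # simple check
              (0.85, 0.5, 3.0),   # detailed check
              (0.95, 2.0, 8.0)]   # dynamic check
    p_detect, e_time, e_cost, e_checks = staged_check(stages)
    print(f"overall detection probability: {p_detect:.4f}")
    print(f"expected time per access:      {e_time:.3f}")
    print(f"expected cost per access:      {e_cost:.3f}")
    print(f"expected number of checks:     {e_checks:.3f}")
    # If an undetected illegal access makes the server faulty, the mean
    # number of illegal accesses until failure is geometric: 1 / (1 - p_detect).
    print(f"mean accesses until fault:     {1.0 / (1.0 - p_detect):.1f}")
```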
APA, Harvard, Vancouver, ISO, and other styles
5

Barad, Miryam, and Daniel Braha. "Control Limits for Multi-Stage Manufacturing Processes with Binomial Yield (Single and Multiple Production Runs)." Journal of the Operational Research Society 47, no. 1 (January 1996): 98. http://dx.doi.org/10.2307/2584255.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Zhao, Chunhui, Fuli Wang, Ningyun Lu, and Mingxing Jia. "Stage-based soft-transition multiple PCA modeling and on-line monitoring strategy for batch processes." Journal of Process Control 17, no. 9 (October 2007): 728–41. http://dx.doi.org/10.1016/j.jprocont.2007.02.005.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Barad, Miryam, and Daniel Braha. "Control Limits for Multi-stage Manufacturing Processes with Binomial Yield (Single and Multiple Production Runs)." Journal of the Operational Research Society 47, no. 1 (January 1996): 98–112. http://dx.doi.org/10.1057/jors.1996.9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Khabirov, F. A., T. I. Khaybullin, E. V. Granatov, and S. R. Shakirzianova. "Effect of cerebrolysin on remyelination processes in multiple sclerosis patients in stage of relapse regression." Zhurnal nevrologii i psikhiatrii im. S.S. Korsakova 116, no. 12 (2016): 48. http://dx.doi.org/10.17116/jnevro201611612148-53.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Mukherjee, Indrajit, and Pradip Kumar Ray. "Optimal process design of two-stage multiple responses grinding processes using desirability functions and metaheuristic technique." Applied Soft Computing 8, no. 1 (January 2008): 402–21. http://dx.doi.org/10.1016/j.asoc.2007.02.005.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Kapur, Narinder, John Millar, Chris Colbourn, Pat Abbott, Philip Kennedy, and Tom Docherty. "Very Long-Term Amnesia in Association with Temporal Lobe Epilepsy: Evidence for Multiple-Stage Consolidation Processes." Brain and Cognition 35, no. 1 (October 1997): 58–70. http://dx.doi.org/10.1006/brcg.1997.0927.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Dissertations / Theses on the topic "Multiple-stage processes"

1

Yee, Kevin Wing Kan Chemical Sciences & Engineering Faculty of Engineering UNSW. "Operability analysis of a multiple-stage membrane process." Publisher: University of New South Wales. Chemical Sciences & Engineering, 2008. http://handle.unsw.edu.au/1959.4/41287.

Full text
Abstract:
Membrane processes have found increasing industrial applications worldwide. For membrane processes to deliver their desired performances and mitigate the effect of disturbances, automatic controllers must be installed. Before the installation of controllers, operability analysis is a crucial step to evaluate how well the processes can be controlled, and to determine how process design can be improved for better control. However, existing applications of operability analysis in membrane processes are limited. This thesis extends the application of operability analysis to a multiple-stage membrane process, exemplified by a detailed case study of a 12-stage industrial whey ultrafiltration (UF) process. Process dynamic models are determined to describe the transient behaviour of process performance caused by disturbances and long-term fouling. Steady-state nonlinear operability analysis is conducted to identify inherent limitations of the process. Using the process dynamic models, dynamic operability analysis is performed to determine the effects of dynamic behaviour on process and controller design. Steady-state operability analysis shows that the whey UF process is not able to mitigate the effects of high concentrations of true protein in the fresh whey feed. The ability of the process to mitigate the effects of disturbances is also adversely affected by long-term membrane fouling. Mid-run washing is therefore necessary to restore control performance after long periods of operation. Besides demonstrating the adverse effects of long-term membrane fouling on operability, dynamic operability analysis identifies the manipulated variables that can deliver the best control performance. It also indicates that control performance can be improved by installing equipment (e.g. buffer tanks) upstream of the process. Dynamic operability analysis shows that recycling of the retentate stream has a profound effect on the plant-wide dynamics and reduces significantly the achievable speed of process response under automatic control. However, retentate recycling is essential during operation to minimize membrane fouling. Although reducing the number of stages in the whey UF process can improve the achievable speed of process response under automatic control, process performance will fluctuate significantly from its desired level. A trade-off therefore exists between process performance and control performance that should be addressed during process and controller design.
APA, Harvard, Vancouver, ISO, and other styles
2

Kam, Kiew M. "Simulation and implementation of nonlinear control systems for mineral processes." Curtin University of Technology, School of Chemical Engineering, 2000. http://espace.library.curtin.edu.au:80/R/?func=dbin-jump-full&object_id=10063.

Full text
Abstract:
Differential geometric nonlinear control of a multiple stage evaporator system of the liquor burning facility associated with the Bayer process for alumina production at the Alcoa Wagerup alumina refinery, Western Australia, was investigated. Mathematical models for differential geometric analysis and nonlinear controller synthesis for the evaporator system were developed. Two models, structurally different from each other, were used in the thesis for simulation studies. A geometric nonlinear control structure, consisting of nonlinear state feedback control laws and multi-loop single-input single-output proportional-integral controllers, was designed for the industrial evaporator system. The superiority of the geometric nonlinear control structure for regulatory control of the evaporator system was successfully demonstrated through computer simulations and real-time simulator implementation. The implementation trial verified the practicality and feasibility of these types of controllers, resolved some practical issues of the geometric nonlinear control structure for industrial control applications, and established a closer link between academic nonlinear control theory and industrial control practice. A geometric nonlinear output feedback controller, consisting of the geometric nonlinear control structure and a reduced-order observer, was proposed for actual plant implementation on the evaporator system on-site. Its superior performance was verified through computer simulations, but its feasibility on the evaporator system on-site has yet to be investigated through either simulator implementation or actual plant implementation; this investigation was not performed because of the time constraints on the preparation of the thesis and the unavailability of the plant personnel required for the implementation. Robust nonlinear control structures that are simple and computationally efficient were proposed for enhancing the performance of geometric nonlinear controllers in the presence of plant/model mismatch and/or external disturbances. The robust nonlinear control structures are based on model error compensation methods. Robustness properties of the proposed robust nonlinear control structures on the evaporator system were investigated through computer simulations, and the results indicated improved performance over the implemented geometric nonlinear controller in terms of model uncertainty and disturbance reduction. A software package was developed in the MAPLE computing environment for the analysis of nonlinear processes and the design of geometric nonlinear controllers. This symbolic package is useful for obtaining fast and exact solutions for the analysis and design of nonlinear control systems. Procedures were also developed to simulate the geometric nonlinear control systems. It was found that MAPLE, while superior for analysis and design, is not viable for simulation of nonlinear control systems because of its limitations in physical (or virtual) memory management; the use of both symbolic and numeric computation for nonlinear control system analysis, design and simulation is therefore recommended. To sum up, geometric nonlinear controllers have been designed for an industrial multiple stage evaporator system, and their simplicity, practicality, feasibility and superiority for industrial control practice have been demonstrated through computer simulations or real-time implementation. It is hoped that the insights provided in this thesis will encourage more industry-based projects in nonlinear control, and thereby assist in closing the widening gap between academic nonlinear control theory and industrial control practice. Keywords: geometric nonlinear control, input-output linearization, multiple stage evaporator, robust geometric nonlinear control, control performance enhancement.
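The input-output linearization named in the keywords can be sketched on a toy system. The two-state plant below is made up for illustration and is not the thesis's evaporator model; the sketch only shows how the linearizing state feedback u = (v - Lf h)/(Lg h) is obtained symbolically for a relative-degree-one output, after which linear PI loops can be closed around the resulting integrator.

```python
# A minimal sketch (assumed toy system, not the thesis's evaporator model):
# input-output feedback linearization for a SISO affine nonlinear system
#   x_dot = f(x) + g(x) u,   y = h(x)
# with relative degree 1, using symbolic Lie derivatives.
import sympy as sp

x1, x2, u, v = sp.symbols("x1 x2 u v")
x = sp.Matrix([x1, x2])

# Hypothetical two-state plant (e.g., a single effect with level x1 and
# solute concentration x2); the coefficients are made up for illustration.
f = sp.Matrix([-0.5 * x1 + x2, -x2 + 0.2 * x1 * x2])
g = sp.Matrix([1, -0.1 * x2])
h = x1                      # controlled output

def lie(vec_field, scalar, states):
    """Lie derivative of a scalar field along a vector field."""
    grad = sp.Matrix([scalar]).jacobian(states)
    return (grad * vec_field)[0]

Lf_h = sp.simplify(lie(f, h, x))
Lg_h = sp.simplify(lie(g, h, x))
assert Lg_h != 0            # relative degree 1 for this toy system

# Linearizing state feedback: with u = (v - Lf_h)/Lg_h the closed loop
# satisfies y_dot = v, so an outer PI loop can be designed on a linear plant.
u_law = sp.simplify((v - Lf_h) / Lg_h)
print("Lf h =", Lf_h)
print("Lg h =", Lg_h)
print("u    =", u_law)
```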
APA, Harvard, Vancouver, ISO, and other styles
3

Monsalve, Carlos. "Representation of business processes at multiple levels of abstraction (strategic, tactical and operational) during the requirements elicitation stage of a software project, and the measurement of their functional size with ISO 19761." Mémoire, École de technologie supérieure, 2012. http://espace.etsmtl.ca/1098/1/MONSALVE_Carlos.pdf.

Full text
Abstract:
This thesis primarily aims to help and support software engineers and business analysts so that they can better model business processes when those models are intended for software requirements specification and for functional size measurement, with the sole purpose of enabling them to estimate any project correctly. The thesis itself pursues a specific goal: to contribute to the representation of business processes when they are used during the elicitation phase of software requirements. To reach this goal, two research objectives were clearly defined: 1. Propose a new modeling approach that produces business process models to be used in a software requirements elicitation activity. The modeling approach should not significantly increase the complexity of the graphical notations used to represent business processes, while still allowing the active participation of the different stakeholders involved in a typical software project so that they can represent their needs and constraints in a coherent and structured way. 2. Develop a procedure for measuring the functional size of a software application from business process models. This measurement procedure must comply with the COSMIC ISO 19761 standard and must be applicable regardless of the graphical notation used to represent the business processes. To meet the first objective, the thesis proposes a new modeling approach (named BPM+) that allows business processes to be modeled at three levels of abstraction: 1) the strategic level, 2) the tactical level, and 3) the operational level. An a priori version of BPM+ was designed on the basis of a literature review. This a priori version was then improved following an industrial case study, and further refined through ontological analyses of the full set of software requirements concepts and through surveys of relevant experts. Finally, a revised version of BPM+ was proposed and subsequently evaluated in a second case study. The final version of BPM+ is therefore grounded in several confirmations and pieces of evidence obtained from diverse sources. Regarding the second objective, the measurement procedure was developed from an analytical comparison between the COSMIC specifications and those of the graphical notations selected for this research (i.e., BPMN and Qualigram). This comparison made it possible to define a set of modeling guidelines for business application software, as well as a set of mapping rules between the concepts of the graphical notations and the COSMIC concepts. In addition, the modeling guidelines were adapted to real-time software. The measurement procedure was evaluated by comparing its results with those obtained in reference case studies. The results of this research demonstrate the following: 1. BPM+ generates business process models that represent, in a coherent and structured way, the needs of the different stakeholders involved; 2. The Qualigram notation is better suited to the design of BPM+; moreover, Qualigram is easier to use for stakeholders not involved in IT, whereas BPMN is easier for those who are; 3. The measurement procedure was successfully applied with two different graphical notations, Qualigram and BPMN, and with two different types of software, business application software and real-time software; 4. The accuracy of the measurement procedure complied with all the rules of the ISO/IEC 19761 standard.
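To make the COSMIC ISO 19761 measurement step concrete, here is a minimal sketch. The functional processes and data movements below are invented for illustration and are not taken from the thesis; the sketch only reflects the standard's core counting rule, in which every Entry, Exit, Read, or Write of a data group within a functional process contributes one COSMIC Function Point (CFP).

```python
# A minimal sketch (hypothetical process models, not the thesis's BPM+
# procedure): COSMIC ISO 19761 sizes software by counting data movements.
# Each Entry (E), Exit (X), Read (R), or Write (W) of a data group within
# a functional process contributes 1 COSMIC Function Point (CFP).
from collections import Counter

# Functional processes extracted from operational-level process models,
# with their identified data movements (all names are illustrative).
functional_processes = {
    "Register order":  ["E", "R", "W", "X"],
    "Check inventory": ["E", "R", "X"],
    "Issue invoice":   ["E", "R", "W", "W", "X"],
}

total = 0
for name, movements in functional_processes.items():
    size = len(movements)          # 1 CFP per data movement
    total += size
    print(f"{name:16s} {Counter(movements)}  ->  {size} CFP")
print(f"total functional size: {total} CFP")
```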
APA, Harvard, Vancouver, ISO, and other styles
4

Lisnati, Jayadi Ester, Najmus Sadat, and Hugo Richit. "Humanitarian Supply Chain: Improvement of Lead Time Effectiveness and Costs Efficiency : A multiple case study on the preparedness stage of humanitarian organizations with their partners." Thesis, Linnéuniversitetet, Institutionen för ekonomistyrning och logistik (ELO), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-96013.

Full text
Abstract:
Title: Humanitarian Supply Chain: Improvement of lead time effectiveness and cost-efficiency. A multiple case study on the preparedness stage of humanitarian organizations with their partners. Authors: Ester Lisnati Jayadi, Hugo Richit, Najmus Sadat. Background: 315 natural disaster events were reported, causing 11,804 deaths, affecting 68 million people, and costing US$131.7 billion in economic losses worldwide. These figures underscore the importance of humanitarian organizations (HOs) acting to reduce suffering and improve people's lives. However, greater donations and support to HOs do not by themselves solve this enormous issue, which forces HOs to pursue greater accountability by improving their effectiveness and efficiency in terms of time and cost in disaster activities, especially preparedness activities. No single actor, such as an HO, has sufficient resources to solve the disaster problem alone; HOs therefore need partners working hand in hand to relieve suffering. Performance measurement through integration, known as the performance management process, is the key to enabling HOs and their partners to achieve lead time effectiveness and cost-efficiency. Purpose: The purposes of the study are to explore which performance measurements are needed between humanitarian organizations and their partners, and to explore how to integrate their relationship to improve lead time effectiveness and cost-efficiency. Suggestions are then made on the basis of these purposes. Method: A multiple case study using qualitative data gathered through semi-structured interviews. Findings and Conclusions: RQ 1. What HSC (humanitarian supply chain) performance measurements are needed in the HSC's preparedness stage in order to achieve lead time effectiveness and cost efficiency? The performance measurements required in the HSC's preparedness stage to achieve lead time effectiveness and cost-efficiency are organizational procedures, learning and evaluation, the HO's mission, feedback, budgeting, fund management, sourcing, human and resource management, IT utilization, infrastructure utilization, human resources utilization, delivery time, knowledge management, information sharing, and employee management. RQ 2. How can the HSC performance management process be integrated in the HSC's preparedness stage to improve lead time effectiveness and cost efficiency? By implementing a proposed performance management process, aligning vision and mission, trusting each other, utilizing IT technologies, improving the language, and applying standardization in the HSC. Keywords: Humanitarian Supply Chain Management. Humanitarian Supply Chain. Preparedness Stage. Natural Disasters. Humanitarian Organizations. Partners. Dyads. Multiple Case Studies. Performance Measurements. Performance Management Process. Supply Chain Process Integration
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Multiple-stage processes"

1

Guo, Yong, and Claudia F. Lucchinetti. Taking a Microscopic Look at Multiple Sclerosis. Oxford University Press, 2016. http://dx.doi.org/10.1093/med/9780199341016.003.0005.

Full text
Abstract:
The pathology of multiple sclerosis is complex, extends beyond the white matter plaque, and is influenced by stage of demyelinating activity, clinical course, disease duration, and treatment. Technological advances in immunology, molecular biology, and "omic" biology have provided novel insights into the mechanisms for development of white matter plaques, axonal damage, cortical demyelination, and disease progression. Detailed, systematic, and statistically rigorous pathological studies on clinically well-characterized MS cohorts have helped define the heterogeneous pathological substrates of MS and unravel the complex molecular pathogenic mechanisms, with the ultimate goal of identifying targets for therapeutic interventions. It is increasingly clear that the use of human tissues is imperative to improve current diagnostic, prognostic, and therapeutic modalities. Preclinical animal models have been invaluable for discovery of key immune processes, basic disease mechanisms, and candidate immune targeting strategies, but the conclusions have yet to be reconciled with the essential features of the human disease.
APA, Harvard, Vancouver, ISO, and other styles
2

Raeff, Catherine. Exploring the Complexities of Human Action. Oxford University Press, 2020. http://dx.doi.org/10.1093/oso/9780190050436.001.0001.

Full text
Abstract:
Exploring the Complexities of Human Action offers a bold theoretical framework for thinking systematically and integratively about what people do as they go about their complex lives in all corners of the world. The book offers a vision of humanity that promotes empathic understanding of complex human beings that can bring people together to pursue common goals. Raeff sets the stage for conceptualizing human action by characterizing what people do in terms of the complexities of holism, dynamics, variability, and multicausality. She also constructively questions some conventional practices and assumptions in psychology (e.g., fragmenting, objectifying, aggregating, deterministic causality). The author then articulates a systems conceptualization of action that emphasizes multiple and interrelated processes. This integrative conceptualization holds that action is constituted by simultaneously occurring and interrelated individual, social, cultural, bodily, and environmental processes. Action is further conceptualized in terms of simultaneously occurring and interrelated psychological processes (e.g., sensing, perceiving, thinking, feeling, interacting, self/identity), as well as developmental processes. This theoretical framework is informed by research in varied cultures, and accessible examples are used to illustrate major concepts and claims. The book also discusses some implications and applications of the theoretical framework for investigating the complexities of human action. The book shows how the theoretical framework can be used to think about a wide range of action, from eating to art. Raeff uses the theoretical framework to consider varied vexing human issues, including mind–body connections, diversity, extremism, and freedom, as well as how action is simultaneously universal, culturally particular, and individualized.
APA, Harvard, Vancouver, ISO, and other styles
3

Newstok, Scott. Making ‘Music at the Editing Table’. Edited by James C. Bulman. Oxford University Press, 2017. http://dx.doi.org/10.1093/oxfordhb/9780199687169.013.2.

Full text
Abstract:
Orson Welles was as multifaceted as Shakespeare in drawing his material from manifold sources across multiple media. His 1952 film Othello strategically echoes Verdi and Boito’s 1887 opera Otello, and thereby vindicates his adaptation’s liberties by triangulating and transmediating his sources. Invoking Verdi also permitted Welles to contrast his own Shakespeare films with those of Laurence Olivier, whom Welles dismissed as merely a transcriber of stage versions. In contrast, Welles described his own editing practice as being more akin to musical composition. Attending to Welles’s recurrent annexation of opera offers a more suggestive account of his creative process and ultimate achievement.
APA, Harvard, Vancouver, ISO, and other styles
4

Tischer, Daniel, and John Hoffmire. Moving Towards 100% Employee Ownership Through ESOPs. Edited by Jonathan Michie, Joseph R. Blasi, and Carlo Borzaga. Oxford University Press, 2017. http://dx.doi.org/10.1093/oxfordhb/9780199684977.013.20.

Full text
Abstract:
The literature on Employee Stock Ownership Plans (ESOPs) has developed significantly over the past decades. Yet, despite ESOPs being well conceptualized, the deals struck in the real world are often more complex endeavours than suggested. While there are examples of ESOP deals as a one-stage process, it is often the case that ownership is transferred in multiple steps financed through subordinated debt. In addressing this added complexity, we will introduce concepts of ESOPs before providing a detailed description of what an add-on transaction entails. In doing so, we are particularly interested in describing key steps with a focus on the impact on business and employee-owners. The paper will provide readers with additional insights into the widely used practice of multi-tranche ESOPs. Understanding the agents involved in the process, as well as the impact and potential pitfalls of add-on transactions, is a crucial factor in developing ESOPs as an alternative to external buy-outs.
APA, Harvard, Vancouver, ISO, and other styles
5

Boyd, Brian. Making Adaptation Studies Adaptive. Edited by Thomas Leitch. Oxford University Press, 2017. http://dx.doi.org/10.1093/oxfordhb/9780199331000.013.34.

Full text
Abstract:
An evolutionary (or “adaptationist”) perspective on adaptation studies offers ways past the “fidelity discourse” that has long vexed adaptation scholars. Biological adaptation forgoes exact fidelity to solve the new problems posed by inevitably changing environments, in a process that is fertile as well as faithful. Artistic adaptation also looks two ways, toward retention or fidelity and toward innovation or fertility. The complex and multiple adaptations and hybridizations of art and nature, of page, stage, screen, and painting in Nabokov’s 1969 novel Ada suggest that the more exactly you know your world, or the world of art, the more you can transform them as you wish. Charlie Kaufman’s 2002 screenplay Adaptation. resembles Ada not only in spotlighting orchids but also in being meta-adaptational, addressing, like Ada, both fidelity within adaptation and the creative fertility to be found in building on prior design but moving beyond fidelity.
APA, Harvard, Vancouver, ISO, and other styles
6

Bass, Cristina, Barbara Bauce, and Gaetano Thiene. Arrhythmogenic right ventricular cardiomyopathy: diagnosis. Oxford University Press, 2018. http://dx.doi.org/10.1093/med/9780198784906.003.0360.

Full text
Abstract:
Arrhythmogenic cardiomyopathy is a heart muscle disease clinically characterized by life-threatening ventricular arrhythmias and pathologically by an acquired and progressive dystrophy of the ventricular myocardium with fibrofatty replacement. The clinical manifestations of arrhythmogenic cardiomyopathy vary according to the ‘phenotypic’ stage of the underlying disease process. Since there is no ‘gold standard’ to reach the diagnosis of arrhythmogenic cardiomyopathy, multiple categories of diagnostic information have been combined. Different diagnostic categories include right ventricular morphofunctional abnormalities (by echocardiography and/or angiography and/or cardiovascular magnetic resonance imaging), histopathological features on endomyocardial biopsy, electrocardiogram, arrhythmias, and family history, including genetics. The diagnostic criteria were revised in 2010 to improve diagnostic sensitivity, but with the important prerequisite of maintaining diagnostic specificity. Quantitative parameters have been put forward and abnormalities are defined based on the comparison with normal subject data. A definite diagnosis of arrhythmogenic cardiomyopathy is achieved when two major, or one major and two minor, or four minor criteria from different categories are met. The main differential diagnoses are idiopathic right ventricular outflow tract tachycardia, myocarditis, sarcoidosis, dilated cardiomyopathy, right ventricular infarction, congenital heart diseases with right ventricular overload, and athlete’s heart. Among diagnostic tools, contrast-enhanced cardiovascular magnetic resonance is playing a major role in detecting subepicardial-midmural left ventricular free wall involvement, even preceding morphofunctional abnormalities. Moreover, electroanatomical mapping is an invasive tool able to detect early right ventricular free wall involvement in terms of low-voltage areas. Both techniques are increasingly used in the diagnostic work-up although are not yet part of diagnostic criteria.
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Multiple-stage processes"

1

Ogawa, Rei. "Mechanobiology of Cutaneous Scarring." In Textbook on Scar Management, 11–18. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-44766-3_2.

Full text
Abstract:
The last phase of cutaneous wound healing produces the scar. Under normal circumstances, the immature scar then undergoes the scar maturation process over several months. This process involves tissue remodeling, which is associated with a natural decrease in inflammation and in the numbers of blood vessels, collagen fibers, and fibroblasts. However, sometimes the scar maturation process is not properly engaged because inflammation continues in the scar. Consequently, the immature scar stage is prolonged. This results in the pathological scars called hypertrophic scars and keloids. Many factors that prolong the inflammatory stage have been identified. However, multiple lines of evidence acquired in recent years suggest that mechanical force can be an important cause of pathological scar development.
APA, Harvard, Vancouver, ISO, and other styles
2

Chen, Chun-Yen, Teng-Wen Chang, and Chi-Fu Hsiao. "Developing a Digital Interactive Fabrication Process in Co-existing Environment." In Proceedings of the 2020 DigitalFUTURES, 27–35. Singapore: Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-33-4400-6_3.

Full text
Abstract:
In the prototype practice stage, makers mainly work by themselves, but they need to test and adapt to find the correct fabrication method when no clear fabrication description is available. Rapid prototyping is therefore very important in the maker's prototype practice. "Design-Fabrication-Assembly" (DFA) is an integrated prototyping process that helps designers create kinetic skins by following a holistic process. However, DFA lacks a medium for communication between design, fabrication and assembly status. This paper proposes a solution called the co-existing Fabrication System (CoFabs), which combines multiple sensory components and visualization feedback. We combine mixed reality (MR) and the concept of the digital twin (DT): a device that uses a virtual interface to control a physical mechanism for fabrication and assembly. By integrating the virtual and the physical, CoFabs allows designers to use different methods of observation to prototype more rigorously and to interactively correct design decisions in real time.
APA, Harvard, Vancouver, ISO, and other styles
3

Niedt, Greg. "A Tale of Three Villages: Contested Discourses of Place-Making in Central Philadelphia." In The Life and Afterlife of Gay Neighborhoods, 159–80. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-66073-4_7.

Full text
Abstract:
As the acceptance of queer identities has proceeded in fits and starts over the last few decades, the question has been raised, is it still necessary to have dedicated queer spaces? City dwellers often reason that with supposed improvements in safety and social mixing, the "gay ghettos" that form a transitional stage in neighborhood revitalization should now become common areas. Yet the capitalist logic that drives this thinking often trades the physical threat of exclusion or violence for an existential one, jeopardizing a distinctive culture that remains valuable in the self-realization process of local queer citizens. This is visible not only in changing demographics, but also in the production of discourse across multiple levels; language and semiotics help to constitute neighborhoods, but also to conceptualize them. This chapter examines how public signs and artifacts reify and sustain three competing narratives of a single central Philadelphia neighborhood in flux: the traditionally queer "Gayborhood" that developed shortly after World War II, the officially designated "Washington Square West," and the realtor-coined, recently gentrifying "Midtown Village." I argue that the naming and describing of these spaces, and how their associated discourses are reflected by their contents, continues to play a role in the ongoing struggle for queer acceptance. Combining observational data of multimodal public texts (storefronts, flyers, street signs, etc.) and critical discourse analysis within the linguistic/semiotic landscapes paradigm, I present a critique of the presumed inevitability of queer erasure here. This is supplemented with a comparison of grassroots, bottom-up, and official, top-down documents in various media (maps, brochures, websites, social media, etc.) that perpetuate the different discourses. Ultimately, a change in urban scenery and how a neighborhood is envisioned only masks the fact that spaces of queer expression, marked by their eroding distinctiveness rather than their deviance, are still needed.
APA, Harvard, Vancouver, ISO, and other styles
4

Travin, Sergey Olegovich. "Kinetics and Mechanism of Ecochemical Processes." In Handbook of Research on Emerging Developments and Environmental Impacts of Ecological Chemistry, 109–36. IGI Global, 2020. http://dx.doi.org/10.4018/978-1-7998-1241-8.ch005.

Full text
Abstract:
Significant efforts of mankind and huge funds have been spent to study the mechanisms of environmental processes. Recent decades have been marked by exponential growth in computer power and the accompanying decrease in the cost of computing. With regard to the mathematical modeling of the physical and chemical processes that determine the quality of natural waters, the atmosphere, and soil, this has led to the development of an extensive approach based on an increase in the number of components, and reactions between them, taken into consideration. In this chapter, the authors describe features of ecochemical systems and discuss the factors that complicate their prediction. Using the method of numerical experiment, they investigate the behavior of periodic systems with multiple stationary states. One conclusion is that the actual manifestation cannot be used to determine at what stage the impact occurred or to what stage of the food chain it relates. Another conclusion is that systems involving multiple stationary states are prone to bifurcations and chaotic jumps from one limit cycle to another.
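The chapter's point about multiple stationary states can be illustrated with a toy numerical experiment. The rate law below is a generic bistable model, not a real ecochemical reaction scheme, and the threshold value is hypothetical; the sketch simply shows two trajectories of identical kinetics settling into different stationary states depending only on the initial condition.

```python
# A minimal sketch (a toy bistable kinetic model, not a real ecochemical
# scheme): the same rate law settles into different stationary states
# depending only on the initial condition, illustrating why the observed
# state alone cannot identify where along the chain a perturbation acted.
import numpy as np
from scipy.integrate import solve_ivp

a = 0.4                       # hypothetical threshold between the two basins

def rate(t, x):
    # dx/dt = x (1 - x) (x - a): stable states at x = 0 and x = 1,
    # unstable stationary state at x = a.
    return x * (1.0 - x) * (x - a)

for x0 in (0.35, 0.45):       # just below / just above the threshold
    sol = solve_ivp(rate, (0.0, 200.0), [x0], rtol=1e-8)
    print(f"x(0) = {x0:.2f}  ->  x(t_end) = {sol.y[0, -1]:.4f}")
# Expected behaviour: the first trajectory decays toward 0, the second
# approaches 1, even though the kinetics are identical.
```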
APA, Harvard, Vancouver, ISO, and other styles
5

Thompson, Kate, and Lina Markauskaite. "Identifying Group Processes and Affect in Learners." In Cases on the Assessment of Scenario and Game-Based Virtual Worlds in Higher Education, 175–210. IGI Global, 2014. http://dx.doi.org/10.4018/978-1-4666-4470-0.ch006.

Full text
Abstract:
In the last five years, the analytical techniques for identifying the processes of online learning have developed to the point where applications for the assessment of learning can be discussed. This would be most appropriate for twenty-first century skills—such as collaboration, decision-making, and teamwork skills—which are the core learning outcomes in immersive learning environments. The state of the art in this field is still at the stage of discovering patterns of the processes of learning, identifying stages, and suggesting their meaning. However, it is already important to consider what technologies can offer and what information teachers need in order to evaluate students' situated performance and to provide useful feedback. This chapter describes an imagined virtual world, one that affords the range of twenty-first century skills, in order to illustrate types of analyses that could be conducted on learning process data. Such analytical methods could provide both descriptive information about the performance of learners and depict structures and patterns of their learning processes. The future assessment of learning in immersive virtual worlds may draw on data about deep embodied processes and multiple senses that usually underpin professional skills, such as affect, visual perception, and movement. This type of assessment could also provide deeper insights into many psychological processes in collaborative learning, decision-making, and problem-solving in virtual worlds, such as motivation, self-efficacy, and engagement. Overall, the view of the assessment presented in this chapter extends beyond the formal learning outcomes that are usually required by tertiary education quality and standards agencies and assessed in traditional courses in higher education to include a range of new capacities that may not be required but are essential for successful performance in contemporary workplaces.
APA, Harvard, Vancouver, ISO, and other styles
6

Thompson, Kate, and Lina Markauskaite. "Identifying Group Processes and Affect in Learners." In Gamification, 1479–505. IGI Global, 2015. http://dx.doi.org/10.4018/978-1-4666-8200-9.ch075.

Full text
Abstract:
In the last five years, the analytical techniques for identifying the processes of online learning have developed to the point where applications for the assessment of learning can be discussed. This would be most appropriate for twenty-first century skills—such as collaboration, decision-making, and teamwork skills—which are the core learning outcomes in immersive learning environments. The state of the art in this field is still at the stage of discovering patterns of the processes of learning, identifying stages, and suggesting their meaning. However, it is already important to consider what technologies can offer and what information teachers need in order to evaluate students' situated performance and to provide useful feedback. This chapter describes an imagined virtual world, one that affords the range of twenty-first century skills, in order to illustrate types of analyses that could be conducted on learning process data. Such analytical methods could provide both descriptive information about the performance of learners and depict structures and patterns of their learning processes. The future assessment of learning in immersive virtual worlds may draw on data about deep embodied processes and multiple senses that usually underpin professional skills, such as affect, visual perception, and movement. This type of assessment could also provide deeper insights into many psychological processes in collaborative learning, decision-making, and problem-solving in virtual worlds, such as motivation, self-efficacy, and engagement. Overall, the view of the assessment presented in this chapter extends beyond the formal learning outcomes that are usually required by tertiary education quality and standards agencies and assessed in traditional courses in higher education to include a range of new capacities that may not be required but are essential for successful performance in contemporary workplaces.
APA, Harvard, Vancouver, ISO, and other styles
7

Senthilkumar, V., Velmurugan C., K. R. Balasubramanian, and M. Kumaran. "Additive Manufacturing of Multi-Material and Composite Parts." In Advances in Civil and Industrial Engineering, 127–46. IGI Global, 2020. http://dx.doi.org/10.4018/978-1-7998-4054-1.ch007.

Full text
Abstract:
Additive manufacturing (AM) technology can be employed to produce multimaterial parts. In this approach, multiple types of materials are used for the fabrication of a single part. Custom-built functionally graded, heterogeneous, or porous structures and composite materials can be fabricated through this process. In this method, metals, plastics, and ceramics have been used with suitable AM methods to obtain multi-material products depending on functional requirements. The process of making composite materials by AM can either be performed during the material deposition process or by a hybrid process in which the combination of different materials is performed before or after AM as a preceding or subsequent stage of production of a component. Composite processes can be employed to produce functionally graded materials (FGMs).
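Functionally graded parts are often specified with a simple grading rule; the sketch below uses the common power-law volume-fraction profile V_A(z) = (z/h)^n as an illustration. The build height, exponent, and layer sampling are hypothetical and not tied to any specific AM process from the chapter.

```python
# A minimal sketch (standard power-law grading rule, hypothetical numbers):
# volume fraction of material A through the build height of a functionally
# graded part, V_A(z) = (z / h)^n.
import numpy as np

h = 10.0                      # hypothetical build height, mm
n = 2.0                       # grading exponent (n = 1 gives a linear grade)
z = np.linspace(0.0, h, 6)    # deposition heights sampled for illustration

v_a = (z / h) ** n            # volume fraction of material A per layer
v_b = 1.0 - v_a               # remainder is material B
for zi, va, vb in zip(z, v_a, v_b):
    print(f"z = {zi:5.1f} mm   V_A = {va:.2f}   V_B = {vb:.2f}")
```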
APA, Harvard, Vancouver, ISO, and other styles
8

Hernandez, Alexander A. "Green IT Adoption Practices in Education Sector." In Waste Management, 1379–95. IGI Global, 2020. http://dx.doi.org/10.4018/978-1-7998-1210-4.ch063.

Full text
Abstract:
Green IT is the resource-efficient and effective use of information technology to reduce the environmental impact of an organization's processes. This article aims to explore the Green IT practices of higher education institutions in the Philippines through a qualitative multiple-case study. The study found that Green IT adoption in higher education institutions covers the use of paperless and digital archiving systems, resource-efficient IT equipment, responsible electronic waste disposal, recycling and reuse, and awareness programs initiated to educate employees about Green IT and sustainability. The study also found that these practices are in their early stage of adoption in higher education institutions in the Philippines. This article also presents practical and research implications to further the uptake of Green IT in higher education institutions.
APA, Harvard, Vancouver, ISO, and other styles
9

Pagani, Margherita. "The Critical Role of Digital Rights Management Processes in the Context of the Digital Media Management Value Chain." In Information Security and Ethics, 3499–509. IGI Global, 2008. http://dx.doi.org/10.4018/978-1-59904-937-3.ch235.

Full text
Abstract:
This paper sets out to analyze the impact generated by the adoption of Digital Rights Management (DRM) processes on the typical Digital Media Management Value Chain activities, and to analyze these processes in the context of the business model. Given the early stage of theory development in the field of DRM, the study follows the logic of grounded theory (Glaser and Strauss, 1967) by building the research on a multiple-case study methodology (Eisenhardt, 1989). The companies selected are successful players that have adopted DRM processes: Endemol, Digital Island, Adobe Systems, Intertrust, and the Motion Picture Association. In this paper we provide in-depth longitudinal data on these five players to show how companies implement DRM processes. Twelve DRM solution vendors are also analyzed in order to compare the strategies adopted. After giving a definition of Intellectual Property and Digital Rights Management (section 1), the paper provides a description of the typical Digital Media Management Value Chain activities and the players involved along the different phases examined (section 2). An in-depth description of Digital Rights Management processes is given in section 3. Digital Rights Management processes are considered in the context of the business model and are divided into content processes, finance processes, and rights management processes. We conclude with a discussion of the model and the main benefits generated by the integration of digital rights management, and propose the most interesting directions for future research (section 4).
APA, Harvard, Vancouver, ISO, and other styles
10

Saha, Pallab. "E-Business Process Management and IT Governance." In Electronic Business, 1843–52. IGI Global, 2009. http://dx.doi.org/10.4018/978-1-60566-056-1.ch114.

Full text
Abstract:
E-business process management (e-BPM) entails management of e-business processes with the customer initiating the process, and involves non-linear processes with a strong focus on value networks leveraging collaboration and alliances, rather than just business processes within the confines of the organization (Kim & Ramkaran, 2004). E-BPM requires organizations to take a process approach to managing their e-business processes (Smith & Fingar, 2003). The advent of business process reengineering (BPR) (Davenport, 1993; Hammer & Champy, 1993) resulted in numerous organizations initiating BPR programs. While BPR aims to enhance an organization's process capability by adopting engineering discipline, e-BPM goes a step further and targets improving the organizational process management capability (Smith & Fingar, 2004). Organizations target end-to-end business processes that deliver maximum customer value through e-BPM (Smith & Fingar, 2003). However, by their very nature, end-to-end business processes more often than not span multiple enterprises, incorporating their individual value chains (Porter, 1985; Smith & Fingar, 2003; Smith, Neal, Ferrara, & Hayden, 2002), and involve e-business processes (Kim & Ramkaran, 2004). Integrating fragments of processes across multiple functions and organizations involves not only shared activities and tasks among business and trading partners, but also the capability to integrate disparate IT systems (Kalakota & Robinson, 2003). Effective management of e-business processes depends to a great extent on the enabling information technologies. In fact, Smith and Fingar stated in 2003 that BPM is about technology. Porter's value chain is about the end-to-end business processes needed to get from a customer order to the delivery of the final product or service (Porter, 1985). The pervasive use of technology has created a critical dependency on IT that demands a specific focus on governance of IT (Grembergen, 2004). Explicitly or implicitly, organizations specify business activities as business processes, and, without realizing it, these tend to be e-business processes. However, given the current business conditions and a clear understanding by organizations of the complexities of their e-business processes, management of e-business processes is taking center stage (Smith et al., 2002). In the current business scenario, where e-business processes, along with information, are considered key organizational assets and management of business processes a strategic capability (Kalakota & Robinson, 2003), it is imperative that organizations clearly delineate the need for relevant and pertinent information, as it provides visibility and transparency. Additionally, IT governance being the single most important predictor of the business value of IT (Weill & Ross, 2004) drives the need to analyze and understand the implications of e-BPM on IT governance. The key objective of this article is to investigate the implications of e-BPM on IT governance through the analysis of available literature. In particular, the article argues that a direct influence of e-BPM on IT governance performance is inevitable. While the importance of both effective e-BPM and IT governance is intuitively clear, there is currently little research on the elements of IT governance that are enabled by e-BPM. More importantly, there is a lack of a theoretical framework that could be used for such analysis. To address this shortcoming, the article also presents an analysis framework.
The analysis framework is particularly useful as it incorporates elements from prevalent IT governance frameworks. Using the analysis framework, the article then examines the implications of e-BPM on IT governance and develops research propositions. The aim of developing the propositions is to enable further investigation and research thereby contributing to IT management theory.
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Multiple-stage processes"

1

Qi, Yongsheng, Pu Wang, Xiuzhe Chen, and Xunjin Gao. "A Novel Stage-Based Multiple PCA Montoring Approach for Batch Processes." In 2010 International Conference on Computational and Information Sciences (ICCIS). IEEE, 2010. http://dx.doi.org/10.1109/iccis.2010.18.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Zhao, Chunhui, Fuli Wang, Mingxing Jia, and Yuqing Chang. "Stage-based Multiple PCA Modeling and On-line Monitoring Strategy for Batch Processes." In 2006 6th World Congress on Intelligent Control and Automation. IEEE, 2006. http://dx.doi.org/10.1109/wcica.2006.1714189.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Sirin, Göknur, Torgeir Welo, Bernard Yannou, and Eric Landel. "Value Creation in Collaborative Analysis Model Development Processes." In ASME 2014 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2014. http://dx.doi.org/10.1115/detc2014-34696.

Full text
Abstract:
Integration and coordination of engineering analysis models is a vast development field in the context of complex product development. Engineers' siloed way of working, combined with a lack of efficiency in the current model development process, may cause inconsistencies at model interfaces, human errors, miscommunication between teams and misinterpretations. In lean terms, this may create multiple wastes, including waiting, overproduction leading to excess inventory, unnecessary processing and, perhaps most harmful, defects (e.g., incorrect models) with rework consequences. Hence, product manufacturing companies must establish effective processes to add value throughout the multidisciplinary distributed modeling environment. The goal of this paper is to propose a pull-control model development process, providing model architecture integration and coherent control in the early design phase. The paper also proposes an appropriate reuse strategy, which allows plug-and-play modular product models managed through a single-source-of-authority concept. A pull-control development process helps prevent potential rework arising from inconsistencies related to definitions, know-how and stakeholder communication at an early stage of the design process. The proposed black-box model reuse strategy also helps reduce human-related errors such as lack of domain knowledge or experience and misinterpretations. The proposed method is used to identify and visualize potential improvements in terms of increased model transparency and reuse when transforming from the present to the suggested future modeling strategy. The research has been conducted by synthesizing findings from a literature review with observations and analysis of current analysis model development practices within the automotive OEM Renault in France.
APA, Harvard, Vancouver, ISO, and other styles
4

Suriano, Saumuy, Hui Wang, and S. Jack Hu. "Monitoring Multistage Surface Spatial Variations Using Functional Morphing." In ASME 2013 International Manufacturing Science and Engineering Conference collocated with the 41st North American Manufacturing Research Conference. American Society of Mechanical Engineers, 2013. http://dx.doi.org/10.1115/msec2013-1203.

Full text
Abstract:
In multistage manufacturing processes, the machined surface shape of a part changes as it goes through each stage. Process monitoring at multiple stages is necessary for root cause diagnosis and surface variation reduction. However, due to measurement time and capacity constraints, it is challenging to collect sufficient surface measurements at all intermediate stages for monitoring. This paper proposes a functional morphing based algorithm to monitor the surface variation propagation using end of line multi-resolution measurements supplemented with low resolution measurements at intermediate stages. The surface changes over multiple stages are captured by a functional morphing model which integrates geometric transformations with engineering insights. The model estimates a morphed surface prediction at an intermediate stage of interest using end-of-line surface measurements. This morphed surface is combined with the low-resolution measurements at that stage to improve the surface prediction accuracy. The model can be further improved by incorporating the effects of correlated process variables. Based on the model, abnormal surface variations can be detected and located by a single-linkage cluster monitoring algorithm as developed in our previous work. The case study of a two-stage machining process demonstrates that the method successfully monitors multistage surfaces using reduced measurement resolution at intermediate stages.
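As a rough illustration of the fusion step described above (and not the paper's functional morphing model), the sketch below corrects a dense surface prediction with a few sparse intermediate-stage measurements by interpolating the residuals observed at the measured points. All surfaces, gauge locations, and the prediction bias are synthetic.

```python
# A minimal sketch (not the paper's functional morphing model): fuse a
# dense surface prediction "morphed" from end-of-line measurements with a
# few low-resolution measurements taken at an intermediate stage, by
# interpolating the residuals at the measured points and correcting the
# prediction. All quantities below are synthetic.
import numpy as np

x = np.linspace(0.0, 100.0, 201)             # positions along the surface, mm
true_surface = 0.02 * np.sin(0.1 * x)        # unknown intermediate-stage surface
morphed_pred = true_surface + 0.005          # prediction with a systematic bias

x_meas = np.array([10.0, 40.0, 70.0, 95.0])  # sparse low-resolution gauge points
z_meas = np.interp(x_meas, x, true_surface)  # what the gauges would report

residual = z_meas - np.interp(x_meas, x, morphed_pred)
correction = np.interp(x, x_meas, residual)  # linear interpolation of residuals
fused = morphed_pred + correction

rmse = lambda e: float(np.sqrt(np.mean(e ** 2)))
print(f"RMSE of morphed prediction: {rmse(morphed_pred - true_surface):.4f}")
print(f"RMSE after fusion:          {rmse(fused - true_surface):.4f}")
```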
APA, Harvard, Vancouver, ISO, and other styles
5

Nagel, Robert L., Robert B. Stone, and Daniel A. McAdams. "A Theory for the Development of Conceptual Functional Models for Automation of Manual Processes." In ASME 2007 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2007. http://dx.doi.org/10.1115/detc2007-35620.

Full text
Abstract:
Conceptual design is a vital stage in the development of any product, and its importance only increases with the complexity of a design. Functional modeling with the Functional Basis provides a framework for the conceptual design of electromechanical products. This framework is just as applicable to the conceptual design of automated solutions where an engineered product with components spanning multiple engineering domains is designed to replace or aid a human and his or her tools in a human-centric process. This paper presents research toward the simplification of the generation of conceptual functional models for automation solutions. The presented methodology involves the creation of functional and process models to fully explore existing human operated tasks for potential automation. Generated functional and process models are strategically combined to create a new conceptual functional model for an automation solution to potentially automate the human-centric task. The presented methodology is applied to the generation of a functional model for a conceptual automation solution. Then conceptual automation solutions generated through the presented methodology are compared to existing automation solutions to demonstrate the effectiveness of the presented methodology.
APA, Harvard, Vancouver, ISO, and other styles
6

Sharma, Sidharath, Martyn L. Jupp, Ambrose K. Nickson, and John M. Allport. "Ported Shroud Flow Processes and Their Effect on Turbocharger Compressor Operation." In ASME Turbo Expo 2017: Turbomachinery Technical Conference and Exposition. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/gt2017-63678.

Full text
Abstract:
The ported shroud (PS) self-recirculating casing treatment is widely used to delay the onset of the surge by enhancing the aerodynamic stability of the turbocharger compressor. The increase in the stable operation region of the turbocharger compressor is achieved by recirculating the low momentum fluid that blocks the blade passage to the compressor inlet through a ported shroud cavity. While the ported shroud design delays surge, it comes with a small penalty in efficiency. This work presents an investigation of the flow processes associated with a ported shroud compressor and quantifies the effect of these flow mechanisms on the compressor operation. The full compressor stage is numerically modelled using a Reynolds Averaged Navier-Stokes (RANS) approach employing the shear stress transport (SST) turbulence model for steady state simulations at the design and near surge conditions. The wheel rotation is modelled using a multiple reference frame (MRF) approach. The results show that the flow exits the PS cavity at the near surge condition in the form of three jet-like structures of varying velocity amplitudes. Net entropy generation in the compressor model is used to assess the influence of the ported shroud design on the compressor losses, and the results indicate a small Inlet-PS mixing region is the primary source of entropy generation in the near surge conditions. The analysis also explores the trends of entropy generation at the design and the near surge condition across the different speed lines. The results show that the primary source of entropy generation is the impeller region for the design condition and the inlet-PS cavity region for the near surge condition.
APA, Harvard, Vancouver, ISO, and other styles
7

Srinivasan, Anand, and Chuck Impastato. "Application of Integral Geared Compressors in the Process Gas Industry." In ASME Turbo Expo 2013: Turbine Technical Conference and Exposition. American Society of Mechanical Engineers, 2013. http://dx.doi.org/10.1115/gt2013-95870.

Full text
Abstract:
The use of integral geared compressors (IGCs) is becoming increasingly popular in the process gas industry. IGCs offer certain unique advantages over inline-type machines, making them ideally suited for real gas compression. Multiple rotor speeds can be achieved effortlessly by virtue of the gear ratios within each unit, resulting in an aerodynamically efficient design for the stages. The ability to cool the gas after every stage of compression using intercoolers benefits the thermodynamic efficiency. Multiple processes can be combined into one unit, reducing overall costs and real estate requirements. In this paper, the technical challenges associated with designing, building, and testing these units are presented. Case studies of applications of IGCs for real gas compression are also presented.
APA, Harvard, Vancouver, ISO, and other styles
8

Gischner, B., P. Lazo, K. Richard, and R. Wood. "Enhancing Interoperability Throughout the Design & Manufacturing Process." In SNAME Maritime Convention. SNAME, 2005. http://dx.doi.org/10.5957/smc-2005-p21.

Full text
Abstract:
As part of the NSRP program, various tools and standards have been developed to enable the efficient exchange of product models during the design process. In particular, the ISE Project has developed and demonstrated the capability for successful transfer of structural, piping, and HVAC product models that could be used in detail design. During 2005, NSRP awarded a project for ISE Interoperability Modules (known as ISE4). This project will expand the testing and implementation of ISE tools to support both early-stage design and manufacturing efforts. An International Standard (ISO 10303-215: Application Protocol for Ship Arrangements) will be used to exchange product model information during early-stage design. Another task in this project, entitled "Steel Processing," will focus on using ISE tools and the STEP Standard to define a shipbuilding product model format that will support the requirements of multiple, disparate manufacturing processes. The ISE4 Project also includes tasks to enhance interoperability by providing exchange capabilities for Engineering Analysis and Electrical data. This paper and presentation will show how these new ISE tools will facilitate the exchange of ship product models to support interoperability from early-stage design (using Ship Arrangements) through manufacture.
APA, Harvard, Vancouver, ISO, and other styles
9

Campbell, Michael M. "Software Tools to Support Advanced Design Techniques and Processes." In ASME 2008 9th Biennial Conference on Engineering Systems Design and Analysis. ASMEDC, 2008. http://dx.doi.org/10.1115/esda2008-59063.

Full text
Abstract:
Collaboration between engineering and manufacturing can significantly reduce product costs and increase product quality. The definition, capture, and re-use of standard design features, together with their associated proven manufacturing processes, in the design stage can significantly reduce manufacturing cost and time to market. Today, 3D models are becoming the central repository for more and more of the critical information needed throughout the product development process. Significant process improvements are possible when organizations embrace a model-centric design approach, in which not only geometry and attributes are captured in the 3D CAD model but also other relevant downstream data such as GD&T, 3D annotations, and now even manufacturing process information. The strategies for actually machining and producing designs are important assets for companies. Existing manufacturing process knowledge can now be captured by the manufacturing engineer using XML-based templates, and, through the use of new CAD technology, this knowledge can be attached to design features. The design feature geometry and attributes (along with the embedded process knowledge) can then be made available to the broader organization through catalogs of company-standard design features such as holes, pockets, steps, grooves, and flanges. During engineering activities, as the design model evolves, the design engineer is able to re-use these standard features, creating a 3D model that includes not only the geometric definition of the product but also the validated, proven process by which that geometry can best be produced. Downstream, once the design is handed off to manufacturing, the manufacturing or process engineer has access to tools that allow him or her to extract the process information from the 3D model and define rules to automate the creation of the machining process plan for this model. Specific fixtures required for the different steps of the process can be easily developed using the in-process 3D model, which is generated automatically based on stock removal. Multiple scenarios, based on varying machining resources, production quantities, and cycle times, can be analyzed, allowing the process engineer to develop an optimized process plan. This model-centric approach, which leverages product and process data re-use, improves product quality and reduces manufacturing process planning and production time. Typical savings are realized in tool design, increased production throughput, and improved process quality from using validated processes prior to production.
APA, Harvard, Vancouver, ISO, and other styles
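The Campbell abstract above describes capturing proven machining knowledge in XML-based templates that are attached to standard design features and re-used downstream. The paper does not publish a schema, so the following is only a minimal, hypothetical sketch of what such a feature template and its retrieval might look like; every element, attribute, tool name, and value in it is invented for illustration and does not come from the cited work or from any specific CAD system.

```python
# Hypothetical sketch: an XML process-knowledge template for a standard
# design feature, plus a small parser that recovers the embedded machining
# plan. Schema and values are invented for illustration only.
import xml.etree.ElementTree as ET

TEMPLATE = """
<designFeature name="counterbored_hole" type="hole">
  <geometry diameter_mm="10.0" depth_mm="25.0"/>
  <processPlan validated="true">
    <step order="1" operation="center_drill" tool="CD-3" cycle_time_s="6"/>
    <step order="2" operation="drill" tool="TWIST-10.0" cycle_time_s="18"/>
    <step order="3" operation="counterbore" tool="CB-16" cycle_time_s="12"/>
  </processPlan>
</designFeature>
"""

def load_feature(xml_text: str) -> dict:
    """Parse a feature template and return its embedded, ordered process plan."""
    root = ET.fromstring(xml_text)
    steps = [
        {
            "order": int(step.get("order")),
            "operation": step.get("operation"),
            "tool": step.get("tool"),
            "cycle_time_s": float(step.get("cycle_time_s")),
        }
        for step in root.find("processPlan").findall("step")
    ]
    return {
        "feature": root.get("name"),
        "steps": sorted(steps, key=lambda s: s["order"]),
    }

if __name__ == "__main__":
    plan = load_feature(TEMPLATE)
    total = sum(s["cycle_time_s"] for s in plan["steps"])
    print(f"{plan['feature']}: {len(plan['steps'])} steps, {total:.0f} s total cycle time")
```

A downstream process-planning tool could read such templates from a feature catalog and aggregate the validated steps into a machining plan, which is the product-and-process data re-use pattern the abstract emphasizes.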
10

Fu, J. Sophia, Zhenghui Sha, Yun Huang, Mingxian Wang, Yan Fu, and Wei Chen. "Two-Stage Modeling of Customer Choice Preferences in Engineering Design Using Bipartite Network Analysis." In ASME 2017 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/detc2017-68099.

Full text
Abstract:
Customers' choice decisions often involve two stages, during which customers first use noncompensatory rules to form a consideration set and then make the final choice through careful compensatory tradeoffs. In this work, we propose a two-stage, network-based modeling approach to study customers' consideration and choice behaviors in a separate but integrated manner. The first stage models customer preferences in forming a consideration set of multiple alternatives, and the second stage models customers' choice preferences given individuals' consideration sets. Specifically, bipartite exponential random graph (ERG) models are used in both stages to capture customers' interdependent choices. For comparison, we also model customers' choice decisions when consideration set information is not available. Using data from the 2013 China auto market, our results suggest that exogenous attributes (i.e., car attributes, customer demographics, and perceived satisfaction ratings) and the endogenous network structural factor (i.e., vehicle popularity) significantly influence customers' decisions. Moreover, our results highlight the differences between customer preferences in the consideration stage and the purchase stage. To the authors' knowledge, this is the first attempt to develop a two-stage, network-based approach that analytically models customers' consideration and purchase decisions, respectively. Second, this work further demonstrates the benefits of the network approach over traditional logistic regressions for modeling customer preferences. In particular, network approaches are effective for modeling the inherent interdependencies underlying customers' decision-making processes. The insights drawn from this study have general implications for choice modeling in engineering design.
APA, Harvard, Vancouver, ISO, and other styles
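The Fu et al. abstract above represents consideration and purchase decisions as bipartite customer-product networks, with vehicle popularity acting as an endogenous structural effect. Purely as an illustration of that data structure, the sketch below builds a toy bipartite consideration network with NetworkX and computes product popularity as node degree; the customer and car names are invented, and actual bipartite ERGM estimation (the statistical model used in the paper) requires specialized software and is not attempted here.

```python
# Minimal sketch of the data structure behind a two-stage, network-based
# choice model: a bipartite customer-product graph whose edges mark which
# products each customer considered, plus product "popularity" (degree) as
# an endogenous network covariate. Toy data, for illustration only.
import networkx as nx
from networkx.algorithms import bipartite

# Stage 1: consideration sets (customer -> products considered).
consideration = {
    "cust_1": ["car_A", "car_B"],
    "cust_2": ["car_B", "car_C"],
    "cust_3": ["car_A", "car_B", "car_C"],
}
# Stage 2: final purchase, conditional on each consideration set.
purchase = {"cust_1": "car_B", "cust_2": "car_C", "cust_3": "car_B"}

G = nx.Graph()
customers = list(consideration)
products = sorted({p for ps in consideration.values() for p in ps})
G.add_nodes_from(customers, bipartite=0)
G.add_nodes_from(products, bipartite=1)
G.add_edges_from((c, p) for c, ps in consideration.items() for p in ps)

assert bipartite.is_bipartite(G)

# Endogenous covariate: how many customers considered each product.
popularity = {p: G.degree(p) for p in products}

# Simple descriptive check: was the purchased car also a widely considered one?
for cust, chosen in purchase.items():
    most_considered = max(consideration[cust], key=popularity.get)
    print(f"{cust}: chose {chosen}; most-considered alternative was {most_considered}")
```

In the two-stage view, the first stage would model which consideration edges form, and the second stage would model the purchase decision conditional on each customer's considered set, which is the separation the abstract describes.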

Reports on the topic "Multiple-stage processes"

1

Tarko, Andrew P., Mario Romero, Cristhian Lizarazo, and Paul Pineda. Statistical Analysis of Safety Improvements and Integration into Project Design Process. Purdue University, 2020. http://dx.doi.org/10.5703/1288284317121.

Full text
Abstract:
RoadHAT is a tool developed by the Center for Road Safety and implemented in the INDOT safety management practice to help identify both safety needs and relevant road improvements. This study has modified the tool to facilitate a quick and convenient comparison of various design alternatives in the preliminary design stage for scoping small and medium safety-improvement projects. The modified RoadHAT 4D incorporates a statistical estimation of Crash Reduction Factors based on a before-and-after analysis of multiple treated and control sites, with an EB correction for the regression-to-the-mean effect. The new version also includes updated Safety Performance Functions, revised average crash costs, and a comprehensive table of Crash Modification Factors, all updated to reflect current Indiana conditions. The documentation includes updated Guidelines for Roadway Safety Improvements. The improved tool will be rolled out through a sequence of workshops for the final end users, preceded by a beta-testing phase involving a small group of INDOT engineers.
APA, Harvard, Vancouver, ISO, and other styles
