Academic literature on the topic 'Input-output specification model'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Input-output specification model.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Input-output specification model"

1

Alem, Habtamu. "Effects of model specification, short-run, and long-run inefficiency: an empirical analysis of stochastic frontier models." Agricultural Economics (Zemědělská ekonomika) 64, no. 11 (November 26, 2018): 508–16. http://dx.doi.org/10.17221/341/2017-agricecon.

Abstract:
This paper examines recent advances in stochastic frontier (SF) models and their implications for the performance of Norwegian crop-producing farms. In contrast to previous studies, we used a cost function in a multiple input-output framework to estimate both long-run (persistent) and short-run (transient) inefficiency. The empirical analysis is based on unbalanced farm-level panel data for 1991–2013 with 3,885 observations from 455 Norwegian farms specialising in crop production. We estimated seven SF panel data models, grouped into four categories according to the assumptions made about the nature of inefficiency. The estimated cost efficiency scores varied from 53% to 95%, showing that the results are sensitive to how the inefficiency is modeled and interpreted.
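For orientation, the long-run/short-run decomposition the abstract refers to is usually written as a four-component error structure. A generic cost-frontier form (a sketch of the common Kumbhakar-style specification, not necessarily the exact model estimated in the paper) is

\ln C_{it} = f(y_{it}, w_{it}; \beta) + \mu_i + \eta_i + v_{it} + u_{it},

where $\mu_i$ captures unobserved firm heterogeneity, $\eta_i \ge 0$ is persistent (long-run) inefficiency, $u_{it} \ge 0$ is transient (short-run) inefficiency, and $v_{it}$ is random noise.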
2

Xia, Xue, Yan Ru Zhong, Yu Chu Qin, and Liu Jing Ji. "Research on Operational Model of New-Generation GPS Based on Dynamic Description Logic." Applied Mechanics and Materials 128-129 (October 2011): 702–5. http://dx.doi.org/10.4028/www.scientific.net/amm.128-129.702.

Abstract:
Operation and Operator are key technologies in the new-generation geometrical product specification and verification (GPS). In order to solve the geometrical specification problems of product functional requirements and the ambiguity problems of product specification, this paper utilizes a new method based on dynamic description logic to describe the fundamental concepts of geometrical specification. It analyzes the geometrical features of geometrical product functional specification. By establishing the model of the operations, describing the input and output parameters in the specification and verification process, and analyzing the preconditions and results of the executions of the specification operator and the verification operator, the paper simplifies the operation results and overcomes the shortcomings of ambiguity and inconsistency in the product specification process. Finally, it takes the specification process of perpendicularity as an example to demonstrate the feasibility and validity of the method.
3

Mo, Yuchang, and Xinmin Yang. "A New Approach to Verify Statechart Specifications for Reactive Systems." International Journal of Software Engineering and Knowledge Engineering 18, no. 06 (September 2008): 785–802. http://dx.doi.org/10.1142/s0218194008003908.

Abstract:
Application domains such as communication networks and embedded controllers for telephony, automobiles, trains and avionics systems require very high quality reactive systems, so an important phase in the development of reactive systems is the verification of their conceptual models before implementation. Normally, in the requirement analysis phase, a system property can be represented as an input and output labeled transition system (IOLTS), a transition system labeled by the inputs and outputs exchanged between the system and the environment. This paper proposes an approach to verify Statechart specifications for reactive systems against IOLTS property specifications. We develop a suitable semantics model, observable semantics: an abstract semantics model that describes only externally observable behavior and suffers less from the state explosion problem by reducing infinite or large state spaces to small ones. We then propose two methods to verify the conformance between a given IOLTS property specification and a Statechart specification: two-phase verification and on-the-fly verification. Compared with two-phase verification, on-the-fly verification needs less storage and computation time, especially when the target Statechart specification is very large or likely to have many errors.
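To make the conformance question concrete, the following toy sketch (illustrative only; it is not the paper's algorithm and assumes a deterministic property specification) represents an IOLTS as a dictionary and searches on the fly for a counterexample, i.e., a reachable output that the property forbids:

from collections import deque

def iolts_counterexample(impl, spec, impl_init, spec_init):
    """impl/spec map a state to a list of (label, next_state); labels are
    strings such as '?req' (input) or '!ack' (output)."""
    seen = {(impl_init, spec_init)}
    queue = deque([(impl_init, spec_init, [])])
    while queue:
        si, ss, trace = queue.popleft()
        spec_moves = dict(spec.get(ss, []))          # deterministic spec assumed
        for label, ni in impl.get(si, []):
            if label.startswith('!') and label not in spec_moves:
                return trace + [label]               # forbidden output found
            if label in spec_moves:
                pair = (ni, spec_moves[label])
                if pair not in seen:
                    seen.add(pair)
                    queue.append((ni, spec_moves[label], trace + [label]))
    return None                                      # no counterexample: conforms

impl = {0: [('?req', 1)], 1: [('!ack', 0), ('!nack', 0)]}
spec = {0: [('?req', 1)], 1: [('!ack', 0)]}
print(iolts_counterexample(impl, spec, 0, 0))        # ['?req', '!nack']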
4

Többen, Johannes Reinhard, Martin Distelkamp, Britta Stöver, Saskia Reuschel, Lara Ahmann, and Christian Lutz. "Global Land Use Impacts of Bioeconomy: An Econometric Input–Output Approach." Sustainability 14, no. 4 (February 9, 2022): 1976. http://dx.doi.org/10.3390/su14041976.

Abstract:
Many countries have set ambitious targets for the development of a bioeconomy that not only ensures sufficient production of high-quality foods but also contributes to decarbonization, green jobs and reducing import dependency through biofuels and advanced biomaterials. However, feeding a growing and increasingly affluent world population and providing additional biomass for a future bioeconomy, all within planetary boundaries, constitute an enormous challenge for achieving the Sustainable Development Goals (SDG). Global economic models mapping the complex network of global supply, such as multiregional input–output (MRIO) or computable general equilibrium (CGE) models, have been the workhorses for monitoring the past as well as possible future impacts of the bioeconomy. These approaches, however, have often been criticized for their relatively low amount of detail on agriculture and energy, or for their lack of an empirical base for the specification of agents' economic behavior. In this paper, we address these issues and present a hybrid macro-econometric model that combines a comprehensive mapping of the world economy with highly detailed submodules of agriculture and the energy sector in physical units based on FAO and IEA data. We showcase the model in a case study on the future global impacts of the EU's bioeconomy transformation and find small positive economic impacts at the cost of a considerable increase in land use, mostly outside of Europe.
5

Deman, S. "Stability of Supply Coefficients and Consistency of Supply-Driven and Demand-Driven Input—Output Models." Environment and Planning A: Economy and Space 20, no. 6 (June 1988): 811–16. http://dx.doi.org/10.1068/a200811.

Abstract:
The motivation behind this paper is to review the debate on inconsistencies in and growing concern over the validity of a supply-driven model as an alternative to the Leontief demand-driven model for interindustry analysis. Both the Leontief demand-driven and the Ghoshian alternative supply-constrained allocation specification are critically discussed. Various claims and counterclaims have been addressed in the literature available on this issue. A general condition is derived for consistency between the two approaches, and a theorem is stated in which greater confidence is provided in the use of the supply-driven model for interindustry analysis.
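For readers new to the two specifications, the following numerical sketch (with invented numbers for a two-sector economy, not data from the paper) contrasts the demand-driven Leontief model with the supply-driven Ghosh allocation model and confirms that both reproduce the accounts at the base point:

import numpy as np

Z = np.array([[20., 30.],     # interindustry flows: Z[i, j] = sales of sector i to j
              [40., 10.]])
x = np.array([100., 120.])    # gross output by sector

# Demand-driven (Leontief): technical coefficients a_ij = z_ij / x_j
A = Z / x                     # divides each column j by x[j]
f = x - Z.sum(axis=1)         # final demand implied by the accounts
x_leontief = np.linalg.solve(np.eye(2) - A, f)

# Supply-driven (Ghosh): allocation coefficients b_ij = z_ij / x_i
B = Z / x[:, None]            # divides each row i by x[i]
v = x - Z.sum(axis=0)         # primary inputs (value added) implied
x_ghosh = np.linalg.solve(np.eye(2) - B.T, v)

print(x_leontief, x_ghosh)    # both recover x = [100, 120] at the base point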
6

Tezak, Nikolas, Armand Niederberger, Dmitri S. Pavlichin, Gopal Sarma, and Hideo Mabuchi. "Specification of photonic circuits using quantum hardware description language." Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 370, no. 1979 (November 28, 2012): 5270–90. http://dx.doi.org/10.1098/rsta.2011.0526.

Abstract:
Following the simple observation that the interconnection of a set of quantum optical input–output devices can be specified using structural mode VHSIC hardware description language, we demonstrate a computer-aided schematic capture workflow for modelling and simulating multi-component photonic circuits. We describe an algorithm for parsing circuit descriptions to derive quantum equations of motion, illustrate our approach using simple examples based on linear and cavity-nonlinear optical components, and demonstrate a computational approach to hierarchical model reduction.
7

Ören, Tuncer. "Coupling concepts for simulation: A systematic and comprehensive view and advantages with declarative models." International Journal of Modeling, Simulation, and Scientific Computing 05, no. 02 (February 25, 2014): 1430001. http://dx.doi.org/10.1142/s1793962314300015.

Abstract:
A brief review of the importance of simulation-based engineering and science (including the social sciences) is followed by a historic perspective on model-based simulation. Section 2 is on declarative modeling of component systems as well as its advantages for self-documentation and for computer-aided checks and coupling. As an example of declarative modeling, the General System Theory implementor (GEST) is given. In Sec. 3, basic concepts for the coupling of component models and rules for computer-assisted coupling specification are explained. Section 4 is devoted to possible computerized checks in couplings of declarative models, such as: (1) automatic unit checking to avoid meaningless input/output matching at the time of coupling specification, (2) automatic threshold checking to provide warnings and/or to avoid disasters, and (3) automatic unit conversion for convenience in using library models; a sketch of checks (1) and (3) follows this abstract. Section 5 is about several layers of nested couplings for modeling systems of systems. In Sec. 6, two types of variable couplings are discussed: (1) couplings with variable connections (to allow input/output relations of models to depend on time or state conditions) and (2) couplings with variable component models (to allow component (or coupled) models to be switched based on time or state conditions). Section 7 is on the use of multimodels as component models in couplings. Section 8 is on types of inputs and their use in couplings, as well as on external inputs to simulation studies. In Sec. 9, conclusions and future work for complex systems are outlined; in particular, the value of simulation systems engineering, as well as the understanding and avoidance of misunderstanding in cognitive and emotive simulations, is stressed. Appendix A is a list of almost 50 types of couplings and Appendix B lists over 50 terms related to couplings in modeling and simulation. To show the richness of the "input" concept, which is important in the specification of input/output relations of component models, Appendix C lists almost 150 types of inputs. The information shared in this article may be useful in developing advanced modeling and simulation software, tools and environments.
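A toy illustration of checks (1) and (3) above (the port names and the conversion table are invented for the example):

UNIT_SCALE = {('km', 'm'): 1000.0, ('m', 'km'): 0.001}   # convertible unit pairs

class Port:
    def __init__(self, name, unit):
        self.name, self.unit = name, unit

def couple(out_port, in_port):
    """Return the out->in conversion factor, or raise on a unit mismatch."""
    if out_port.unit == in_port.unit:
        return 1.0
    factor = UNIT_SCALE.get((out_port.unit, in_port.unit))
    if factor is None:    # check (1): reject meaningless input/output matching
        raise ValueError(f"meaningless coupling: {out_port.name} [{out_port.unit}] "
                         f"-> {in_port.name} [{in_port.unit}]")
    return factor         # check (3): automatic unit conversion for library models

print(couple(Port('distance_out', 'km'), Port('distance_in', 'm')))   # 1000.0
try:
    couple(Port('speed_out', 'm/s'), Port('torque_in', 'N*m'))
except ValueError as err:
    print(err)            # meaningless coupling: speed_out [m/s] -> torque_in [N*m]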
8

Gnatenko, Anton Romanovich, and Vladimir Anatolyevich Zakharov. "On the Satisfiability and Model Checking for one Parameterized Extension of Linear-time Temporal Logic." Modeling and Analysis of Information Systems 28, no. 4 (December 18, 2021): 356–71. http://dx.doi.org/10.18255/1818-1015-2021-4-356-371.

Abstract:
Sequential reactive systems are computer programs or hardware devices which process flows of input data or control signals and output streams of instructions or responses. When designing such systems, one needs formal specification languages capable of expressing the relationships between the input and output flows. Previously, we introduced a family of such specification languages based on the temporal logics $LTL$, $CTL$ and $CTL^*$ combined with regular languages. A characteristic feature of these new extensions of conventional temporal logics is that temporal operators and basic predicates are parameterized by regular languages. In our early papers, we estimated the expressive power of the new temporal logic $Reg$-$LTL$ and introduced a model checking algorithm for $Reg$-$LTL$, $Reg$-$CTL$, and $Reg$-$CTL^*$. The main issue which still remained unclear is the complexity of decision problems for these logics. In this paper, we give a complete solution to the satisfiability checking and model checking problems for $Reg$-$LTL$ and prove that both problems are PSPACE-complete. The computational hardness of the problems under consideration is easily proved by reducing to them the intersection emptiness problem for families of regular languages. The main result of the paper is an algorithm for reducing the satisfiability checking of $Reg$-$LTL$ formulas to the emptiness problem for Büchi automata of relatively small size, and a description of a technique that allows one to check the emptiness of the obtained automata within space polynomial in the size of the input formulas.
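The final step rests on a classical fact: a Büchi automaton accepts some word iff an accepting state is reachable from the initial state and lies on a cycle. A minimal sketch of that emptiness check (illustrative, not the paper's optimized construction):

def reachable(delta, start):
    seen, stack = set(), [start]
    while stack:
        state = stack.pop()
        for _, nxt in delta.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen   # states reachable via at least one transition

def buchi_nonempty(delta, init, accepting):
    from_init = reachable(delta, init) | {init}
    return any(q in from_init and q in reachable(delta, q) for q in accepting)

delta = {0: [('a', 1)], 1: [('b', 1)]}    # state -> [(label, next_state)]
print(buchi_nonempty(delta, 0, {1}))      # True: the word a b^omega is accepted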
9

Polson, Rudolph A., and C. Richard Shumway. "Structure of South Central Agricultural Production." Journal of Agricultural and Applied Economics 22, no. 2 (December 1990): 153–63. http://dx.doi.org/10.1017/s1074070800001905.

Abstract:
Using a dual economic specification of a multiproduct technology, the structure of agricultural production was tested for five South Central states (Texas, Oklahoma, Arkansas, Mississippi, and Louisiana). A comprehensive set of output supplies and input demands comprised the estimation equations in each state. Evidence of nonjoint production in a subset of commodities was detected in four of the five states. Several commodities also satisfied sufficient conditions for consistent aggregation. However, the specific outputs satisfying each structural property varied by state. Sufficient conditions for consistent geographic aggregation across the states were not satisfied. These results provide empirical guidance and important cautions for legitimately simplifying state-level model specifications of southern agricultural production.
10

Hucka, Michael, Frank T. Bergmann, Andreas Dräger, Stefan Hoops, Sarah M. Keating, Nicolas Le Novère, Chris J. Myers, et al. "Systems Biology Markup Language (SBML) Level 2 Version 5: Structures and Facilities for Model Definitions." Journal of Integrative Bioinformatics 12, no. 2 (June 1, 2015): 731–901. http://dx.doi.org/10.1515/jib-2015-271.

Abstract:
Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/.
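For a feel of the declarative form, here is a minimal hand-written SBML Level 2 document (a sketch only; the Version 5 namespace URI is assumed by analogy with earlier versions, and a real model would carry units, kinetic laws, and the other constructs the specification defines):

import xml.etree.ElementTree as ET

SBML_DOC = """<?xml version="1.0"?>
<sbml xmlns="http://www.sbml.org/sbml/level2/version5" level="2" version="5">
  <model id="decay">
    <listOfCompartments>
      <compartment id="cell" size="1"/>
    </listOfCompartments>
    <listOfSpecies>
      <species id="X" compartment="cell" initialAmount="10"/>
    </listOfSpecies>
    <listOfReactions>
      <reaction id="degradation">
        <listOfReactants>
          <speciesReference species="X"/>
        </listOfReactants>
      </reaction>
    </listOfReactions>
  </model>
</sbml>"""

root = ET.fromstring(SBML_DOC)    # any XML-aware tool reads the same model,
print(root[0].attrib["id"])       # which is the point of the exchange format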

Book chapters on the topic "Input-output specification model"

1

Hughes, Jack, and Dominic Orchard. "Resourceful Program Synthesis from Graded Linear Types." In Logic-Based Program Synthesis and Transformation, 151–70. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-68446-4_8.

Abstract:
Linear types provide a way to constrain programs by specifying that some values must be used exactly once. Recent work on graded modal types augments and refines this notion, enabling fine-grained, quantitative specification of data use in programs. The information provided by graded modal types appears to be useful for type-directed program synthesis, where these additional constraints can be used to prune the search space of candidate programs. We explore one of the major implementation challenges of a synthesis algorithm in this setting: how does the synthesis algorithm efficiently ensure that resource constraints are satisfied throughout program generation? We provide two solutions to this resource management problem, adapting Hodas and Miller's input-output model of linear context management to a graded modal linear type theory. We evaluate the performance of both approaches via their implementation as a program synthesis tool for the programming language Granule, which provides linear and graded modal typing.
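The key idea of the input-output model mentioned above is that the synthesizer receives the whole context as input and returns the unused part as output, so the context never has to be split by guessing. A much-simplified sketch (plain linear types only; Granule's actual algorithm also tracks grades):

def synth_var(goal_type, ctx):
    """Synthesize a variable of goal_type from linear context ctx.
    Returns (term, leftover_ctx) or None."""
    for i, (name, ty) in enumerate(ctx):
        if ty == goal_type:
            return name, ctx[:i] + ctx[i + 1:]   # output context lacks the used variable
    return None

def synth_pair(t1, t2, ctx):
    """Synthesize a pair: thread the leftover context of the first
    component into the second instead of splitting ctx up front."""
    first = synth_var(t1, ctx)
    if first is None:
        return None
    a, ctx1 = first
    second = synth_var(t2, ctx1)
    if second is None:
        return None
    b, ctx2 = second
    return f"({a}, {b})", ctx2

print(synth_pair("Int", "Bool", [("x", "Int"), ("y", "Bool")]))   # ('(x, y)', [])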
2

Kohno, Hirotada, and Yoshiro Higano. "Optimal Allocation of the Public Funds to the Transportation Infrastructures Using the Interregional Input–Output Programming Model (Part II): Specification with Ten Regions, Ten Industries, and Nine Transport Modes." In New Frontiers in Regional Science: Asian Perspectives, 321–71. Tokyo: Springer Japan, 2022. http://dx.doi.org/10.1007/978-4-431-55221-5_6.

3

Kohno, Hirotada, and Yoshiro Higano. "Optimum Allocation of the Capital Funds to the Transportation Infrastructures Using the Interregional Input–Output Programming Model (Part I): Specification with Five Regions, Five Industries, and Three Transport Modes." In New Frontiers in Regional Science: Asian Perspectives, 137–311. Tokyo: Springer Japan, 2022. http://dx.doi.org/10.1007/978-4-431-55221-5_4.

4

R J, Vijaya Saraswathi, Sukambika S, Wilcy Theresa F, Ishwarya M, and Srinithi T. "Modeling and Design of a Controller for a Dehumidifier." In Intelligent Systems and Computer Technology. IOS Press, 2020. http://dx.doi.org/10.3233/apc200129.

Abstract:
In many industries, such as pharmaceutics, plastics, food and confectionery, tobacco, and cold storage, maintaining the humidity at the desired level is very important. The aim of this paper is to model the process and to design a controller for a dehumidifier to control the humidity at the desired set point. The air is humidified using a humidity chamber. This humid air, which is the primary air, is mixed with secondary air from the compressor and sent to the dehumidifier. The dehumidifier used here is a centrifugal separator. The humidity of the dehumidified air is measured using a sensor, and the output signal of the sensor is compared to the set point. If the measured output exceeds the set point, the flow rate of the secondary air to the dehumidifier is varied accordingly until it reaches the desired set point. The model of the process is identified from the response of the process to the input signals. The controller is designed for the model obtained, and the performance of the controller is compared based on the time-domain specifications.
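The control scheme described reduces to a simple feedback loop: measure humidity, compare with the set point, adjust the secondary air flow. A toy closed-loop sketch (the first-order plant stand-in and the PI gains are invented for illustration, not taken from the paper):

SETPOINT = 40.0            # desired relative humidity, %
KP, KI = 0.8, 0.2          # PI gains (assumed for illustration)

humidity, integral = 70.0, 0.0
for _ in range(200):
    error = humidity - SETPOINT                  # positive when air is too humid
    integral += error
    flow = max(0.0, KP * error + KI * integral)  # secondary air flow rate
    # crude first-order stand-in for the dehumidifier dynamics:
    humidity += 0.1 * (70.0 - humidity) - 0.05 * flow

print(round(humidity, 1))   # converges to the 40 % set point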
5

Goldin, Dina, David Keil, and Peter Wegner. "An Interactive Viewpoint on the Role of UML." In Unified Modeling Language, 250–64. IGI Global, 2001. http://dx.doi.org/10.4018/978-1-930708-05-1.ch015.

Abstract:
The role of the Unified Modeling Language (UML) is to model interactive systems, whose behaviors emerge from the interaction of their components with each other and with the environment. Unlike traditional (algorithmic) computation, interactive computation involves infinite and dynamic (late binding) input/output streams. Tools and models limited to an algorithmic paradigm do not suffice to express and manage the behavior of today’s interactive systems, which are capable of self-reconfiguring and adapting to their environment. Whereas procedural languages may express precise designs of closed processes, UML’s objective is to provide support for the analysis and specification of increasingly complex and inherently open systems. Interactive systems require dynamic models where interaction has first-class status, and where the environment is modeled explicitly, as a set of actors whose roles constrain the input patterns through use cases. UML’s interaction-based approach to system modeling fits well with the encapsulation-based, object-oriented approach to implementation. By coupling these approaches, the software engineering process can promise to provide a more complete solution to system design and implementation, leading the way for widespread adoption of networked and embedded intelligent agent technology. A theoretical framework for modeling interactive computing can strengthen the foundations of UML and guide its evolution.
6

Hvannberg, Ebba Thóra, Sigrún Gunnarsdóttir, and Gyda Atladóttir. "From User Inquiries to Specification." In Encyclopedia of Human Computer Interaction, 220–26. IGI Global, 2006. http://dx.doi.org/10.4018/978-1-59140-562-7.ch035.

Abstract:
The aim of this article is to describe a method that helps analysts to translate qualitative data gathered in the field, collected for the purpose of requirements specification, to a model usable for software engineers. Requirements specification constitutes three different parts: functional requirements, quality requirements, and nonfunctional requirements. The first one specifies how the software system should function, who are the actors, and what are the input and output of the functions. The second one specifies what quality requirements the software should meet while operating in context of its environment such as reliability, usability, efficiency, portability, and maintainability. Finally, the third part specifies other requirements including context of use and development constraints. Examples of context of use are where and when the system is used, and examples of development constraints are human resources, cost and time constraints, technological platforms, and development methods. The role of the requirements specification is to give software engineers a basis for software design, and, later in the software development life cycle, to validate the software system. Requirements specification can also serve the purpose of validating users’ or customers’ view of the requirements of the system. On one hand, there has been a growing trend towards analyzing needs of the user and abilities through participatory design (Kuhn & Muller, 1993), activity theory (Bertelsen & Bødker, 2003), contextual design (Beyer & Holtzblatt, 1998), user-centered design (Gulliksen, Göransson, & Lif, 2001), and co-design and observation as in ethnography (Hughes, O’Brien, Rodden, Rouncefield, & Sommerville, 1995). Common to these methods is that qualitative data (Taylor & Bogdan, 1998) is collected, to understand the future environment of the new system, by analyzing the work or the tasks, their frequency and criticality, the cognitive abilities of the user and the users’ collaborators. The scope of the information collected varies depending on the problem, and sometimes it is necessary to gather data about the regulatory, social, and organizational contexts of the problem (Jackson, 1995) to be solved. The temporal and spatial contexts describe when the work should be carried out and where. On the other hand, software engineers have specified requirements in several different modelling languages that range from semiformal to formal. Examples of semiformal languages are UML (Larman, 2002), SADT, and IDEF (Ross, 1985). Examples of the latter are Z (Potter, Sinclair, & Till, 1996), VDM or ASM. Those are modelling languages for software development, but some languages or methods focus on task or work modelling such as Concurrent Task Trees (Paterno, 2003) and Cognitive Work Analysis (Vicente, 1999). Others emphasize more the specification method than a modelling language. Examples of the former are Scenario-Based Design (SBD) (Rosson & Carroll, 2002) and Contextual Design (Beyer & Holtzblatt, 1998). In practice, many software developers use informal methods to express requirements in text, e.g., as narrations or stories. Agile Development Methods (Abrahamsson, Warsta, Siponen, & Ronkainen, 2003) emphasize this approach and thereby are consistent with their aim of de-emphasizing methods, processes, and languages in favor of getting things to work. A popular approach to requirements elicitation is developing prototypes. 
There has been less emphasis on bridging the gap between the above two efforts, for example, to deliver a method that gives practical guidelines on how to produce specifications from qualitative data (Hertzum, 2003). One reason for this gap can be that people from different disciplines work on the two aspects, for example, domain analysts or experts in HCI, and software engineers who read and use requirements specification as a basis for design. Another reason for this gap may be in the difference in the methods that the two sides have employed, that is, soft methods (i.e., informal) for elicitation and analysis and hard methods (i.e., formal) for specification. In this article, we suggest a method to translate qualitative data to requirements specification that we have applied in the development of a Smart Space for Learning. The process borrows ideas from or uses scenarios, interviews, feature-based development, soft systems methodology, claims analysis, phenomenology, and UML.
7

"Input/output of simulation and specification of models." In Simulation in the Design of Digital Electronic Systems, 96–110. Cambridge University Press, 1993. http://dx.doi.org/10.1017/cbo9781139170376.006.

8

Balsters, Herman. "Merging and Outsourcing Information Systems with UML." In IT Outsourcing, 2188–210. IGI Global, 2010. http://dx.doi.org/10.4018/978-1-60566-770-6.ch138.

Abstract:
Businesses can change their business structure by merging with other companies or, on the other end of the spectrum, by smoothly outsourcing some of their business processes to other more specialized parties. In this paper we will concentrate on conceptual modelling of merging and outsourcing information systems. Merging of a collection of information systems will be defined as the construction of a global information system that contains exactly the functionality of the original collection of systems. Such global information systems are called federated information systems, when we wish to address the situation where the component systems are so-called legacy systems; i.e. systems that are given beforehand and which are to interoperate in an integrated single framework in which the legacy systems are to maintain as much as possible their respective autonomy. Two major problems in constructing federated information systems concern achieving and maintaining consistency and a uniform representation of the data on the global level of the federation. The process of creation of uniform representations of data is known as data extraction, whereas data reconciliation is concerned with resolving data inconsistencies. Outsourcing of an information system, on the other hand, will be defined as the handing over of part of the functionality of the original system to an outside party (the supplier). Such functionality typically involves one or more operations, where each operation satisfies certain input- and output requirements. These requirements will be defined in terms of the ruling service level agreements (SLAs). We will provide a formal means to ensure that the outsourcing relationship between outsourcing party and supplier, determined by a SLA, satisfies specific correctness criteria. Formal specifications as offered in this paper can prove their value in the setup and evaluation of outsourcing contracts. We shall describe a uniform semantic framework for specification of both federated and outsourced information systems based on the UML/OCL data model. In particular, we will show that we can represent so-called exact views in UML/OCL, providing the means to capture the duality relation between federating and outsourcing.
9

Balsters, Herman. "Merging and Outsourcing Information Systems with UML." In Global Information Technologies, 1021–43. IGI Global, 2008. http://dx.doi.org/10.4018/978-1-59904-939-7.ch078.

Abstract:
Businesses can change their business structure by merging with other companies or, on the other end of the spectrum, by smoothly outsourcing some of their business processes to other more specialized parties. In this paper we will concentrate on conceptual modelling of merging and outsourcing information systems. Merging of a collection of information systems will be defined as the construction of a global information system that contains exactly the functionality of the original collection of systems. Such global information systems are called federated information systems, when we wish to address the situation where the component systems are so-called legacy systems; i.e. systems that are given beforehand and which are to interoperate in an integrated single framework in which the legacy systems are to maintain as much as possible their respective autonomy. Two major problems in constructing federated information systems concern achieving and maintaining consistency and a uniform representation of the data on the global level of the federation. The process of creation of uniform representations of data is known as data extraction, whereas data reconciliation is concerned with resolving data inconsistencies. Outsourcing of an information system, on the other hand, will be defined as the handing over of part of the functionality of the original system to an outside party (the supplier). Such functionality typically involves one or more operations, where each operation satisfies certain input- and output requirements. These requirements will be defined in terms of the ruling service level agreements (SLAs). We will provide a formal means to ensure that the outsourcing relationship between outsourcing party and supplier, determined by a SLA, satisfies specific correctness criteria. Formal specifications as offered in this paper can prove their value in the setup and evaluation of outsourcing contracts. We shall describe a uniform semantic framework for specification of both federated and outsourced information systems based on the UML/OCL data model. In particular, we will show that we can represent so-called exact views in UML/OCL, providing the means to capture the duality relation between federating and outsourcing.
10

Floudas, Christodoulos A. "Introduction." In Nonlinear and Mixed-Integer Optimization. Oxford University Press, 1995. http://dx.doi.org/10.1093/oso/9780195100563.003.0004.

Abstract:
This chapter introduces the reader to elementary concepts of modeling, generic formulations for nonlinear and mixed integer optimization models, and provides some illustrative applications. Section 1.1 presents the definition and key elements of mathematical models and discusses the characteristics of optimization models. Section 1.2 outlines the mathematical structure of nonlinear and mixed integer optimization problems which represent the primary focus in this book. Section 1.3 illustrates applications of nonlinear and mixed integer optimization that arise in chemical process design of separation systems, batch process operations, and facility location/allocation problems of operations research. Finally, section 1.4 provides an outline of the three main parts of this book. A plethora of applications in all areas of science and engineering employ mathematical models. A mathematical model of a system is a set of mathematical relationships (e.g., equalities, inequalities, logical conditions) which represent an abstraction of the real world system under consideration. Mathematical models can be developed using (i) fundamental approaches, (ii) empirical methods, and (iii) methods based on analogy. In (i), accepted theories of sciences are used to derive the equations (e.g., Newton’s Law). In (ii), input-output data are employed in tandem with statistical analysis principles so as to generate empirical or “black box” models. In (iii), analogy is employed in determining the essential features of the system of interest by studying a similar, well understood system. The variables can take different values and their specifications define different states of the system. They can be continuous, integer, or a mixed set of continuous and integer. The parameters are fixed to one or multiple specific values, and each fixation defines a different model. The constants are fixed quantities by the model statement. The mathematical model relations can be classified as equalities, inequalities, and logical conditions. The model equalities are usually composed of mass balances, energy balances, equilibrium relations, physical property calculations, and engineering design relations which describe the physical phenomena of the system. The model inequalities often consist of allowable operating regimes, specifications on qualities, feasibility of heat and mass transfer, performance requirements, and bounds on availabilities and demands. The logical conditions provide the connection between the continuous and integer variables.
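In the notation the chapter goes on to develop, such mixed-integer nonlinear models have the generic textbook form

\begin{aligned}
\min_{x,\,y} \quad & f(x, y) \\
\text{s.t.} \quad  & h(x, y) = 0 \\
                   & g(x, y) \leq 0 \\
                   & x \in X \subseteq \mathbb{R}^{n},\; y \in \{0, 1\}^{q},
\end{aligned}

where the equalities $h$ collect balances and design relations, the inequalities $g$ collect specifications and bounds, $x$ holds the continuous variables and $y$ the binary ones.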

Conference papers on the topic "Input-output specification model"

1

Dai, Xudong, and Xuefen Ma. "Product Design Knowledge Model in Distributed Resource Environment." In ASME 2014 12th Biennial Conference on Engineering Systems Design and Analysis. American Society of Mechanical Engineers, 2014. http://dx.doi.org/10.1115/esda2014-20352.

Abstract:
Product design is a process of knowledge flowing and integrating. From this perspective, product design in a distributed resource environment can be defined as the process of constructing a product knowledge model to satisfy the specific needs of customers. This paper studies and builds structured models of product design knowledge, i.e. a customer need knowledge model, a DE knowledge model and a RU knowledge model. A product design knowledge model is the structured specification of a desired artifact, including customer Need, Function with Environment (constraints), Physical principle with its Structure and Implement method and Technology detail (FEPSAT). A customer need knowledge model is constructed from customer Group feature, need Content and product Meaning (GCM). The DE knowledge model is constructed from Value, Profession, Culture and Experience background (VPCE). The RU knowledge model is built from knowledge service Content, service Input and service Output (CIO). The evolution of the product design knowledge models and their interaction with each other are analyzed.
2

Springer, Scott L., and Rajit Gadh. "Haptic Feedback for Virtual Reality Computer Aided Design." In ASME 1997 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 1997. http://dx.doi.org/10.1115/imece1997-0001.

Abstract:
Many advances in computer aided design (CAD) have been made to support more rapid product development. One key area that has not been adequately addressed is rapid model creation for the exploration and communication of initial or conceptual design ideas. The human computer interface for current CAD systems remains tedious, detail oriented, and time consuming, and because of this CAD cannot be utilized effectively in the early stages of a design project. This is a considerable deficit, since the visualization of spatial models at the most crucial (concept) stage of design would be of great value. It is during this stage that most decisions controlling overall product performance limits are made. To address this need, a new approach to shape modeling has been proposed, which incorporates a virtual reality (VR) design interface. The initial interface specification includes multi-modal input/output consisting of speech input, auditory output, spatial (glove) input, and three dimensional stereographic display output. It is desirable to augment this interface with an additional output mode in the form of haptic or touch sensation to more efficiently and intuitively interact with concept solid models. The hypothesis is that the addition of this output mode will (i) increase the efficiency in complex geometry creation, and (ii) aid in the understanding and exploration of a wide variety of design concepts. The desire for haptic feedback is shared by many developers of VR environments, for a wide variety of applications. Thus, there has been a recent increase in research interest in this area. The development of haptic feedback systems remains a multi-disciplinary challenge as an effective system must address perceptual, comfort, control, and mechanical issues that arise from the closely coupled human-machine dynamics present in a haptic feedback system. In this paper, we evaluate the general approaches for haptic feedback design in light of CAD tasks. Inherent limitations to the various approaches are identified and evaluated. The limitations are compared with the device complexity and value added for CAD operations.
3

Wang, Nanxin, and Jie Cheng. "EMAT: An Engineering Methodology Application Tool." In ASME 1995 15th International Computers in Engineering Conference and the ASME 1995 9th Annual Engineering Database Symposium collocated with the ASME 1995 Design Engineering Technical Conferences. American Society of Mechanical Engineers, 1995. http://dx.doi.org/10.1115/cie1995-0730.

Abstract:
More and more applications in today's automotive industry call for integration of existing product design/analysis programs into packages that perform a higher level of system functionality, such as total engine analysis, model-based engine mapping, and powertrain system or vehicle optimization. The functional and procedural specifications for these integrations are often referred to as engineering methodologies. To enable the rapid prototyping of these methodologies, a generic software integration framework, EMAT (Engineering Methodology Application Tool), has been developed. EMAT consists of a high-level language environment MDL (Methodology Description Language) and a program for process execution scheduling and monitoring based on an artificial intelligence technology called Blackboard. Under the EMAT framework, a user can easily specify the control flow and data flow for any methodology in a declarative manner. Such a specification only needs to contain the logical order in which individual component programs will be executed (such as sequence, branching, or looping) and the input/output connections between the programs. EMAT then dynamically interprets this specification into procedures that actually carry out the execution. In contrast to conventional integration practices such as developing application-specific scripts, EMAT provides a generic and high-level means for integration, which improves not only the efficiency of programming, but also the modularity, maintainability, and reusability of the software. EMAT is currently being applied to the integration of multiple engine simulation programs to prototype complicated engineering methodologies for a wide range of applications within FORD.
4

Moradi, Hamed, and Majid Saffar-Avval. "Nonlinear Control of an Air Handling Unit Using Feedback Linearization." In ASME 2009 International Mechanical Engineering Congress and Exposition. ASMEDC, 2009. http://dx.doi.org/10.1115/imece2009-11535.

Abstract:
Heating, ventilation and air conditioning (HVAC) systems are equipment used to maintain satisfactory comfort conditions in buildings, and the energy consumption of ventilated buildings depends strongly on the design, performance and control of their HVAC systems. In this paper, a nonlinear model of a multi-variable HVAC system is considered in which the control inputs are the air and cool water flow rates. Using thermodynamics and heat transfer principles, the differential and, consequently, state-space equations of the system are derived. To achieve good performance, dynamic variables such as output temperature and relative humidity must be controlled. Using input-output feedback linearization, a PI controller is designed. It is shown that by applying the controller, the system tracks from one operating point to another with an appropriate time-response specification. In addition, the use of feedback linearization guarantees robustness of the system against the parametric uncertainties associated with the dynamic model.
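For a scalar system $\dot{x} = f(x) + g(x)u$ with $y = x$, the feedback-linearizing law $u = (v - f(x))/g(x)$ makes the closed loop exactly $\dot{y} = v$, and a PI loop then shapes the tracking dynamics. A toy simulation sketch (the plant and gains are illustrative, not the paper's HVAC model):

import math

def f(x): return -0.5 * x + 0.1 * x**2      # toy nonlinear drift
def g(x): return 1.0 + 0.2 * math.sin(x)    # toy input gain (never zero)

x, ref, integral, dt = 0.0, 2.0, 0.0, 0.01
for _ in range(5000):
    e = ref - x
    integral += e * dt
    v = 3.0 * e + 1.0 * integral            # PI outer loop on the linearized plant
    u = (v - f(x)) / g(x)                   # feedback-linearizing inner loop
    x += (f(x) + g(x) * u) * dt             # by construction this is x += v * dt

print(round(x, 3))                           # tracks the set point ref = 2.0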
5

Jadhav, Sharad P., Rajan H. Chile, and Satish T. Hamde. "Modeling and Design of Fractional-Order IMC Based Controller for Power Plant Gas Turbine." In ASME 2015 Gas Turbine India Conference. American Society of Mechanical Engineers, 2015. http://dx.doi.org/10.1115/gtindia2015-1264.

Abstract:
Fractional-order modeling and controller design by simplified means is a demanding research area that is gaining more and more momentum. This paper is an attempt to apply fractional-order modeling and controller design to a power plant gas turbine. The gas turbine is among the most important equipment in the power, aviation and automotive industries; it converts the thermal energy of fuel into mechanical power, so an important requirement of a gas turbine system is to control the flow of input fuel. The existing identified model of the gas turbine, between the input fuel flow and the output speed, is of high order and integer type; it is reduced to simple and compact integer-order (IO) and fractional-order (FO) models using a local optimization technique. A fractional-order internal model controller (FO-IMC) is designed and, to show its performance efficacy, is compared with an integer-order internal model controller (IO-IMC) designed using the same methodology and specifications. Simulation results show that the FO-IMC based controller gives better performance for set-point tracking, plant uncertainty and disturbance rejection than the IO-IMC. The FO-IMC controller also satisfies the robust stability condition.
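As a sketch of the ingredients (a commonly used fractional-order template, not necessarily the paper's exact model), the reduced model and the IMC law take the form

G(s) = \frac{K e^{-L s}}{T s^{\alpha} + 1}, \qquad 0 < \alpha < 2,

Q(s) = \tilde{G}_{-}^{-1}(s) \, \frac{1}{\lambda s^{\alpha} + 1}, \qquad
C(s) = \frac{Q(s)}{1 - \tilde{G}(s) Q(s)},

where $\tilde{G}_{-}$ is the invertible part of the model, $\lambda$ tunes the filter, and the design reduces to the IO-IMC case when $\alpha = 1$.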
6

Krishnan, Girish, Charles Kim, and Sridhar Kota. "A Lumped-Model Based Building-Block Concatenation for a Conceptual Compliant Mechanism Synthesis." In ASME 2008 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2008. http://dx.doi.org/10.1115/detc2008-49982.

Abstract:
Present building-block synthesis techniques for compliant mechanisms [4–7] account for the kinematic behavior of the mechanism alone, leaving the stiffness, manufacturability and mechanical efficiency to be determined by the shape-size optimization process. In this effort, we aim to generate practical and feasible conceptual designs by designing for kinematics and stiffness simultaneously. To enable this, we use a lumped spring-lever model, which intuitively characterizes the stiffness and the kinematics of a deformable-compliant building block with distinct input and output points. This model aids in the understanding of how the stiffness and the kinematics of building blocks combine when concatenated to form a mechanism. We use this understanding to synthesize compliant mechanisms by combining building blocks of known motion characteristics. A simple compliant-dyad building block is characterized for its lumped values of stiffness and kinematics. The concatenation of these dyad building blocks is solved in detail, and guidelines for conceptual synthesis are proposed. Two practical examples are solved: a motion amplifier for a piezo-stack and a compliant energy storage mechanism for a staple gun. The conceptual designs obtained from this approach are very close to the kinematic and stiffness requirements of the application, thus minimizing the role of shape and size optimization in achieving the problem specification. The model, when extended to higher dimensions, may be used to solve for precision positioning and other applications.
7

Roy, U., R. Sudarsan, R. D. Sriram, K. W. Lyons, and M. R. Duffey. "Information Architecture for Design Tolerancing: From Conceptual to the Detail Design." In ASME 1999 Design Engineering Technical Conferences. American Society of Mechanical Engineers, 1999. http://dx.doi.org/10.1115/detc99/dac-8704.

Abstract:
Tolerance design is the process of deriving a description of geometric tolerance specifications for a product from a set of specifications on the desired properties of the product. Existing approaches to tolerance analysis and synthesis entail detailed knowledge of the geometry of assemblies and are mostly applicable during advanced stages of design, leading to a less than optimal design process. During the design process of assemblies, both the assembly structure and the associated tolerance information evolve continuously, and significant gains can be achieved by effectively using this information to influence the design of the assembly. Any proactive approach to assembly or tolerance analysis in the early design stages will involve decision making with incomplete information models. In order to carry out early tolerance synthesis and analysis in the conceptual stages of product design, we need to devise techniques for representing function-behavior-assembly models that allow analysis and synthesis of tolerances even with an incomplete data set. A 'function' (what the system is for) is associated with the transformation of an input physical entity into an output physical entity by the system. The problem or customer's need, initially described by functional requirements on an assembly and associated constraints on those requirements, drives the concept of an assembly. This specification of functional requirements and constraints defines a functional model for the assembly. Many researchers have studied functional representation (function-based taxonomy and ontology), function-to-form mapping, and behavior representation (behavior meaning how the system/product works). However, there is no comprehensive function-assembly-behavior (FAB) integrated model. In this paper, we discuss the integration of function, assembly, and behavior representation into a comprehensive information model (the FAB model). To do this, we need to develop appropriate assembly models and tolerance models that would enable the designer to incrementally understand the build-up or propagation of tolerances (i.e., constraints) and optimize the layout, features, or assembly realizations. This will ensure ease of tolerance delivery.
8

Pourgol-Mohammad, Mohammad. "Uncertainty Propagation in Complex Codes Calculations." In 2013 21st International Conference on Nuclear Engineering. American Society of Mechanical Engineers, 2013. http://dx.doi.org/10.1115/icone21-16570.

Abstract:
Uncertainty propagation is an important segment of quantitative uncertainty analysis for computations with complex computational codes (e.g., RELAP5 thermal-hydraulics). Different sampling techniques, dependencies between uncertainty sources, and accurate inference on results are among the issues to be considered. The dynamic behavior of the system codes, executed in each time step, results in the transformation of accumulated errors and uncertainties to the next time step. Depending on the facility type, availability of data, scenario specification, computing machine and the software used, propagation of uncertainty yields considerably different results. This paper discusses the practical considerations of uncertainty propagation for code computations. The study evaluates the implications of this complexity for propagation of the uncertainties through inputs, sub-models and models. The study weighs different propagation techniques and their statistics, considering their advantages and limitations in dealing with the problem. The methods considered are response surface, Monte Carlo (including simple, Latin hypercube, and importance sampling) and bootstrap techniques. As a case study, the paper discusses uncertainty propagation in the Integrated Methodology on Thermal-Hydraulics Uncertainty Analysis (IMTHUA). The methodology comprehensively covers various aspects of complex code uncertainty assessment for important accident transients. It explicitly examines the TH code structural uncertainties by treating internal sub-model uncertainties and by propagating such model uncertainties along with parameters in the code calculations. The two-step specification of IMTHUA (an input phase followed by output updating) makes it a special case, ensuring that the figure-of-merit statistical coverage is achieved at the end with the target confidence level. Tolerance limit statistics provide a confidence level on the degree of coverage depending on the sample size, the number of output measures, and whether one-sided or two-sided statistics are used. This information should be transferred to the second phase in the form of a probability distribution for each of the output measures. The research question is how to use data to develop such distributions from the corresponding tolerance limit statistics. Two approaches, the extreme values method and Bayesian updating, are selected to estimate the parametric distribution parameters and compare the coverage with respect to the selected coverage criteria. The analysis is demonstrated on the large break loss of coolant accident for the LOFT test facility.
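The tolerance-limit statistics mentioned above are typically the classical Wilks formulas; a small sketch computing the smallest sample size n such that, with confidence beta, at least a fraction gamma of the output population is covered (one-sided):

def wilks_n(gamma=0.95, beta=0.95, order=1):
    n = 1
    while True:
        if order == 1:                  # first order: 1 - gamma^n >= beta
            conf = 1.0 - gamma**n
        else:                           # second order (next-to-largest sample)
            conf = 1.0 - gamma**n - n * (1.0 - gamma) * gamma**(n - 1)
        if conf >= beta:
            return n
        n += 1

print(wilks_n())              # 59 -> the well-known first-order 95/95 result
print(wilks_n(order=2))       # 93 -> second order requires more code runs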
9

Yang, Dongzhe, Kourosh Danai, and David Kazmer. "A Knowledge-Based Tuning Method for Injection Molding Machines." In ASME 2000 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2000. http://dx.doi.org/10.1115/imece2000-2326.

Abstract:
Complexity of manufacturing processes has hindered methodical specification of machine setpoints for improving productivity. Traditionally in injection molding, the machine setpoints are assigned either by trial and error, based on the heuristic knowledge of an experienced operator, or according to an empirical model between the inputs and part quality attributes obtained from statistical design of experiments (DOE). In this paper, a Knowledge-Based Tuning (KBT) Method is presented which takes advantage of a priori knowledge of the process, in the form of a qualitative model, to reduce the demand for experimentation. The KBT Method is designed to provide an estimate of the process feasible region (process window) as the basis for finding the optimal setpoints, and to update its knowledge base according to new input-output data that become available during tuning. The KBT Method's utility is demonstrated in the production of digital video disks (DVDs).
10

Pascoe, Jason, Yuksel Parlatan, B. McLaughlin, and Sophia Fung. "Application of Uncertainty Analysis in the Comparison of Void Fraction Calculations With Experiment." In ASME 2010 3rd Joint US-European Fluids Engineering Summer Meeting collocated with 8th International Conference on Nanochannels, Microchannels, and Minichannels. ASMEDC, 2010. http://dx.doi.org/10.1115/fedsm-icnmm2010-31027.

Abstract:
Safety analysis computer codes are designed to simulate phenomena relevant to the assessment of normal and transient behaviour in nuclear power plants. In order to do so, models of relevant phenomena are developed, and a set of such models constitutes a computer code. In accident or transient analysis, the values of certain output parameters (margin parameters) are used to characterize the severity of the event. The accuracy of the computer code in calculating these margin parameters is usually obtained through validation, and variation in the margin parameter is estimated through the propagation of variation in the code input. A method for estimating code uncertainty with respect to a specific output parameter has been developed. The methodology has the following basic elements: (1) specification and ranking of phenomena that govern the behaviour of the output parameter for which an uncertainty range is required; (2) identification of models within the code that represent the relevant phenomena; (3) determination of the governing parameters for the phenomenological models and identification of uncertainty ranges for the governing model parameters from validation or scientific basis, if available; (4) decomposition of the governing model parameters into related parameters; (5) identification of uncertainty ranges for the modelling parameters for use in best estimate analysis; (6) design and execution of a case matrix; and (7) estimation of the code uncertainty through quantification of the variability in output parameters arising from uncertainty in modelling parameters. This methodology has been employed using simulations of Large Break Loss of Coolant Accident (LOCA) tests in the RD-14M test facility to calculate the uncertainty in the TUF thermal hydraulics code calculation of the coolant void fraction. The uncertainty has been estimated with and without plant parameters (parameters specific to the RD-14M test loop). The TUF coolant void fraction uncertainty without plant parameters was determined to be 0.08, while the uncertainty with plant parameters included was determined to be 0.11. The uncertainty value without plant parameters is comparable to the uncertainty in the measurements (0.09). The uncertainty value with plant parameters is larger than the variation in the bias (0.10) of the TUF calculation of void fraction. From these findings, it can be concluded that the estimated accuracy of the TUF code calculation of void fraction is consistent with the available experimental data.

Reports on the topic "Input-output specification model"

1

Sinclair, Samantha, and Sally Shoop. Automated detection of austere entry landing zones : a “GRAIL Tools” validation assessment. Engineer Research and Development Center (U.S.), August 2022. http://dx.doi.org/10.21079/11681/45265.

Abstract:
The Geospatial Remote Assessment for Ingress Locations (GRAIL) Tools software is a geospatial product developed to locate austere entry landing zones (LZs) for military aircraft. Using spatial datasets like land classification and slope, along with predefined LZ geometry specifications, GRAIL Tools generates binary suitability filters that distinguish between suitable and unsuitable terrain. GRAIL Tools combines input suitability filters, searches for LZs at user-defined orientations, and plots results. To refine GRAIL Tools, we: (a) verified software output; (b) conducted validation assessments using five unpaved LZ sites; and (c) assessed the effect of input dataset resolution on outcomes using 30-m and 1-m datasets. The software was verified and validated in California and the Baltics, and all five LZs were correctly identified in either the 30-m or the 1-m data. The 30-m data provided numerous LZs for consideration, while the 1-m data highlighted hazardous conditions undetected in the 30-m data. Digital elevation model grid size affected results, as the 1-m data produced overestimated slope values; resampling the data to 5 m resulted in more realistic slopes. Results indicate GRAIL Tools is an asset the military can use to rapidly assess terrain conditions.