Academic literature on the topic 'Uniform metric system'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Uniform metric system.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Uniform metric system"

1

Ahmadi, Seyyed. "Shadowing, ergodic shadowing and uniform spaces." Filomat 31, no. 16 (2017): 5117–24. http://dx.doi.org/10.2298/fil1716117a.

Abstract:
We introduce and study the topological concepts of ergodic shadowing, chain transitivity and topological ergodicity for dynamical systems on non-compact non-metrizable spaces. These notions generalize the relevant concepts for metric spaces. We prove that a dynamical system with topological ergodic shadowing property is topologically chain transitive, and that topological chain transitivity together with topological shadowing property implies topological ergodicity.
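
For readers unfamiliar with the terminology, the classical shadowing property on a metric space — the notion the abstract above generalizes to non-metrizable spaces — can be stated as follows (a standard textbook formulation, not taken from the paper itself):

```latex
% Shadowing for a continuous map f on a metric space (X,d):
% every delta-pseudo-orbit is epsilon-shadowed by a true orbit.
\forall \varepsilon > 0 \;\exists \delta > 0 :\quad
  \bigl( d(f(x_i), x_{i+1}) < \delta \ \ \forall i \bigr)
  \;\Longrightarrow\;
  \exists z \in X :\ d\bigl(f^{\,i}(z), x_i\bigr) < \varepsilon \ \ \forall i .
```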
2

Fry, Edward W. S., Sophie Triantaphillidou, Robin B. Jenkin, Ralph E. Jacobson, and John R. Jarvis. "Noise Power Spectrum Scene-Dependency in Simulated Image Capture Systems." Electronic Imaging 2020, no. 9 (January 26, 2020): 345–1. http://dx.doi.org/10.2352/issn.2470-1173.2020.9.iqsp-345.

Abstract:
The Noise Power Spectrum (NPS) is a standard measure for image capture system noise. It is derived traditionally from captured uniform luminance patches that are unrepresentative of pictorial scene signals. Many contemporary capture systems apply nonlinear content-aware signal processing, which renders their noise scene-dependent. For scene-dependent systems, measuring the NPS with respect to uniform patch signals fails to characterize with accuracy: i) system noise concerning a given input scene, ii) the average system noise power in real-world applications. The scene-and-process-dependent NPS (SPD-NPS) framework addresses these limitations by measuring temporally varying system noise with respect to any given input signal. In this paper, we examine the scene-dependency of simulated camera pipelines in-depth by deriving SPD-NPSs from fifty test scenes. The pipelines apply either linear or non-linear denoising and sharpening, tuned to optimize output image quality at various opacity levels and exposures. Further, we present the integrated area under the mean of SPD-NPS curves over a representative scene set as an objective system noise metric, and their relative standard deviation area (RSDA) as a metric for system noise scene-dependency. We close by discussing how these metrics can also be computed using scene-and-process-dependent Modulation Transfer Functions (SPD-MTF).
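
As background, the conventional uniform-patch NPS that the abstract contrasts with the SPD-NPS framework can be estimated from a stack of nominally uniform crops. The sketch below is a minimal, generic estimator in NumPy; the function name, the normalisation choice, and the `pixel_pitch` parameter are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def noise_power_spectrum(patches, pixel_pitch=1.0):
    """Estimate a 2-D NPS from nominally uniform luminance crops.

    patches: array of shape (num_patches, N, N); pixel_pitch is the
    sample spacing. Returns the ensemble-averaged power spectrum.
    """
    patches = np.asarray(patches, dtype=float)
    num, n, _ = patches.shape
    nps = np.zeros((n, n))
    for p in patches:
        residual = p - p.mean()                      # remove the mean (DC) level
        spectrum = np.fft.fftshift(np.fft.fft2(residual))
        nps += np.abs(spectrum) ** 2                 # accumulate noise power
    return nps * (pixel_pitch ** 2) / (num * n * n)  # one common normalisation
```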
3

BARLOS, FOTIOS, and OPHIR FRIEDER. "JOIN WORKLOAD PARTITIONING UNDER UNIFORM AND SKEWED INPUT RELATIONS." Parallel Processing Letters 04, no. 01n02 (June 1994): 95–104. http://dx.doi.org/10.1142/s0129626494000119.

Abstract:
Parallel Join algorithms partition the workload into buckets and assign each bucket to a node of the multiprocessor system. The existing algorithms use the volume of the load as a metric to determine the bucket boundaries. When the input relations exhibit a high degree of skew, the above metric does not achieve uniform partitioning. We propose a new method to partition the workload of the Join operation that guarantees near-equal execution time of the created buckets. We present results obtained from the Intel i860 hypercube system that support our theory.
4

LI, DESHENG, and P. E. KLOEDEN. "EQUI-ATTRACTION AND THE CONTINUOUS DEPENDENCE OF PULLBACK ATTRACTORS ON PARAMETERS." Stochastics and Dynamics 04, no. 03 (September 2004): 373–84. http://dx.doi.org/10.1142/s0219493704001061.

Abstract:
The equi-attraction properties of uniform pullback attractors [Formula: see text] of nonautonomous dynamical systems (θ,ϕλ) with a parameter λ∈Λ, where Λ is a compact metric space, are investigated; here θ is an autonomous dynamical system on a compact metric space P which drives the cocycle ϕλ on a complete metric state space X. In particular, under appropriate regularity conditions, it is shown that the equi-attraction of the family [Formula: see text] uniformly in p∈P is equivalent to the continuity of the set-valued mappings [Formula: see text] in λ with respect to the Hausdorff metric on the nonempty compact subsets of X.
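
The Hausdorff metric referred to in the abstract is the standard distance on nonempty compact subsets of the state space X:

```latex
% Hausdorff distance between nonempty compact subsets A, B of (X, d)
d_H(A,B) \;=\; \max\Bigl\{ \,\sup_{a \in A} \inf_{b \in B} d(a,b),\;
                             \sup_{b \in B} \inf_{a \in A} d(a,b) \Bigr\}
```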
5

Agop, Maricel, Tudor-Cristian Petrescu, Dumitru Filipeanu, Claudia Elena Grigoraș-Ichim, Ana Iolanda Voda, Andrei Zala, Lucian Dobreci, Constantin Baciu, and Decebal Vasincu. "Toward Complex Systems Dynamics through Flow Regimes of Multifractal Fluids." Symmetry 13, no. 5 (April 27, 2021): 754. http://dx.doi.org/10.3390/sym13050754.

Abstract:
In the framework of the Multifractal Theory of Motion, which is expressed by means of the multifractal hydrodynamic model, complex system dynamics are explained through uniform and non-uniform flow regimes of multifractal fluids. Thus, in the case of the uniform flow regime of the multifractal fluid, the dynamics’ description is “supported” only by the differentiable component of the velocity field, the non-differentiable component being null. In the case of the non-uniform flow regime of the multifractal fluid, the dynamics’ description is “supported” by both components of the velocity field, their ratio specifying correlations through homographic transformations. Since these transformations imply metric geometries explained, for example, by means of Killing–Cartan metrics of the SL(2R)-type algebra, of the set of 2 × 2 matrices with real elements, and because these metrics can be “produced” as Cayleyan metrics of absolute geometries, the dynamics’ description is reducible, based on a minimal principle, to harmonic mappings from the usual space to the hyperbolic space. Such a conjecture highlights not only various scenarios of dynamics’ evolution but also the types of interactions “responsible” for these scenarios. Since these types of interactions become fundamental in the self-structuring processes of polymeric-type materials, finally, the theoretical model is calibrated based on the author’s empirical data, which refer to controlled drug release applications.
6

Olshevskyi, M. S. "Metric properties of Cayley graphs of alternating groups." Carpathian Mathematical Publications 13, no. 2 (November 19, 2021): 545–81. http://dx.doi.org/10.15330/cmp.13.2.545-581.

Abstract:
A well-known diameter search problem for finite groups with respect to their systems of generators is considered. The problem can be formulated as follows: find the diameter of a group over its system of generators. The diameter of a group over a specific system of generators is the diameter of the corresponding Cayley graph. Alternating groups are considered with the classic irreducible system of generators consisting of cycles of length three of the form $(1,2,k)$. The main part of the paper concentrates on analyzing how even permutations decompose with respect to this system of generators. Rules for moving generators in a permutation's decomposition from left to right and from right to left are introduced. These rules give rise to transformations of decompositions that do not increase their lengths. They are applied to remove fixed points of a permutation that were included in its decomposition. Based on these rules, the stability of the system of generators is proved. The strict growth property of the system of generators is also proved, as a corollary of the transformation rules and the stability property. The homogeneous theory introduced in the author's previous paper is then considered. For the series of alternating groups with the system of generators mentioned above, it is shown that this series is uniform and homogeneous. This makes it possible to apply the homogeneous down search algorithm to compute the diameter. The algorithm is applied, and exact values of the diameters of alternating groups of degree up to 43 are computed.
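
To make the diameter notion concrete, the sketch below computes the diameter of the Cayley graph of A_n over the generators (1,2,k) and their inverses by brute-force breadth-first search. It is only an illustration for small degrees — the state space has n!/2 elements — and is not the homogeneous down search algorithm used in the paper; all names are illustrative.

```python
from collections import deque

def three_cycle(n, k):
    """0-indexed permutation tuple for the 3-cycle (1, 2, k)."""
    p = list(range(n))
    p[0], p[1], p[k - 1] = 1, k - 1, 0
    return tuple(p)

def compose(p, q):
    """(p o q)[i] = p[q[i]]: apply q first, then p."""
    return tuple(p[q[i]] for i in range(len(p)))

def cayley_diameter(n):
    """BFS diameter of the undirected Cayley graph of A_n over {(1,2,k)}."""
    gens = [three_cycle(n, k) for k in range(3, n + 1)]
    gens += [compose(g, g) for g in gens]   # a 3-cycle's inverse is its square
    identity = tuple(range(n))
    dist = {identity: 0}
    queue = deque([identity])
    while queue:
        p = queue.popleft()
        for g in gens:
            q = compose(p, g)
            if q not in dist:
                dist[q] = dist[p] + 1
                queue.append(q)
    return max(dist.values())

print(cayley_diameter(5))   # feasible only for small n
```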
7

Ahsanullah, T. M. G., and Gunther Jäger. "Quantale-Valued Uniformizations of Quantale-Valued Generalizations of Approach Groups." New Mathematics and Natural Computation 15, no. 03 (October 7, 2019): 517–38. http://dx.doi.org/10.1142/s1793005719500303.

Abstract:
We introduce the categories of quantale-valued approach uniform spaces and quantale-valued uniform gauge spaces, and prove that they are topological categories. We first show that the category of quantale-valued uniform gauge spaces is a full bireflective subcategory of the category of quantale-valued approach uniform spaces; second, we prove that only under strong restrictions on the quantale are these two categories isomorphic. Besides presenting embeddings of the category of quantale-valued metric spaces into the categories of quantale-valued approach uniform spaces as well as quantale-valued uniform gauge spaces, we show that every quantale-valued approach system group and quantale-valued gauge group has a natural underlying quantale-valued approach uniform space, respectively a quantale-valued uniform gauge space.
8

Грицюк, Ю. І., and Т. О. Муха. "Methods of determination of quality of software." Scientific Bulletin of UNFU 30, no. 1 (February 27, 2020): 158–67. http://dx.doi.org/10.36930/40300127.

Abstract:
A modern software tool for determining the quality of software (SW) by means of metric analysis has been developed. The tool uses quality metrics to calculate the corresponding metric values and to determine a composite quality index of a software product. The quality assessment process is clarified, and the concept of software product quality as an object of standardization, along with the performance levels of software quality models, is analyzed. This makes it possible to improve software quality by generating relevant requirements for the quality evaluation criteria, and to improve the metric analysis of quality models and the methods of their quantitative measurement in all phases of a software project. It was revealed that the driving force behind the success of software projects is the desire of their leaders to develop software that has a certain value: it should matter for specific tasks or for achieving tactical and strategic objectives. The value of the software can be expressed as its cost or in some other form. The customer usually has their own idea of the maximum cost of investing in the development of the software, of the profit they expect to obtain if the main goals of using the software are achieved, and may also have a vision of the software's functionality and certain expectations of its quality. The features of using metric analysis to determine software quality are examined, and the lack of uniform standards for metrics is revealed; as a result, each supplier of a measurement system offers its own methods of evaluating software quality and its own associated metrics. Interpreting metric values is also challenging, since for the majority of users the metrics and their values are not entirely clear or informative. It was found that the main parameters for choosing a software variant are its cost, the duration of the development process, and the reputation of the developer company; however, decisions taken on the basis of these parameters do not always guarantee proper quality of the software.
9

Ozturk, Mustafa C., Dongming Xu, and José C. Príncipe. "Analysis and Design of Echo State Networks." Neural Computation 19, no. 1 (January 2007): 111–38. http://dx.doi.org/10.1162/neco.2007.19.1.111.

Abstract:
The design of echo state network (ESN) parameters relies on the selection of the maximum eigenvalue of the linearized system around zero (spectral radius). However, this procedure does not quantify in a systematic manner the performance of the ESN in terms of approximation error. This article presents a functional space approximation framework to better understand the operation of ESNs and proposes an information-theoretic metric, the average entropy of echo states, to assess the richness of the ESN dynamics. Furthermore, it provides an interpretation of the ESN dynamics rooted in system theory as families of coupled linearized systems whose poles move according to the input signal dynamics. With this interpretation, a design methodology for functional approximation is put forward where ESNs are designed with uniform pole distributions covering the frequency spectrum to abide by the richness metric, irrespective of the spectral radius. A single bias parameter at the ESN input, adapted with the modeling error, configures the ESN spectral radius to the input-output joint space. Function approximation examples compare the proposed design methodology versus the conventional design.
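
For context, a bare-bones echo state network update with the usual spectral-radius rescaling is sketched below. This is the conventional ESN construction the article analyses, not the pole-placement design methodology it proposes; function names and sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(n_states, spectral_radius=0.9):
    """Random reservoir matrix rescaled to a target spectral radius."""
    w = rng.standard_normal((n_states, n_states))
    return w * spectral_radius / np.max(np.abs(np.linalg.eigvals(w)))

def run_esn(w, w_in, inputs):
    """Echo state update x(t+1) = tanh(W x(t) + W_in u(t))."""
    x = np.zeros(w.shape[0])
    states = []
    for u in inputs:
        x = np.tanh(w @ x + w_in @ np.atleast_1d(u))
        states.append(x.copy())
    return np.array(states)

# Example: 100-unit reservoir driven by a scalar sine input.
w = make_reservoir(100)
w_in = rng.standard_normal((100, 1))
states = run_esn(w, w_in, np.sin(np.linspace(0, 10, 200)))
```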
10

Amadid, Jamal, Abdelfettah Belhabib, Asma Khabba, Zakaria El Ouadi, and Abdelouhab Zeroual. "Channel Estimation Evaluation For a Massive MIMO System Considering Spatially Correlated Channels in an Urban Network." E3S Web of Conferences 351 (2022): 01055. http://dx.doi.org/10.1051/e3sconf/202235101055.

Abstract:
Channel estimation (CE) is an important process that is performed during the pilot transmission phase in each base station. This work addresses this process for massive multiple-input multiple-output systems by studying the scenario where the channels are spatially correlated. Throughout this work, the spatial correlation between channels is modeled using the exponential correlation model. The minimum mean square error (MMSE) estimator's performance for uncorrelated and correlated channels is compared and examined using the normalized mean square error (NMSE) metric, where the correlated scenario is presented through two array designs, namely a uniform planar array (UPA) and a uniform linear array (ULA). In comparison to the uncorrelated situation, the correlated channels scenario is a more practical scenario that represents the real-world environment and provides superior channel estimate quality, since spatial correlation is advantageous for CE. Hence, we propose a UPA arrangement for correlated channels, based on the Kronecker product of the ULA arrangement, that outperforms the ULA arrangement. Numerical results are offered in order to support our analytical study.
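
The exponential correlation model and the NMSE metric mentioned in the abstract have simple closed forms; a minimal NumPy sketch is given below (the real-valued correlation coefficient and the function names are illustrative assumptions):

```python
import numpy as np

def exp_correlation(n_antennas, r):
    """Exponential spatial correlation model: [R]_{i,j} = r**|i-j|, 0 <= r < 1."""
    idx = np.arange(n_antennas)
    return r ** np.abs(idx[:, None] - idx[None, :])

def nmse(h_true, h_est):
    """Normalized mean square error between true and estimated channel vectors."""
    err = np.linalg.norm(h_true - h_est, axis=-1) ** 2
    ref = np.linalg.norm(h_true, axis=-1) ** 2
    return float(np.mean(err / ref))
```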

Dissertations / Theses on the topic "Uniform metric system"

1

Павлова, Галина Вікторівна, and Лариса Петрівна Семененко. "Автографи вчених в книжкової колекції академіка Б. С. Якобі." Thesis, 2019. http://repository.kpi.kharkov.ua/handle/KhPI-Press/41809.

Abstract:
The book collection of academician B. S. Jacobi, which is stored and explored in the NTU "KhPI" Scientific and Technical Library, is a new non-standard portrait of a scientist, presented by a collection of books. The article also provides information on scientists whose autographs are contained in the books of the collection.

Book chapters on the topic "Uniform metric system"

1

Nakazono, Takumi, Ken-ichiro Moridomi, Kohei Hatano, and Eiji Takimoto. "A Combinatorial Metrical Task System Problem Under the Uniform Metric." In Lecture Notes in Computer Science, 276–87. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-46379-7_19.

2

Castiglioni, Valentina, Ruggero Lanotte, and Simone Tini. "Fully Syntactic Uniform Continuity Formats for Bisimulation Metrics." In The Art of Modelling Computational Systems: A Journey from Logic and Concurrency to Security and Privacy, 293–312. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-31175-9_17.

3

Cheng, Yougan, Ronny Straube, Abed E. Alnaif, Lu Huang, Tarek A. Leil, and Brian J. Schmidt. "Virtual Populations for Quantitative Systems Pharmacology Models." In Methods in Molecular Biology, 129–79. New York, NY: Springer US, 2022. http://dx.doi.org/10.1007/978-1-0716-2265-0_8.

Abstract:
Quantitative systems pharmacology (QSP) places an emphasis on dynamic systems modeling, incorporating considerations from systems biology modeling and pharmacodynamics. The goal of QSP is often to quantitatively predict the effects of clinical therapeutics, their combinations, and their doses on clinical biomarkers and endpoints. In order to achieve this goal, strategies for incorporating clinical data into model calibration are critical. Virtual population (VPop) approaches facilitate model calibration while faced with challenges encountered in QSP model application, including modeling a breadth of clinical therapies, biomarkers, endpoints, utilizing data of varying structure and source, capturing observed clinical variability, and simulating with models that may require more substantial computational time and resources than often found in pharmacometrics applications. VPops are frequently developed in a process that may involve parameterization of isolated pathway models, integration into a larger QSP model, incorporation of clinical data, calibration, and quantitative validation that the model with the accompanying, calibrated VPop is suitable to address the intended question or help with the intended decision. Here, we introduce previous strategies for developing VPops in the context of a variety of therapeutic and safety areas: metabolic disorders, drug-induced liver injury, autoimmune diseases, and cancer. We introduce methodological considerations, prior work for sensitivity analysis and VPop algorithm design, and potential areas for future advancement. Finally, we give a more detailed application example of a VPop calibration algorithm that illustrates recent progress and many of the methodological considerations. In conclusion, although methodologies have varied, VPop strategies have been successfully applied to give valid clinical insights and predictions with the assistance of carefully defined and designed calibration and validation strategies. While a uniform VPop approach for all potential QSP applications may be challenging given the heterogeneity in use considerations, we anticipate continued innovation will help to drive VPop application for more challenging cases of greater scale while developing new rigorous methodologies and metrics.
4

Cavallaro, Andrea, and Stefan Winkler. "Perceptual Semantics." In Multimedia Technologies, 1441–55. IGI Global, 2008. http://dx.doi.org/10.4018/978-1-59904-953-3.ch105.

Abstract:
The design of image and video compression or transmission systems is driven by the need for reducing the bandwidth and storage requirements of the content while maintaining its visual quality. Therefore, the objective is to define codecs that maximize perceived quality as well as automated metrics that reliably measure perceived quality. One of the common shortcomings of traditional video coders and quality metrics is the fact that they treat the entire scene uniformly, assuming that people look at every pixel of the image or video. In reality, we focus only on particular areas of the scene. In this chapter, we prioritize the visual data accordingly in order to improve the compression performance of video coders and the prediction performance of perceptual quality metrics. The proposed encoder and quality metric incorporate visual attention and use a semantic segmentation stage, which takes into account certain aspects of the cognitive behavior of people when watching a video. This semantic model corresponds to a specific human abstraction, which need not necessarily be characterized by perceptual uniformity. In particular, we concentrate on segmenting moving objects and faces, and we evaluate the perceptual impact on video coding and on quality evaluation.
5

Cavallaro, Andrea, and Stefan Winkler. "Perceptual Semantics." In Digital Multimedia Perception and Design, 1–20. IGI Global, 2006. http://dx.doi.org/10.4018/978-1-59140-860-4.ch001.

Abstract:
The design of image and video compression or transmission systems is driven by the need for reducing the bandwidth and storage requirements of the content while maintaining its visual quality. Therefore, the objective is to define codecs that maximize perceived quality as well as automated metrics that reliably measure perceived quality. One of the common shortcomings of traditional video coders and quality metrics is the fact that they treat the entire scene uniformly, assuming that people look at every pixel of the image or video. In reality, we focus only on particular areas of the scene. In this chapter, we prioritize the visual data accordingly in order to improve the compression performance of video coders and the prediction performance of perceptual quality metrics. The proposed encoder and quality metric incorporate visual attention and use a semantic segmentation stage, which takes into account certain aspects of the cognitive behavior of people when watching a video. This semantic model corresponds to a specific human abstraction, which need not necessarily be characterized by perceptual uniformity. In particular, we concentrate on segmenting moving objects and faces, and we evaluate the perceptual impact on video coding and on quality evaluation.
6

Hand, David J. "1. A brief history." In Measurement: A Very Short Introduction, 1–16. Oxford University Press, 2016. http://dx.doi.org/10.1093/actrade/9780198779568.003.0001.

Abstract:
‘A brief history’ shows that measurement is at least as old as civilization. Different systems and different units of measurement were developed in different places, with the physical size of natural biological objects frequently being used as a basic unit. The key drivers for a uniform measurement system were trade, the industrial revolution, and scientific advance. In 1960 the Système International d’Unités (SI units) was introduced, consisting of seven basic units: length (metre), mass (kilogram), time (second), electric current (ampere), temperature (degree kelvin), quantity of substance (mole), and luminous intensity (candela). Another twenty-two named units were defined as powers and combinations of these basic seven.
7

Saini, Munish, and Kuljit Kaur Chahal. "A Systematic Review of Attributes and Techniques for Open Source Software Evolution Analysis." In Advances in Systems Analysis, Software Engineering, and High Performance Computing, 1–23. IGI Global, 2018. http://dx.doi.org/10.4018/978-1-5225-5314-4.ch001.

Abstract:
Many studies have been conducted to understand the evolution process of Open Source Software (OSS). Researchers have used various techniques for understanding the OSS evolution process from different perspectives. This chapter reports a meta-data analysis of the systematic literature review on the topic in order to understand its current state and to identify opportunities for the future. This research identified 190 studies, selected against a set of questions, for discussion. It categorizes the research studies into nine categories. Based on the results obtained from the systematic review, there is evidence of a shift in the metrics and methods for OSS evolution analysis over time. The results suggest that there is a lack of a uniform approach to analyzing and interpreting the results. There is a need for more empirical work using a standard set of techniques and attributes to verify the phenomena governing OSS projects. This will help to advance the field and establish a theory of software evolution.
8

Nolte, David D. "Physics and Geometry." In Introduction to Modern Dynamics, 3–52. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780198844624.003.0001.

Abstract:
This chapter emphasizes the importance of a geometric approach to dynamics. The central objects of interest are trajectories of a dynamical system through multidimensional spaces composed of generalized coordinates. Trajectories through configuration space are parameterized by the path length element, which becomes an important feature in later chapters on relativity and metric spaces. Trajectories through state space are defined by mathematical flow equations whose flow fields and flow lines become the chief visualization tool for complex dynamics. Coordinate transformations and Jacobian matrices are used throughout this text, and the transformation to noninertial frames introduces fictitious forces like the Coriolis force that are experienced by observers in noninertial frames. Uniformly rotating frames provide the noninertial reference frames for the description of rigid-body motion.
9

Shahriar, Hossain, and Hisham M. Haddad. "Fuzzy Rule-Based Vulnerability Assessment Framework for Web Applications." In Application Development and Design, 778–97. IGI Global, 2018. http://dx.doi.org/10.4018/978-1-5225-3422-8.ch034.

Abstract:
This paper addresses the problem of assessing risk in web applications due to implementation-level vulnerabilities. In particular, the authors address the common research challenge of finding enough historical data to compute the probability of vulnerabilities and exploitations. They develop a Fuzzy Logic based System (FLS) to compute the risk uniformly and to address the diversity of risks. The authors propose a set of crisp metrics that are used to define fuzzy sets. They also develop a set of rule-bases to assess the risk level. The proposed FLS can be a useful tool to aid application developers and industry practitioners to assess the risk and plan ahead for employing necessary mitigation approaches. The authors evaluate their proposed approach using three real-world web applications implemented in PHP, and apply it to four types of common vulnerabilities. The initial results indicate that the proposed FLS approach can effectively discover high-risk applications.

Conference papers on the topic "Uniform metric system"

1

Qin, Yinghao, Lijun Tian, Chunyang Liu, and Junwei Liu. "Research on Modeling of Electro-thermal Integrated Energy System based on Uniform Energy Metric." In 2021 6th Asia Conference on Power and Electrical Engineering (ACPEE). IEEE, 2021. http://dx.doi.org/10.1109/acpee51499.2021.9436993.

2

Lucero, Briana M., and Matthew J. Adams. "Common Functionality Across Engineering Domains Through Transfer Functions and Bond Graphs." In ASME 2016 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2016. http://dx.doi.org/10.1115/detc2016-59769.

Abstract:
Functional Modeling allows a direct, and sometimes abstract, method for depicting a product. Through this method, product architecture, concept generation and physical modeling can be used to obtain repeatable and more meaningful results. The Functional Basis approach of engineering design, as taught to engineering design students, provides the vocabulary to produce a uniform approach to function structures with functions (verbs) and flows (nouns). This paper suggests that the flows, particularly the "signal" flows, can be correlated to additional domains through transfer functions common in controls engineering. Controls engineering employs transfer functions to mathematically represent the physical or digital functions of a system or product, using block diagrams to show the individual steps. The research herein suggests correlations between the mathematical representations of transfer functions and the functional basis of engineering design through the actions performed upon "signal" flows. Specifically, the methodologies employed by controls engineering can relate to engineering design by 1) schematic similarities, 2) quantifiable performance metric inputs/outputs, 3) mathematical representations of the flows, and 4) isomorphic matching of the schematics. Controls systems use block diagrams to represent the sequential steps of the system; these block diagrams parallel the function structures of engineering design. Performance metrics between the two domains can be complementary when decomposed down to non-dimensional engineering units. Mathematical functions of the actions in a controls system can resemble the functional basis functions through the use of bond graphs, by identifying the characteristic behavior of the functions on the flows. Isomorphic matching using the schematic diagrams can be used to find analogies based upon similar functionality and target performance metrics. When these four similarities are present, parallels between the engineering design domain and controls engineering can be established. Examples of cross-domain matching via transfer functions and controls systems are provided as contextualization for the concepts proposed. Pathways forward for this preliminary research are additionally suggested.
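
As a small illustration of the transfer-function blocks the abstract relates to functional-basis "signal" flows, the snippet below builds a first-order lag and computes its step response with SciPy. The specific plant G(s) = 1/(τs + 1) is an arbitrary example chosen here, not one taken from the paper.

```python
from scipy import signal

tau = 0.5                                            # illustrative time constant
plant = signal.TransferFunction([1.0], [tau, 1.0])   # G(s) = 1 / (tau*s + 1)
t, y = signal.step(plant)                            # unit-step response of the block
print(round(float(y[-1]), 3))                        # settles near 1.0 (unit DC gain)
```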
3

Nguyen, Daniel, Jacques A. Dolan, Abraham K. Ishihara, and Shahar Ben-Menahem. "A Finite Element Method Formulation for Non-Uniform Heating of Photovoltaic Modules with Associated Validation Metrics." In Power and Energy Systems and Applications. Calgary,AB,Canada: ACTAPRESS, 2011. http://dx.doi.org/10.2316/p.2011.756-087.

4

Nguyen, Daniel, Jacques A. Dolan, Abraham K. Ishihara, and Shahar Ben-Menahem. "A Finite Element Method Formulation for Non-Uniform Heating of Photovoltaic Modules with Associated Validation Metrics." In Power and Energy Systems and Applications. Calgary,AB,Canada: ACTAPRESS, 2012. http://dx.doi.org/10.2316/p.2012.756-087.

5

"A FULLY AUTOMATIC RED-EYES DETECTION AND CORRECTION ALGORITHM BASED ON UNIFORM COLOR METRIC AND BINOCULAR GEOMETRIC CONSTRAINT." In International Conference on Bio-inspired Systems and Signal Processing. SciTePress - Science and and Technology Publications, 2008. http://dx.doi.org/10.5220/0001065302630266.

6

Singh, Gurpreet, Srinath Balaji, Jami J. Shah, David Corman, Ron Howard, Raju Mattikalli, and D. Stuart. "Evaluation of Network Measures as Complexity Metrics." In ASME 2012 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2012. http://dx.doi.org/10.1115/detc2012-70483.

Abstract:
Modern automotive and aerospace products are large cyber-physical systems consisting of software, mechanical, electrical and electronic components. The increasing complexity of such systems is a major concern, as it impacts development time and effort as well as initial and operational costs. Although much literature exists on complexity metrics, very little work has been done in determining whether metrics correlate with real-world products. Aspects of complexity include the product structure, development process and manufacturing. Since all these aspects can be uniformly represented in the form of networks, we examine common network-based complexity measures in this paper. Network metrics are grouped into three categories: size complexity, numeric complexity (degree of coupling) and technological complexity (solvability). Several empirical studies were undertaken to determine the efficacy of various metrics. One approach was to survey project engineers in an aerospace company to gauge their perception of complexity. The second was through case studies of alternative designs to perform equivalent functions. The third was to look at actual time and labor data from past projects. Data structures and fast algorithms for complexity calculations for large cyber-physical systems were also implemented.
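
Since the abstract groups network measures into size and coupling categories, a generic computation of such measures on a product network is easy to sketch with NetworkX; the particular measures chosen here are illustrative and are not claimed to be the paper's exact metrics.

```python
import networkx as nx

def basic_complexity_metrics(edges):
    """Simple size and coupling measures of a product/process network."""
    g = nx.Graph(edges)
    degrees = [d for _, d in g.degree()]
    return {
        "size": g.number_of_nodes(),                  # size complexity
        "connections": g.number_of_edges(),
        "mean_degree": sum(degrees) / len(degrees),   # degree of coupling
        "density": nx.density(g),
    }

print(basic_complexity_metrics([(1, 2), (2, 3), (3, 1), (3, 4)]))
```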
7

Yang, R. J., N. Wang, C. H. Tho, J. P. Bobineau, and B. P. Wang. "Metamodeling Development for Vehicle Frontal Impact Simulation." In ASME 2001 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2001. http://dx.doi.org/10.1115/detc2001/dac-21012.

Abstract:
Response surface methods or metamodels are commonly used to approximate large engineering systems. This paper presents a new metric for evaluating a response surface method or a metamodeling technique. Five response surface methods are studied: Stepwise Regression, Moving Least Square, Kriging, Multiquadratic, and Adaptive and Interactive Modeling System. A real world frontal impact design problem is used as an example, which is a complex, highly nonlinear, transient, dynamic, large deformation finite element model. The optimal Latin Hypercube Sampling method is used to distribute the sampling points uniformly over the entire design space. The Root Mean Square Error is used as the error indicator to study the accuracy and convergence rate of the metamodels for this vehicle impact analysis. A hybrid approach/strategy for selecting the best metamodels of impact responses is proposed.
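
Two ingredients named in the abstract — a Latin Hypercube design of experiments and the Root Mean Square Error used to compare metamodels — are sketched below with a plain (non-optimal) Latin Hypercube from SciPy; the function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.stats import qmc

def lhs_design(n_points, n_vars, seed=0):
    """Space-filling Latin Hypercube sample on the unit hypercube."""
    sampler = qmc.LatinHypercube(d=n_vars, seed=seed)
    return sampler.random(n=n_points)       # shape (n_points, n_vars) in [0, 1)

def rmse(y_true, y_pred):
    """Root Mean Square Error between observed and metamodel-predicted responses."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))
```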
8

Meunier, Jeffrey K., and Amaury Rolin. "Autonomous Train Control and Track Circuit Inspection System." In 2013 Joint Rail Conference. American Society of Mechanical Engineers, 2013. http://dx.doi.org/10.1115/jrc2013-2462.

Abstract:
The Washington D.C. Metro utilizes an Automatic Train Control (ATC) system designed in the mid-1970s. This ATC system employs passive track bed markers and antennas inductively coupled to the track to send and receive audio frequency control signals over the rails. The ATC system is only installed in revenue passenger cars and has never been installed on any other vehicle. Recently, a new inspection vehicle was delivered that includes an inspection module that interfaces with this train control system to inspect the control and track occupancy signals. This paper will discuss the challenges of designing a track circuit inspection system and retrofitting a refurbished ATC system designed for a revenue vehicle to a custom self-propelled inspection car: ensuring uniform inductive coupling over curved track, maintaining the dynamic clearance envelope of the inspection car, and designing custom broadband signal antennas to prevent interference. Additionally, this paper will discuss the technical approach for conducting the inspection and the results achieved during testing.
9

Demetriou, Dustin W., and H. Ezzat Khalifa. "Energy Modeling of Air-Cooled Data Centers: Part II—The Effect of Recirculation on the Energy Optimization of Open-Aisle, Air-Cooled Data Centers." In ASME 2011 Pacific Rim Technical Conference and Exhibition on Packaging and Integration of Electronic and Photonic Systems. ASMEDC, 2011. http://dx.doi.org/10.1115/ipack2011-52004.

Abstract:
The work presented in this paper describes a simplified thermodynamic model that can be used for exploring optimization possibilities in air-cooled data centers. The model has been used to identify optimal, energy-efficient designs, operating scenarios, and operating parameters such as flow rates and air supply temperature. The model is used to parametrically evaluate the total energy consumption of the data center cooling infrastructure, by considering changes in the server temperature rise. The results of this parametric analysis highlight the important features that need to be considered when optimizing the operation of air-cooled data centers, especially the trade-off between low air supply temperature and increased air flow rate. The analysis is used to elucidate the deleterious effect of temperature non-uniformity at the inlet of the racks on the data center cooling infrastructure power consumption. A recirculation non-uniformity metric, θ, is introduced, which is the ratio of the maximum recirculation of any server to the average recirculation of all servers. The analysis of open-aisle data centers shows that as the recirculation non-uniformity at the inlet of the racks increases, optimal operation tends toward lower recirculation and higher power consumption; stressing the importance of providing as uniform conditions to the racks as possible. Cooling infrastructure energy savings greater than 40% are possible for a data center with uniform recirculation (θ = 0) compared to a data center with a typical recirculation non-uniformity (θ = 4). It is also revealed that servers with a modest temperature rise (∼10°C) have a wider latitude for cooling optimization than those with a high temperature rise (≥20°C).
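
The recirculation non-uniformity metric described in the abstract has a direct formula, with R_i the recirculation of server i among N servers:

```latex
\theta \;=\; \frac{\max_{i} R_i}{\frac{1}{N}\sum_{j=1}^{N} R_j}
```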
10

Raza, Syed Waqar, and Ibrahim Mostafa Deiab. "On Sustainability Assesment of Machining Processes." In ASME 2013 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2013. http://dx.doi.org/10.1115/imece2013-65710.

Abstract:
There is an increased interest in sustainability assessment of manufacturing systems and processes because of the growing global interest in sustainable manufacturing practices. The current sustainability assessment models present a holistic approach, e.g. LCA, without much focus on process-specific details. This paper uses an 'XSI' approach for defining sustainability indices (e.g. Energy Sustainability Index, ESI). These sustainability metrics can quantify machining processes in terms of impact on the environment and power consumption in a flexible manner, so that various material removal processes can be rated on a uniform scale. In addition, the concept of normalization with respect to the 'feature-of-interest' is introduced, thus presenting a flexible rating system in terms of process types (turning, milling, etc.) and perspectives (material removal, quality, etc.). A user-friendly calculator is developed, which converts a set of inputs for the machining scenario into a set of measurable rating quantities and indices including, but not limited to, production rate, production cost, tool life/cost, energy consumption and environmental burden. This will enable the manufacturing engineer to make an informed decision about parameter selection and process design for sustainability. Machining of hard-to-machine materials such as titanium alloys is one such scenario, and it is used as a case study to validate the proposed approach.