A ready-made bibliography on the topic "Semantics preserving modeling technique"

Create accurate references in APA, MLA, Chicago, Harvard, and many other styles

See lists of current articles, books, dissertations, abstracts, and other scholarly sources on the topic "Semantics preserving modeling technique".

An "Add to bibliography" button is available next to every work in the bibliography. Use it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication as a .pdf file and read its abstract online, whenever these are available in the publication's metadata.

Journal articles on the topic "Semantics preserving modeling technique"

1

Tigane, Samir, Fayçal Guerrouf, Nadia Hamani, Laid Kahloul, Mohamed Khalgui, and Masood Ashraf Ali. "Dynamic Timed Automata for Reconfigurable System Modeling and Verification". Axioms 12, no. 3 (February 22, 2023): 230. http://dx.doi.org/10.3390/axioms12030230.

Full text source
Abstract:
Modern discrete-event systems (DESs) are often characterized by their dynamic structures enabling highly flexible behaviors that can respond in real time to volatile environments. On the other hand, timed automata (TA) are powerful tools used to design various DESs. However, they lack the ability to naturally describe dynamic-structure reconfigurable systems. Indeed, TA are characterized by their rigid structures, which cannot handle the complexity of dynamic structures. To overcome this limitation, we propose an extension to TA, called dynamic timed automata (DTA), enabling the modeling and verification of reconfigurable systems. Additionally, we present a new algorithm that transforms DTA into semantically equivalent TA while preserving their behavior. We demonstrate the usefulness and applicability of this new modeling and verification technique using an illustrative example.
APA, Harvard, Vancouver, ISO, and other styles
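The flattening idea in this abstract can be pictured with a minimal sketch: treat a DTA as a family of TA configurations plus reconfiguration edges, and fold them into one TA by tagging locations with their configuration. The data structures and the `flatten_dta` function below are illustrative assumptions, not the authors' algorithm.

```python
# Illustrative sketch only: one plausible reading of "DTA -> equivalent TA".
# A DTA is taken as a family of TA configurations plus reconfiguration edges;
# flattening tags each location with its configuration and turns
# reconfigurations into ordinary guarded transitions.
from dataclasses import dataclass

@dataclass
class TA:
    locations: set
    transitions: list   # (src, dst, clock_guard, clocks_to_reset)
    initial: str

def flatten_dta(configs, reconfigs):
    """configs: name -> TA; reconfigs: (cfg, loc, cfg2, loc2, guard)."""
    locations, transitions = set(), []
    for name, ta in configs.items():
        # Tag locations with the configuration name so the union is disjoint.
        locations |= {(name, l) for l in ta.locations}
        transitions += [((name, s), (name, d), g, r)
                        for (s, d, g, r) in ta.transitions]
    for (c, l, c2, l2, guard) in reconfigs:
        # A structural reconfiguration becomes a plain guarded transition.
        transitions.append(((c, l), (c2, l2), guard, set()))
    first = next(iter(configs))
    return TA(locations, transitions, (first, configs[first].initial))

base = TA({"idle", "run"}, [("idle", "run", "x <= 5", {"x"})], "idle")
degraded = TA({"safe"}, [], "safe")
flat = flatten_dta({"base": base, "degraded": degraded},
                   [("base", "run", "degraded", "safe", "x > 10")])
print(len(flat.locations), len(flat.transitions))   # 3 2
```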
2

Kim, Ji-Sun, and Cheong Youn. "EMPS: An Efficient Software Merging Technique for Preserving Semantics". KIPS Transactions: Part D 13D, no. 2 (April 1, 2006): 223–34. http://dx.doi.org/10.3745/kipstd.2006.13d.2.223.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
3

Varde, Aparna S., Mohammed Maniruzzaman, and Richard D. Sisson. "QuenchML: A semantics-preserving markup language for knowledge representation in quenching". Artificial Intelligence for Engineering Design, Analysis and Manufacturing 27, no. 1 (January 15, 2013): 65–82. http://dx.doi.org/10.1017/s0890060412000352.

Full text source
Abstract:
Knowledge representation (KR) is an important area in artificial intelligence (AI) and is often related to specific domains. The representation of knowledge in domain-specific contexts makes it desirable to capture semantics as domain experts would. This motivates the development of semantics-preserving standards for KR within the given domain. In addition to the storage and analysis of information using such standards, the effect of globalization today necessitates the publishing of information on the Web. Thus, it is advisable to use formats that make the information easily publishable and accessible while developing KR standards. In this article, we propose such a standard called Quenching Markup Language (QuenchML). This follows the syntax of the eXtensible Markup Language and captures the semantics of the quenching domain within the heat treating of materials. We describe the development of QuenchML, a multidisciplinary effort spanning the realms of AI, database management, and materials science, considering various aspects such as ontology, data modeling, and domain-specific constraints. We also explain the usefulness of QuenchML in semantics-preserving information retrieval and in text mining guided by domain knowledge. Furthermore, we outline the significance of this work in software tools within the field of AI.
APA, Harvard, Vancouver, ISO, and other styles
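Since QuenchML follows XML syntax, a toy document helps picture what "semantics-preserving markup" means in practice. The element and attribute names below are invented for illustration; the actual QuenchML schema is defined in the cited article.

```python
# Hypothetical QuenchML-like document, built with the standard library.
# Tag names, attributes, and values are illustrative assumptions only.
import xml.etree.ElementTree as ET

exp = ET.Element("QuenchingExperiment")                 # hypothetical root
quenchant = ET.SubElement(exp, "Quenchant", name="mineral oil")
ET.SubElement(quenchant, "Agitation").text = "moderate"
part = ET.SubElement(exp, "Part", material="AISI 4140 steel")
ET.SubElement(part, "InitialTemperature", units="C").text = "845"
ET.SubElement(exp, "CoolingRate", units="C/s").text = "42.5"

# The point of such markup: roles (quenchant vs. part) and units live in the
# structure itself, so retrieval and text mining can respect the semantics.
print(ET.tostring(exp, encoding="unicode"))
```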
4

Marino, B. G., A. Masiero, F. Chiabrando, A. M. Lingua, F. Fissore, W. Błaszczak-Bak, and A. Vettore. "Data Optimization for 3D Modeling and Analysis of a Fortress Architecture". ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLII-2/W11 (May 4, 2019): 809–13. http://dx.doi.org/10.5194/isprs-archives-xlii-2-w11-809-2019.

Full text source
Abstract:
Thanks to the recent worldwide spread of drones and to the development of structure-from-motion photogrammetric software, UAV photogrammetry is becoming a convenient and reliable way of producing 3D documentation of built heritage. Hence, nowadays, UAV photogrammetric surveying is a common and quite standard tool for producing 3D models of relatively large areas. However, when such areas are large, a significant part of the generated point cloud is often of minor interest. Given the necessity of efficiently storing, processing, and analyzing the produced point cloud, some optimization step should be considered in order to reduce the amount of redundancy, in particular in the parts of the model that are of minor interest. Although this can be done by means of a manual selection of such parts, an automatic selection is clearly a much more viable way to speed up the final model generation. Motivated by the recent development of many semantic classification techniques, the aim of this work is to investigate the use of point cloud optimization based on semantic recognition of the different components of the photogrammetric 3D model. The Girifalco Fortress (Cortona, Italy) is used as a case study for this investigation. The rationale of the proposed methodology is to preserve high point density in the areas of the model that describe the fortress, whereas point cloud density is dramatically reduced in vegetated and soil areas. Thanks to the implemented automatic procedure, in the considered case study, the size of the point cloud has been reduced by approximately a factor of five. It is worth noticing that this result has been obtained while preserving the original point density on the fortress surfaces, hence ensuring the same geometric analysis capabilities as the original photogrammetric model.
APA, Harvard, Vancouver, ISO, and other styles
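The optimization described above (full density on the fortress, heavy decimation of vegetation and soil) reduces to a class-aware subsampling step. A minimal numpy sketch, assuming per-point semantic labels are already available; the class names and keep rate are illustrative:

```python
# Class-aware point-cloud decimation: keep all "structure" points, retain a
# small random fraction of everything else. Labels and rates are illustrative.
import numpy as np

def decimate_by_class(points, labels, keep_classes=("fortress",),
                      keep_rate=0.2, seed=0):
    """points: (N, 3) float array; labels: (N,) array of class names."""
    rng = np.random.default_rng(seed)
    keep = np.isin(labels, keep_classes)            # structure: always kept
    keep |= ~keep & (rng.random(len(points)) < keep_rate)
    return points[keep]

rng = np.random.default_rng(1)
pts = rng.random((10_000, 3))
lbl = rng.choice(["fortress", "vegetation", "soil"], size=10_000)
print(len(decimate_by_class(pts, lbl)))   # about N_fortress + 0.2 * N_other
```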
5

Batra, Dinesh. "An Event-Oriented Data Modeling Technique Based on the Cognitive Semantics Theory". Journal of Database Management 23, no. 4 (October 2012): 52–74. http://dx.doi.org/10.4018/jdm.2012100103.

Full text source
Abstract:
The Resource-Event-Agent (REA) model has been proposed as a data modeling approach for representing accounting transactions. However, most business events are not transactions; thus, the REA formulation is incomplete. Based on the Conceptual Semantics theory, this paper discusses the entity-relationship event network (EREN) model, which extends the REA model and provides a comprehensive data template for a business event. Specifically, the notions of resource, event, and agent in the REA model are extended to include more discriminating entity types. The EREN technique can be used to identify events, sketch a network of events, and develop a data model of a business application by applying the EREN template to each event. Most extant techniques facilitate only the descriptive role, whereas the EREN technique facilitates both the design and the descriptive roles of data modeling.
APA, Harvard, Vancouver, ISO, and other styles
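To make the extension concrete: REA ties each economic event to resources and agents, and an EREN-style template adds further discriminating entity types. A toy sketch in Python dataclasses, with the extra slots chosen for illustration rather than taken from the paper's exact taxonomy:

```python
# Toy REA-style event template with extra entity slots, to make "more
# discriminating entity types" concrete; the Document and Location slots are
# illustrative choices, not the paper's exact EREN taxonomy.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Resource:
    name: str

@dataclass
class Agent:
    name: str
    role: str

@dataclass
class Event:
    name: str
    resources: List[Resource]
    agents: List[Agent]
    documents: List[str] = field(default_factory=list)  # beyond plain REA
    location: Optional[str] = None                      # beyond plain REA

sale = Event("Sale",
             resources=[Resource("Widget")],
             agents=[Agent("Alice", "customer"), Agent("Bob", "clerk")],
             documents=["invoice-042"],
             location="Store 7")
print(sale.name, [a.role for a in sale.agents])
```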
6

Beck, Edgar, Carsten Bockelmann, and Armin Dekorsy. "Semantic Information Recovery in Wireless Networks". Sensors 23, no. 14 (July 12, 2023): 6347. http://dx.doi.org/10.3390/s23146347.

Full text source
Abstract:
Motivated by the recent success of Machine Learning (ML) tools in wireless communications, the idea of semantic communication by Weaver from 1949 has gained attention. It breaks with Shannon's classic design paradigm by aiming to transmit the meaning of a message, i.e., its semantics, rather than its exact version, and thus enables savings in information rate. In this work, we extend the fundamental approach of Basu et al. for modeling semantics to the complete communications Markov chain. Thus, we model semantics by means of hidden random variables and define the semantic communication task as the data-reduced and reliable transmission of messages over a communication channel such that semantics is best preserved. We consider this task as an end-to-end Information Bottleneck problem, enabling compression while preserving relevant information. As a solution approach, we propose the ML-based semantic communication system SINFONY and use it in a distributed multipoint scenario: SINFONY communicates the meaning behind multiple messages that are observed at different senders to a single receiver for semantic recovery. We analyze SINFONY by processing images as message examples. Numerical results reveal a tremendous rate-normalized SNR shift of up to 20 dB compared to classically designed communication systems.
APA, Harvard, Vancouver, ISO, and other styles
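The "end-to-end Information Bottleneck problem" invoked above is the standard IB trade-off, stated here for reference; the mapping of variables onto the paper's setting is an interpretation, using the usual IB convention:

```latex
% Standard Information Bottleneck objective (interpretation of the setup above):
% X = transmitted message, Z = its compressed representation,
% Y = hidden semantic variable to be recovered at the receiver.
\min_{p(z \mid x)} \; I(X;Z) \;-\; \beta \, I(Z;Y)
% Minimizing I(X;Z) compresses (saves information rate); the term I(Z;Y),
% weighted by \beta, rewards preserving the semantically relevant information.
```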
7

Vdovychenko, Ruslan, and Vadim Tulchinsky. "Parallel Implementation of Sparse Distributed Memory for Semantic Storage". Cybernetics and Computer Technologies, no. 2 (September 30, 2022): 58–66. http://dx.doi.org/10.34229/2707-451x.22.2.6.

Full text source
Abstract:
Introduction. Sparse Distributed Memory (SDM) and Binary Sparse Distributed Representations (BSDR), as two phenomenological approaches to biological memory modelling, have many similarities. The idea of their integration into a hybrid semantic storage model, with SDM as a low-level cleaning memory (brain cells) for BSDR, which is used as an encoder of high-level symbolic information, is natural. A hybrid semantic store should be able to store holistic data (for example, structures of interconnected and sequential key-value pairs) in a neural network. A similar design has been proposed several times since the 1990s. However, the previously proposed models are impractical due to insufficient scalability and/or low storage density. The gap between SDM and BSDR can be bridged by the results of a third theory related to sparse signals: Compressive Sensing or Sampling (CS). In this article, we focus on a highly efficient parallel implementation of the CS-SDM hybrid memory model for graphics processing units on the NVIDIA CUDA platform, analyze the computational complexity of CS-SDM operations for the case of parallel implementation, and offer optimization techniques for conducting experiments with big sequential batches of vectors. The purpose of the paper is to propose an efficient software implementation of sparse distributed memory for preserving semantics on modern graphics processing units. Results. Parallel algorithms for CS-SDM operations are proposed, their computational complexity is estimated, and a parallel implementation of the CS-SDM hybrid semantic store is given. An optimization of vector reconstruction for experiments with sequential data batches is proposed. Conclusions. The obtained results show that the design of CS-SDM is naturally parallel and that its algorithms are by design compatible with the architecture of systems with massive parallelism. The conducted experiments showed high performance of the developed implementation of the SDM memory block. Keywords: GPU, CUDA, neural network, Sparse Distributed Memory, associative memory, Compressive Sensing.
APA, Harvard, Vancouver, ISO, and other styles
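As background for the "SDM as low-level cleaning memory" design, here is a minimal CPU-side numpy sketch of Kanerva-style Sparse Distributed Memory: write by updating counters at every hard location within a Hamming radius, read by majority vote. Sizes and the radius are illustrative; the paper parallelizes this kind of per-location operation on CUDA.

```python
# Minimal CPU sketch (numpy) of Kanerva-style Sparse Distributed Memory;
# sizes and the activation radius are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N, D, RADIUS = 10_000, 256, 112   # hard locations, word length, Hamming radius
addresses = rng.integers(0, 2, (N, D), dtype=np.int8)
counters = np.zeros((N, D), dtype=np.int32)

def activated(addr):
    # All hard locations within the Hamming radius of the query address.
    return np.count_nonzero(addresses != addr, axis=1) <= RADIUS

def write(addr, word):
    # Add +1 for a 1-bit and -1 for a 0-bit into every activated location.
    counters[activated(addr)] += np.where(word == 1, 1, -1).astype(np.int32)

def read(addr):
    # Majority vote over the activated locations' counters.
    return (counters[activated(addr)].sum(axis=0) > 0).astype(np.int8)

w = rng.integers(0, 2, D, dtype=np.int8)
write(w, w)                        # autoassociative store
assert np.array_equal(read(w), w)  # clean recall of the stored word
```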
8

Albert, Elvira, Nikolaos Bezirgiannis, Frank de Boer, and Enrique Martin-Martin. "A Formal, Resource Consumption-Preserving Translation from Actors with Cooperative Scheduling to Haskell". Fundamenta Informaticae 177, no. 3-4 (December 10, 2020): 203–34. http://dx.doi.org/10.3233/fi-2020-1988.

Full text source
Abstract:
We present a formal translation of a resource-aware extension of the Abstract Behavioral Specification (ABS) language to the functional language Haskell. ABS is an actor-based language tailored to the modeling of distributed systems. It combines asynchronous method calls with a suspend-and-resume mode of execution of the method invocations. To cater for the resulting cooperative scheduling of the method invocations of an actor, the translation exploits Haskell functions with continuations for the compilation of ABS methods. The main result of this article is a correctness proof of the translation by means of a simulation relation between a formal semantics of the source language and a high-level operational semantics of the target language, i.e., a subset of Haskell. We further prove that the resource consumption of an ABS program extended with a cost model is preserved over this translation, as we establish an equivalence between the cost of executing the ABS program and that of its corresponding Haskell translation. Concretely, the resources consumed by the original ABS program and those consumed by the Haskell program are the same under the cost model. Consequently, the resource bounds automatically inferred for ABS programs extended with a cost model, using resource analysis tools, are sound resource bounds also for the translated Haskell programs. Our experimental evaluation confirms the resource preservation over a set of benchmarks featuring different asymptotic costs.
APA, Harvard, Vancouver, ISO, and other styles
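The ABS feature being compiled (methods that suspend on a guard and resume cooperatively) can be mimicked with generators, which is roughly the role continuations play in the Haskell target. A Python sketch of the scheduling discipline only; the Actor API here is invented for illustration and is neither ABS nor the paper's translation:

```python
# Toy cooperative scheduler: generators stand in for the continuations that
# the paper's Haskell translation uses for suspended ABS methods.
from collections import deque

class Actor:
    def __init__(self):
        self.queue = deque()                      # (guard, continuation)

    def spawn(self, method):
        self.queue.append((lambda: True, method))

    def run(self):
        while self.queue:
            guard, cont = self.queue.popleft()
            if not guard():                       # await-guard still false:
                self.queue.append((guard, cont))  # stay suspended, requeue
                continue
            try:
                next_guard = next(cont)           # run until the next 'await'
                self.queue.append((next_guard, cont))
            except StopIteration:
                pass                              # method finished

log = []
def method(name):
    log.append(f"{name}: step 1")
    yield (lambda: len(log) >= 2)                 # suspend until guard holds
    log.append(f"{name}: step 2")

a = Actor()
a.spawn(method("m1"))
a.spawn(method("m2"))
a.run()
print(log)   # ['m1: step 1', 'm2: step 1', 'm1: step 2', 'm2: step 2']
```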
9

Pimentel-Niño, M. A., Paresh Saxena, and M. A. Vazquez-Castro. "Reliable Adaptive Video Streaming Driven by Perceptual Semantics for Situational Awareness". Scientific World Journal 2015 (2015): 1–16. http://dx.doi.org/10.1155/2015/394956.

Full text source
Abstract:
A novel cross-layer-optimized video adaptation driven by perceptual semantics is presented. The design target is live streamed video to enhance situational awareness in challenging communications conditions. Conventional solutions for recreational applications are inadequate, so a novel quality of experience (QoE) framework is proposed which allows fully controlled adaptation and enables perceptual semantic feedback. The framework relies on temporal/spatial abstraction for video applications serving beyond recreational purposes. An underlying cross-layer optimization technique takes into account feedback on network congestion (time) and erasures (space) to best distribute the available (scarce) bandwidth. Systematic random linear network coding (SRNC) adds reliability while preserving perceptual semantics. Objective metrics of the perceptual features in QoE show homogeneous high performance when using the proposed scheme. Finally, the proposed scheme is in line with content-aware trends, complying with the information-centric networking philosophy and architecture.
APA, Harvard, Vancouver, ISO, and other styles
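Systematic RLNC, used above for reliability, sends the k source packets unmodified followed by random linear combinations; over GF(2) a combination is just an XOR. A small numpy sketch with coefficients fixed for reproducibility (real encoders draw them at random):

```python
# Systematic random linear network coding over GF(2): source packets go out
# as-is, plus XOR-combination "repair" packets. Coefficients fixed for demo.
import numpy as np

k, plen = 4, 8
rng = np.random.default_rng(0)
source = rng.integers(0, 256, (k, plen), dtype=np.uint8)
coeffs = np.array([[1, 1, 1, 1],
                   [1, 0, 1, 0]], dtype=np.uint8)   # normally random bits

coded = np.zeros((len(coeffs), plen), dtype=np.uint8)
for i, row in enumerate(coeffs):
    for j in range(k):
        if row[j]:
            coded[i] ^= source[j]                   # GF(2) addition is XOR

# Receiver lost source packet 2 but has the rest plus coded packet 0:
lost = 2
recovered = coded[0].copy()
for j in range(k):
    if coeffs[0, j] and j != lost:
        recovered ^= source[j]
assert np.array_equal(recovered, source[lost])      # erasure repaired
```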
10

Motzek, Alexander, and Ralf Möller. "Indirect Causes in Dynamic Bayesian Networks Revisited". Journal of Artificial Intelligence Research 59 (May 27, 2017): 1–58. http://dx.doi.org/10.1613/jair.5361.

Full text source
Abstract:
Modeling causal dependencies often demands cycles at a coarse-grained temporal scale. If Bayesian networks are to be used for modeling uncertainties, cycles are eliminated with dynamic Bayesian networks, spreading indirect dependencies over time and enforcing an infinitesimal resolution of time. Without a "causal design," i.e., without anticipating indirect influences appropriately in time, we argue that such networks return spurious results. By identifying activator random variables, we propose activator dynamic Bayesian networks (ADBNs), which are able to rapidly adapt to contexts under a causal use of time, anticipating indirect influences on a solid mathematical basis using familiar Bayesian network semantics. ADBNs are well-defined dynamic probabilistic graphical models allowing one to model cyclic dependencies from local and causal perspectives while preserving a classical, familiar calculus and classically known algorithms, without introducing any overhead in modeling or inference.
APA, Harvard, Vancouver, ISO, and other styles

Dissertations and theses on the topic "Semantics preserving modeling technique"

1

Cortés, Luis Alejandro. "A Petri Net based Modeling and Verification Technique for Real-Time Embedded Systems". Licentiate thesis, Linköping University, Linköping University, ESLAB - Embedded Systems Laboratory, 2001. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-5751.

Full text source
Abstract:

Embedded systems are used in a wide spectrum of applications ranging from home appliances and mobile devices to medical equipment and vehicle controllers. They are typically characterized by their real-time behavior, and many of them must fulfill strict requirements on reliability and correctness.

In this thesis, we concentrate on aspects related to modeling and formal verification of real-time embedded systems.

First, we define a formal model of computation for real-time embedded systems based on Petri nets. Our model can capture important features of such systems and allows their representation at different levels of granularity. Our modeling formalism has a well-defined semantics so that it supports a precise representation of the system, the use of formal methods to verify its correctness, and the automation of different tasks along the design process.

Second, we propose an approach to the problem of formal verification of real-time embedded systems represented in our modeling formalism. We make use of model checking to prove whether certain properties, expressed as temporal logic formulas, hold with respect to the system model. We introduce a systematic procedure to translate our model into timed automata so that it is possible to use available model checking tools. Various examples, including a realistic industrial case, demonstrate the feasibility of our approach on practical applications.

APA, Harvard, Vancouver, ISO, and other styles
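For readers new to the formalism this thesis builds on, the untimed core of a Petri net is just a token game; the thesis's model adds time and data on top of a rule like the one sketched below.

```python
# Minimal Petri-net token game: a transition is enabled when every input
# place holds a token; firing moves tokens from inputs to outputs.
class PetriNet:
    def __init__(self, places, transitions):
        self.marking = dict(places)          # place -> token count
        self.transitions = transitions       # name -> (inputs, outputs)

    def enabled(self, t):
        ins, _ = self.transitions[t]
        return all(self.marking[p] >= 1 for p in ins)

    def fire(self, t):
        assert self.enabled(t), f"{t} not enabled"
        ins, outs = self.transitions[t]
        for p in ins:
            self.marking[p] -= 1
        for p in outs:
            self.marking[p] += 1

net = PetriNet({"idle": 1, "busy": 0},
               {"start": (["idle"], ["busy"]), "stop": (["busy"], ["idle"])})
net.fire("start")
print(net.marking)   # {'idle': 0, 'busy': 1}
```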
2

Athaiya, Snigdha. "Extending Program Analysis Techniques to Web Applications and Distributed Systems". Thesis, 2020. https://etd.iisc.ac.in/handle/2005/5523.

Full text source
Abstract:
Web-based applications and distributed systems are ubiquitous and indispensable today. These systems use multiple parallel machines for greater functionality and for efficient and reliable computation. At the same time, they present innumerable challenges, especially in the field of program analysis. In this thesis, we address two problems in the domain of web-based applications and distributed systems relating to program analysis, and design effective solutions for those problems. The first challenge that the thesis addresses is the difficulty of analyzing a web application in an end-to-end manner using a single tool. Such an analysis is hard due to client-server interaction, user interaction, and the use of multiple types of languages and frameworks in a web application. We propose a semantics-preserving modeling technique that converts a web application into a single-language program. The model of a web application in the thesis is a Java program, as we present our modeling technique in the context of Java-based web applications. As a result of the translation, off-the-shelf tools available for Java can now be used to analyze the application. We have built a tool for the translation of applications. We evaluate our translation tool by converting five real-world web applications into corresponding models, and then analyzing the models using three popular third-party program analysis tools: Wala (static slicing), Java PathFinder (explicit-state and symbolic model checking), and Zoltar (dynamic fault localization). In all the analysis tools, we get precise results for most cases. The second challenge that the thesis addresses is the precise data flow analysis of message-passing asynchronous systems. Message-passing systems are distributed systems where multiple processes execute concurrently and communicate with each other by passing messages to the channels associated with each process. These systems encompass the majority of real-world distributed systems, e.g., web applications, event-driven programs, reactive systems, etc. Therefore, there is a clear need for robust program analysis techniques for these systems. One such technique is data flow analysis, which statically analyzes a program and approximates the values of variables in the program due to all runs of the program, using lattices. Any precise data flow analysis needs to account for the blocking of execution in message-passing systems when the required message is not present in the channel. Current data flow analysis techniques for message-passing systems either over-approximate the behavior by allowing non-blocking receive operations, or they are not applicable to general data flow lattices. The thesis proposes algorithms for performing precise data flow analysis of message-passing asynchronous programs using infinite, but finite-height, lattices. The problem was not known to be decidable before. The algorithm builds on concepts from the theory of modeling parallel systems in a novel and involved manner. We have also built a tool for the algorithm and have studied its precision and performance by analyzing 10 well-known asynchronous systems and protocols, with encouraging results.
APA, Harvard, Vancouver, ISO, and other styles
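The "data flow analysis ... using lattices" described above rests on the classic worklist iteration, which the thesis extends to handle blocking message receives. A generic sketch over a small powerset lattice; the CFG and transfer functions are illustrative:

```python
# Generic worklist data-flow analysis over a join semilattice.
def worklist_analysis(cfg, entry, transfer, join, bottom):
    """cfg: node -> list of successors; returns node -> fact at node exit."""
    facts = {n: bottom for n in cfg}
    work = [entry]
    while work:
        n = work.pop()
        preds_in = [facts[p] for p, succs in cfg.items() if n in succs]
        new = transfer(n, join(preds_in) if preds_in else bottom)
        if new != facts[n]:
            facts[n] = new
            work.extend(cfg[n])      # successors must be revisited
    return facts

# Example: which constant values variable x may hold (a powerset lattice).
cfg = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
assign = {"a": {1}, "b": {2}}        # nodes that overwrite x
transfer = lambda n, inset: assign.get(n, inset)
join = lambda sets: set().union(*sets)
print(worklist_analysis(cfg, "a", transfer, join, set()))
# {'a': {1}, 'b': {2}, 'c': {1}, 'd': {1, 2}} -- at d, x may be 1 or 2
```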
3

Hausmann, Jan Hendrik. "Dynamic meta modeling: a semantics description technique for visual modeling languages". 2005. http://d-nb.info/978511158/34.

Full text source
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Semantics preserving modeling technique"

1

Oluwagbemi, Oluwatolani, and Hishammuddin Asmuni. "An Improved Model-Based Technique for Generating Test Scenarios from UML Class Diagrams". In Advances in Systems Analysis, Software Engineering, and High Performance Computing, 434–48. IGI Global, 2014. http://dx.doi.org/10.4018/978-1-4666-6026-7.ch019.

Full text source
Abstract:
The foundation of any software testing process is test scenario generation. This is because it forecasts the expected output of a system under development by extracting the artifacts expressed in any of the Unified Modeling Language (UML) diagrams, which are eventually used as the basis for software testing. Class diagrams are UML structural diagrams that describe a system by displaying its classes, attributes, and the relationships between them. Existing class-diagram-based test scenario generation techniques only extract data variables and functions, which leads to incomprehensible or vague test scenarios. Consequently, this chapter aims to develop an improved technique that automatically generates test scenarios by reading, extracting, and interpreting the sets of objects that share attributes, operations, relationships, and semantics in a class diagram. The performance evaluation shows that the proposed model-based technique is able to efficiently read, interpret, and generate scenarios from all the descriptive links of a class diagram.
APA, Harvard, Vancouver, ISO, and other styles
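One way to picture "reading, extracting, and interpreting" a class diagram is to walk its associations and emit scenario text per operation. A deliberately tiny sketch with a made-up diagram encoding and scenario wording; the chapter's actual technique covers attributes, semantics, and all descriptive links:

```python
# Toy scenario generation from a class-diagram-like structure.
# The encoding and the scenario phrasing are illustrative assumptions.
classes = {
    "Customer": {"attrs": ["name"], "ops": ["placeOrder()"]},
    "Order":    {"attrs": ["total"], "ops": ["addItem()"]},
}
associations = [("Customer", "places", "Order")]

def scenarios(classes, associations):
    out = []
    for src, label, dst in associations:
        for op in classes[src]["ops"]:
            out.append(f"Given a {src}, when {op} is invoked, "
                       f"then a linked {dst} ({label}) should be updated.")
    return out

for s in scenarios(classes, associations):
    print(s)
```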
2

Shinde, Shweta Annasaheb, and Prabu Sevugan. "Glorified Secure Search Schema Over Encrypted Secure Cloud Storage With a Hierarchical Clustering Computation". In Big Data Analytics for Satellite Image Processing and Remote Sensing, 72–98. IGI Global, 2018. http://dx.doi.org/10.4018/978-1-5225-3643-7.ch005.

Full text source
Abstract:
This chapter improves the searchable encryption (SE) scheme to address these challenging problems. In the proposed development, a hierarchical clustering technique is designed to support richer search semantics and to meet the demand for fast ciphertext search in large-scale environments with huge amounts of data. A minimum relevance threshold is used to cluster the cloud documents hierarchically, dividing clusters into sub-clusters until the last cluster is reached. This method can keep the computational complexity linear in the face of exponential growth of the document collection. To verify the validity of search results, a minimum hash sub-tree is also implemented. The chapter focuses on retrieving outsourced encrypted cloud data without loss of meaning, security, or privacy, by transmitting an attribute key for the information. At the next level, the model is improved with a multilevel trust privacy-preserving scheme.
APA, Harvard, Vancouver, ISO, and other styles
3

Shinde, Shweta Annasaheb, and Prabu Sevugan. "Glorified Secure Search Schema Over Encrypted Secure Cloud Storage With a Hierarchical Clustering Computation". In Cloud Security, 657–77. IGI Global, 2019. http://dx.doi.org/10.4018/978-1-5225-8176-5.ch033.

Full text source
Abstract:
This chapter improves the searchable encryption (SE) scheme to address these challenging problems. In the proposed development, a hierarchical clustering technique is designed to support richer search semantics and to meet the demand for fast ciphertext search in large-scale environments with huge amounts of data. A minimum relevance threshold is used to cluster the cloud documents hierarchically, dividing clusters into sub-clusters until the last cluster is reached. This method can keep the computational complexity linear in the face of exponential growth of the document collection. To verify the validity of search results, a minimum hash sub-tree is also implemented. The chapter focuses on retrieving outsourced encrypted cloud data without loss of meaning, security, or privacy, by transmitting an attribute key for the information. At the next level, the model is improved with a multilevel trust privacy-preserving scheme.
APA, Harvard, Vancouver, ISO, and other styles
4

Abirami A. M., Askarunisa A., Shiva Shankari R. A., and Revathy R. "Ontology Based Feature Extraction From Text Documents". In Applications of Security, Mobile, Analytic, and Cloud (SMAC) Technologies for Effective Information Processing and Management, 174–95. IGI Global, 2018. http://dx.doi.org/10.4018/978-1-5225-4044-1.ch009.

Full text source
Abstract:
This article describes how semantic annotation is the most important need for the categorization of labeled or unlabeled textual documents. The accuracy of document categorization can be greatly improved if documents are indexed or modeled using semantics rather than the traditional term-frequency model. This annotation has its own challenges, like synonymy and polysemy, in the document categorization problem. The model proposes to build a domain ontology for the textual content so that problems like synonymy and polysemy in text analysis are resolved to a greater extent. Latent Dirichlet Allocation (LDA), a topic modeling technique, has been used for feature extraction from the documents. Using the domain knowledge on the concept and the features grouped by LDA, the domain ontology is built in a hierarchical fashion. Empirical results show that LDA is a better feature extraction technique for text documents than TF or TF-IDF indexing. Also, the proposed model shows improvement in the accuracy of document categorization when the domain ontology built using LDA is used for document indexing.
APA, Harvard, Vancouver, ISO, and other styles
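The representational shift the chapter argues for, documents as topic mixtures instead of raw term counts, can be pictured with scikit-learn; the corpus and topic count below are toy values:

```python
# LDA-based feature extraction with scikit-learn: documents become
# doc-topic proportion vectors rather than TF/TF-IDF vectors.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "steel quenching cooling rate oil",
    "markup language semantics ontology",
    "ontology knowledge representation domain",
]
counts = CountVectorizer().fit_transform(docs)        # term-frequency matrix
lda = LatentDirichletAllocation(n_components=2, random_state=0)
features = lda.fit_transform(counts)                  # doc-topic proportions

# Each row is now a topic-mixture feature vector for one document.
print(features.round(2))
```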

Conference papers on the topic "Semantics preserving modeling technique"

1

Almeida, Joao Paulo A., Fernando A. Musso, Victorio A. Carvalho, Claudenir M. Fonseca, and Giancarlo Guizzardi. "Preserving Multi-level Semantics in Conventional Two-Level Modeling Techniques". In 2019 ACM/IEEE 22nd International Conference on Model Driven Engineering Languages and Systems Companion (MODELS-C). IEEE, 2019. http://dx.doi.org/10.1109/models-c.2019.00025.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
2

Zhao, Fangyuan, Xuebin Ren, Shusen Yang, and Xinyu Yang. "On Privacy Protection of Latent Dirichlet Allocation Model Training". In Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI-19). California: International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/675.

Full text source
Abstract:
Latent Dirichlet Allocation (LDA) is a popular topic modeling technique for discovering the hidden semantic structure of text datasets, and it plays a fundamental role in many machine learning applications. However, like many other machine learning algorithms, the process of training an LDA model may leak sensitive information about the training datasets and bring significant privacy risks. To mitigate the privacy issues in LDA, we focus on studying privacy-preserving algorithms for LDA model training in this paper. In particular, we first develop a privacy monitoring algorithm to investigate the privacy guarantee obtained from the inherent randomness of the Collapsed Gibbs Sampling (CGS) process in a typical LDA training algorithm on centralized curated datasets. Then, we further propose a locally private LDA training algorithm on crowdsourced data to provide local differential privacy for individual data contributors. The experimental results on real-world datasets demonstrate the effectiveness of our proposed algorithms.
APA, Harvard, Vancouver, ISO, and other styles
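For the "locally private LDA training on crowdsourced data" direction, the essential move is that each contributor perturbs their own data before it leaves the device. A sketch of randomized response on word-presence bits, as a stand-in for (not a reproduction of) the paper's mechanism; epsilon and the vocabulary are illustrative:

```python
# Randomized response: the classic local differential privacy primitive.
import math, random

def randomized_response(bit: int, epsilon: float) -> int:
    """Report the true bit with prob e^eps / (e^eps + 1), else flip it."""
    p_true = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return bit if random.random() < p_true else 1 - bit

vocab = ["cloud", "privacy", "topic", "model"]
true_doc = {"privacy", "model"}               # words this user contributes
noisy = {w for w in vocab
         if randomized_response(int(w in true_doc), epsilon=2.0)}
print(noisy)   # a plausibly deniable version of the user's word set
```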
3

Barakat, M., W. Zahra, and A. Elsaid. "A Modified Polynomial Preserving Recovery Technique". In 12th International Conference on Simulation and Modeling Methodologies, Technologies and Applications. SCITEPRESS - Science and Technology Publications, 2022. http://dx.doi.org/10.5220/0011263400003274.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
4

Rafiq, Danish, and Mohammad Abid Bazaz. "Structure Preserving Nonlinear Reduced Order Modeling Technique for Power Systems". In 2021 Seventh Indian Control Conference (ICC). IEEE, 2021. http://dx.doi.org/10.1109/icc54714.2021.9703187.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
5

Liu, Weiming, Chaochao Chen, Xinting Liao, Mengling Hu, Jianwei Yin, Yanchao Tan, and Longfei Zheng. "Federated Probabilistic Preference Distribution Modelling with Compactness Co-Clustering for Privacy-Preserving Multi-Domain Recommendation". In Thirty-Second International Joint Conference on Artificial Intelligence (IJCAI-23). California: International Joint Conferences on Artificial Intelligence Organization, 2023. http://dx.doi.org/10.24963/ijcai.2023/245.

Full text source
Abstract:
With the development of modern internet techniques, Cross-Domain Recommendation (CDR) systems have been widely exploited for tackling the data-sparsity problem. Meanwhile, most current CDR models assume that user-item interactions are accessible across different domains. However, such a knowledge-sharing process would break the privacy protection policy. In this paper, we focus on the Privacy-Preserving Multi-Domain Recommendation problem (PPMDR). The problem is challenging, since the different domains are sparse and heterogeneous under privacy protection. To tackle the above issues, we propose Federated Probabilistic Preference Distribution Modelling (FPPDM). FPPDM includes two main components, i.e., a local domain modelling component and a global server aggregation component with a federated learning strategy. The local domain modelling component exploits user/item preference distributions using the rating information in the corresponding domain. The global server aggregation component combines user characteristics across domains. To better extract semantic neighbor information among the users, we further provide a compactness co-clustering strategy in FPPDM++ to cluster users with similar characteristics. Our empirical studies on benchmark datasets demonstrate that FPPDM/FPPDM++ significantly outperforms state-of-the-art models.
APA, Harvard, Vancouver, ISO, and other styles
6

Massoni, Tiago, Rohit Gheyi, and Paulo Borba. "Formal Refactoring for UML Class Diagrams". In Simpósio Brasileiro de Engenharia de Software. Sociedade Brasileira de Computação, 2005. http://dx.doi.org/10.5753/sbes.2005.23817.

Full text source
Abstract:
Refactoring UML models for evolution is usually carried out in an ad hoc way. These transformations can become an issue, since it is hard to ensure that the semantics of models is preserved. We provide a set of semantics-preserving transformations for UML class diagrams annotated with OCL. Using the proposed transformations, software designers can safely define larger transformations and detect subtle problems when refactoring models. Semantics-preserving transformations can also be useful from design pattern introduction to MDA. We prove that our transformations are sound using a semantic model based on Alloy, a formal modeling language. Due to Alloy's amenability to automatic analysis, our approach may additionally bring such analysis to class diagrams.
APA, Harvard, Vancouver, ISO, and other styles
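The flavor of a semantics-preserving model transformation shows up even in a rename refactoring: a precondition guards against meaning change, and the transformation rewrites all references consistently. A toy sketch over an invented model encoding; the paper itself works with OCL-annotated UML and proves soundness in Alloy:

```python
# Toy semantics-preserving rename over a minimal class-model encoding.
def rename_class(model: dict, old: str, new: str) -> dict:
    classes, refs = model["classes"], model["refs"]
    # Precondition: the new name must be fresh, or two classes would merge
    # and the transformation would change the model's meaning.
    assert new not in classes, "rename would capture an existing class"
    classes = {new if c == old else c for c in classes}
    refs = [(new if a == old else a, new if b == old else b)
            for a, b in refs]
    return {"classes": classes, "refs": refs}

m = {"classes": {"Acct", "User"}, "refs": [("User", "Acct")]}
print(rename_class(m, "Acct", "Account"))
# {'classes': {'Account', 'User'}, 'refs': [('User', 'Account')]}
```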
7

Arisoy, Erhan Batuhan, and Levent Burak Kara. "Topology Preserving Digitization of Physical Prototypes Using Deformable Subdivision Models". In ASME 2014 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2014. http://dx.doi.org/10.1115/detc2014-34390.

Full text source
Abstract:
Physical prototyping is an important stage of product design where designers have a chance to physically evaluate and alter digitally created surfaces. In these scenarios, designers generate a digital model, manufacture and alter the prototype as needed, and redigitize the prototype through scanning. Despite the variety of reverse engineering tools, redigitizing the prototypes into forms amenable to further digital editing remains a challenge. This is because current digitization methods cannot take advantage of the key elements of the original digital model, such as the wireframe topology and surface flows. This paper presents a new reverse engineering method that augments conventional digitization with knowledge of the original digital model's curve topology to enhance iterative shape design activities. Our algorithm takes as input a curve network topology forming a subdivision control cage and a 3D scan of the physically modified prototype. To facilitate the digital capture of the physical modifications, our algorithm performs a series of registration, correspondence, and deformation calculations to compute the new configuration of the initial control cage. The key advantage of the proposed technique is the preservation of the edge flows and initial topology while transferring surface modifications from prototypes. Our studies show that the proposed technique can be particularly useful for bridging the gap between physical and digital modeling in the early stages of product design.
APA, Harvard, Vancouver, ISO, and other styles
8

Kaneshiro, Percy Javier Igei, José Isidro Garcia Melo, Paulo E. Miyagi, and Carlos E. Cugnasca. "Modeling of Collision Resolution Algorithm in LonWorks Networks". In ASME 2007 International Mechanical Engineering Congress and Exposition. ASMEDC, 2007. http://dx.doi.org/10.1115/imece2007-43950.

Full text source
Abstract:
The industrial organization of productive processes has shown a tendency toward the dispersion and distribution of manufacturing plants, driven by the increased resources and functionality of information and mobility technology. In this context, Local Operating Networks (LON, LonWorks) is one of the leading technologies in sensor/control networking, addressed to a wide range of applications with topology-free technology. In spite of its popularity, only a few design tools are available to simulate LonWorks architectures and predict their performance. In this paper, a model of the collision resolution algorithm of LonWorks is described. The approach developed in this work is based on the characterization of LonWorks networks as Discrete Event Dynamic Systems (DEDS), since their dynamic behavior is defined through discrete events and discrete states. The proposed procedure employs techniques derived from interpreted Petri nets, which have been used as an efficient tool for modeling, analysis, and control of DEDS. In this context, the media access control (MAC) sublayer is modeled at different levels of abstraction: a conceptual model, obtained using the PFS (Production Flow Schema) technique, and a functional model, using MFG (Mark Flow Graph). The MFG abstraction level describes details in a functional form while preserving the description of activities at upper levels. This procedure allows the structured development of models, facilitating the modeling of the algorithm specification. The result presented in this paper for a single network segment can be integrated into global network modeling, since the proposed procedure decomposes the model systematically, according to a hierarchical approach, into different modules. In this way, it is straightforward to integrate single-segment network models into complex networks.
APA, Harvard, Vancouver, ISO, and other styles
9

Fujimoto, Keiichiro, and Kozo Fujii. "Study on the Automated CFD Analysis Tools for Conceptual Design of Space Transportation Vehicles". In ASME/JSME 2007 5th Joint Fluids Engineering Conference. ASMEDC, 2007. http://dx.doi.org/10.1115/fedsm2007-37128.

Full text source
Abstract:
In order to improve the turnaround time and usability of aerodynamic analysis tools, such as those used in the conceptual design of space launch vehicles, an automated high-fidelity numerical aerodynamic analysis method is developed. This method is capable of fully automated, quick execution from geometry modeling through flow computation for high-Reynolds-number viscous flow over complicated geometry. The present method is based on the locally-body-fitted Cartesian grid method, which is applicable to viscous flow computation over complicated geometry and is easy to use, requiring less expertise. Since this grid generation method has not been established, owing to the lack of a robust feature-preserving technique, such a technique is developed in this study. In addition, the present method is validated and its prediction capability is confirmed through application to typical test problems, including transonic airfoil flows, supersonic hemisphere flows, and subsonic/supersonic separated flow over the Apollo capsule. Finally, the proposed aerodynamic analysis method is applied to an aerodynamic analysis of the effect of aerodynamic fins on a single-stage-to-orbit (SSTO) rocket vehicle.
APA, Harvard, Vancouver, ISO, and other styles
10

Tourlomousis, Filippos, and Robert C. Chang. "2D and 3D Multiscale Computational Modeling of Dynamic Microorgan Devices as Drug Screening Platforms". In ASME 2015 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2015. http://dx.doi.org/10.1115/imece2015-52734.

Full text source
Abstract:
The ability to incorporate three-dimensional (3D) hepatocyte-laden hydrogel constructs, using layered fabrication approaches, into devices that can be perfused with drugs enables the creation of dynamic microorgan devices (DMDs) that offer an optimal analog of the in vivo liver metabolism scenario. The dynamic nature of such in vitro metabolism models demands reliable numerical tools to determine the optimum process, material, and geometric parameters for the most effective metabolic conversion of the perfused drug in the liver microenvironment. However, the current literature lacks computational approaches to guide the optimum design of such devices. The groundwork of the present numerical study was laid by our previous study [1], where the authors modeled in 2D an in vitro DMD of arbitrary dimensions and identified the modeling challenges toward meaningful results. These constructs are hosted in the chamber of the microfluidic device, serving as walls of the microfluidic array of channels through which a fluorescent drug substrate is perfused at a specified volumetric flow rate assuring Stokes flow conditions (Re << 1). Due to the porous nature of the hydrogel walls, a metabolized drug product is collected at the outlet port. A rigorous FEM-based modeling approach is presented for a single-channel parallel model geometry (one free-flow channel with two porous walls), where the hydrodynamics, mass transfer, and pharmacokinetics equations are solved numerically to yield the drug metabolite concentration profile at the DMD outlet. The fluid-induced shear stresses are also assessed in 3D, with only 27 cells modeled as single-compartment voids where all of the enzymatic reactions are assumed to take place. In this way, the mechanotransduction effect that alters the hepatocyte metabolic activity is assessed for a small-scale model. This approach overcomes the numerical limitations imposed by the cell density (∼10^12 cells/m^3) of the large-scale DMD device. In addition, a compartmentalization technique is proposed in order to assess the metabolism process at the subcellular level. The numerical results are validated with experiments to reveal the robustness of the proposed modeling approach and the necessity of scaling the numerical results by preserving dynamic and biochemical similarity between the small- and large-scale models.
APA, Harvard, Vancouver, ISO, and other styles