Academic literature on the topic 'Schema enrichment'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Schema enrichment.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Schema enrichment"

1

Malki, Doha, and Mohamed Bahaj. "Semantic Enrichment of XML Schema to Transform Association Relationships in ODL Schema." Journal of Software Engineering and Applications 8, no. 2 (2015): 62–71. http://dx.doi.org/10.4236/jsea.2015.82008.

2

Chen, Shu-Jiun. "Semantic Enrichment of Linked Personal Authority Data: A Case Study of Elites in Late Imperial China." KNOWLEDGE ORGANIZATION 46, no. 8 (2019): 607–14. http://dx.doi.org/10.5771/0943-7444-2019-8-607.

Abstract:
The study uses the Database of Names and Biographies (DNB) as an example to explore how, in the transformation of original data into linked data, semantic enrichment can enhance engagement in digital humanities. In the preliminary results, we have defined instance-based and schema-based categories of semantic enrichment. In the instance-based category, in which enrichment occurs by enhancing the content of entities, we further determined three types, including: 1) enriching the entities by linking to diverse external resources in order to provide additional data of multiple perspectives; 2) enriching the entities with missing data, which is needed to satisfy the semantic queries; and, 3) providing the entities with access to an extended knowledge base. In the schema-based category, in which enrichment occurs by enhancing the relations between the properties, we have identified two types, including: 1) enriching the properties by defining the hierarchical relations between properties; and, 2) specifying properties’ domain and range for data reasoning. In addition, the study implements the LOD dataset in a digital humanities platform to demonstrate how instances and entities can be applied in the full texts, where the relationships between entities are highlighted in order to bring scholars more semantic details of the texts.
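To make the two schema-based types concrete, here is a minimal sketch in Python with rdflib; this is not code from the DNB project, and the ex:courtesyName, ex:alternativeName, and ex:Person terms are invented for illustration.

```python
# A minimal sketch (not code from the DNB project) of the two schema-based
# enrichment types described above. The ex:courtesyName / ex:alternativeName
# properties and the ex:Person class are invented for illustration.
from rdflib import Graph, Namespace, RDF, RDFS

EX = Namespace("http://example.org/dnb/")
g = Graph()
g.bind("ex", EX)

# 1) Hierarchical relations between properties:
#    a courtesy name is declared to be a special kind of alternative name,
#    so queries over ex:alternativeName also reach ex:courtesyName data.
g.add((EX.courtesyName, RDF.type, RDF.Property))
g.add((EX.courtesyName, RDFS.subPropertyOf, EX.alternativeName))

# 2) Domain and range, so a reasoner can infer the type of linked entities.
g.add((EX.courtesyName, RDFS.domain, EX.Person))
g.add((EX.courtesyName, RDFS.range, RDFS.Literal))

print(g.serialize(format="turtle"))
```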
3

Dunphy, Kim, John Smithies, Surajen Uppal, Holly Schauble, and Amy Stevenson. "Positing a schema of measurable outcomes of cultural engagement." Evaluation 26, no. 4 (September 28, 2020): 474–98. http://dx.doi.org/10.1177/1356389020952460.

Abstract:
The cultural sector (arts, heritage and library agencies) receives increasing demands to articulate and evaluate outcomes of its work. These demands are challenging because of pervasive conceptions that the intangible nature of cultural activities makes them inherently unmeasurable, while their ‘intrinsic’ properties render them essentially valuable. This article addresses this challenge in positing a schema of measurable cultural outcomes of cultural engagement developed through literature analysis and an iterative Delphi-style stakeholder consultation. The outcomes are as follows: creativity stimulated; aesthetic enrichment experienced; knowledge, ideas or insight gained; diversity of cultural expression appreciated; and sense of belonging to cultural heritage deepened.
4

Bloch, Tanya. "Connecting research on semantic enrichment of BIM - review of approaches, methods and possible applications." Journal of Information Technology in Construction 27 (April 20, 2022): 416–40. http://dx.doi.org/10.36680/j.itcon.2022.020.

Abstract:
Semantic enrichment of BIM models is a process designed to add meaningful semantics to the information represented in a building model. Although semantic enrichment provides a valuable opportunity for BIM technology to reach its full potential, it is considered an emergent field of research. As such, the body of knowledge on the subject is incomplete and lacks a formal definition of the process, possible applications, contributions, and computational approaches. In this work, an extensive literature review is performed to begin forming the body of knowledge in this field. A bibliometric analysis of relevant publications is implemented to identify previously explored approaches and methods for enrichment. Papers describing previous work in the field demonstrate the application of semantic enrichment to building information stored in accordance with the Industry Foundation Classes (IFC) schema as well as based on a web ontology. A detailed content analysis illustrates the benefits of semantic enrichment for various tasks in the BIM domain, including improvement of data exchange routines, design analysis, and processing of data obtained by remote sensing techniques. A formal definition for "semantic enrichment of BIM" is suggested based on the common features identified during the literature review. This work discusses the significance of semantic enrichment to a BIM workflow, pinpoints its current research gaps and describes directions for future research.
5

Chen, Shu-Jiun. "Semantic Enrichment of Linked Archival Materials." KNOWLEDGE ORGANIZATION 46, no. 7 (2019): 530–47. http://dx.doi.org/10.5771/0943-7444-2019-7-530.

Abstract:
By using the metadata for the fonds of “Chen Cheng-po’s Paintings and Documents” (CCP) in the database of the Archives of the Institute of Taiwan History (IHT, Academia Sinica, Taiwan), we develop and enhance a semantic data model for converting the data into a linked data project, focusing on data modeling, data reconciliation, and data enrichment. The research questions are: 1) How can we keep the original rich and contextual information of the archival materials during a LOD task?; 2) How can we integrate heterogeneous datasets about the same real-world resources from libraries, archives, and museums, while keeping the different views distinct?; and, (3) How can we provide added value for semantic metadata of archives in terms of instance-based and schema-based types of enrichment? The project adopts the Europeana Data Model (EDM) as the main model and extends the properties to fit the contextual characteristics of archival materials. Various methods are explored to preserve the hierarchical structure and context of the archival materials, to enrich semantic data, and to connect data from different sources and institutions. We propose four approaches to enriching data semantics by: 1) directly using external vocabularies; 2) reconciling local links to other linked data sources; 3) introducing contextual classes for the appropriate contextual entities; and, 4) utilizing named entity extraction. The results can contribute to the best practice for developing linked data for art-related archival materials.
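As an illustration of the first two of these approaches (directly using external vocabularies and reconciling local links), a minimal rdflib sketch follows; every URI and identifier in it is a placeholder, not a value from the CCP dataset.

```python
# A minimal sketch (not the CCP project's code) of the first two enrichment
# approaches listed above: re-using an external vocabulary directly and
# reconciling a local entity to an external linked-data source. All URIs and
# identifiers below are placeholders, not values from the actual dataset.
from rdflib import Graph, Namespace, URIRef
from rdflib.namespace import RDF, SKOS

ARCHIVE = Namespace("http://example.org/ccp/")
AAT = Namespace("http://vocab.getty.edu/aat/")

g = Graph()

# 1) Directly use an external vocabulary: type an item with a Getty AAT concept.
painting = ARCHIVE["item/0001"]
g.add((painting, RDF.type, AAT["300000000"]))  # placeholder AAT concept ID

# 2) Reconcile a local person entity to an external authority record.
local_person = ARCHIVE["agent/chen-cheng-po"]
external_person = URIRef("http://viaf.org/viaf/00000000")  # placeholder VIAF ID
g.add((local_person, SKOS.exactMatch, external_person))

print(g.serialize(format="turtle"))
```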
6

Zou, Xiaozhu, Siyi Xiong, Zhi Li, and Ping Jiang. "Constructing Metadata Schema of Scientific and Technical Report Based on FRBR." Computer and Information Science 11, no. 2 (March 26, 2018): 34. http://dx.doi.org/10.5539/cis.v11n2p34.

Abstract:
The scientific and technical report is an important document type with high intelligence value. However, the resource distribution of its different carrier forms is not integrated, and the resource description is not deep or specific enough for the report document type, which reduces the accuracy and efficiency of information searching for users. Functional Requirements for Bibliographic Records (FRBR), an emerging model in the bibliographic domain, provides interesting possibilities in terms of cataloguing, representation and semantic enrichment of bibliographic data. This study employs the FRBR conceptual model and the entity-relationship analysis method to design an in-depth descriptive metadata schema for scientific and technical reports by analyzing the entities and mapping the bibliographic attributes corresponding to the characteristics of the report, which can help to integrate and disclose scientific and technical report resources.
7

Suri, Sahaana, Ihab F. Ilyas, Christopher Ré, and Theodoros Rekatsinas. "Ember." Proceedings of the VLDB Endowment 15, no. 3 (November 2021): 699–712. http://dx.doi.org/10.14778/3494124.3494149.

Abstract:
Structured data, or data that adheres to a pre-defined schema, can suffer from fragmented context: information describing a single entity can be scattered across multiple datasets or tables tailored for specific business needs, with no explicit linking keys. Context enrichment, or rebuilding fragmented context, using keyless joins is an implicit or explicit step in machine learning (ML) pipelines over structured data sources. This process is tedious, domain-specific, and lacks support in now-prevalent no-code ML systems that let users create ML pipelines using just input data and high-level configuration files. In response, we propose Ember, a system that abstracts and automates keyless joins to generalize context enrichment. Our key insight is that Ember can enable a general keyless join operator by constructing an index populated with task-specific embeddings. Ember learns these embeddings by leveraging Transformer-based representation learning techniques. We describe our architectural principles and operators when developing Ember, and empirically demonstrate that Ember allows users to develop no-code context enrichment pipelines for five domains, including search, recommendation and question answering, and can exceed alternatives by up to 39% recall, with as little as a single line configuration change.
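The keyless-join idea can be sketched in a few lines of Python. This is a conceptual stand-in only: it uses a hashed bag-of-words embedding and brute-force cosine search so the example stays self-contained, whereas Ember builds an index over learned Transformer-based embeddings and drives it from configuration files.

```python
# A conceptual sketch of a keyless join over fragmented context: rows from two
# tables with no shared key are serialized to text, embedded, and matched by
# cosine similarity. The hashed bag-of-words embedding is only a stand-in so the
# example stays self-contained.
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Toy text embedding: hashed bag-of-words, L2-normalized."""
    v = np.zeros(dim)
    for token in text.lower().split():
        v[hash(token) % dim] += 1.0
    norm = np.linalg.norm(v)
    return v / norm if norm > 0 else v

left = ["acme laptop 15 inch silver", "zenith phone 6.1 inch black"]
right = ["silver 15 inch laptop by acme corp",
         "acme usb-c charger",
         "black zenith smartphone 6.1 inch"]

L = np.stack([embed(r) for r in left])
R = np.stack([embed(r) for r in right])
scores = L @ R.T              # cosine similarities (rows are unit vectors)
best = scores.argmax(axis=1)  # nearest right-hand row for each left-hand row

for i, j in enumerate(best):
    print(f"{left[i]!r} -> {right[j]!r} (score={scores[i, j]:.2f})")
```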
8

Xi, Hongwei. "Dependent ML: An approach to practical programming with dependent types." Journal of Functional Programming 17, no. 2 (March 2007): 215–86. http://dx.doi.org/10.1017/s0956796806006216.

Abstract:
We present an approach to enriching the type system of ML with a restricted form of dependent types, where type index terms are required to be drawn from a given type index language L that is completely separate from run-time programs, leading to the DML(L) language schema. This enrichment allows for specification and inference of significantly more precise type information, facilitating program error detection and compiler optimization. The primary contribution of the paper lies in our language design, which can effectively support the use of dependent types in practical programming. In particular, this design makes it both natural and straightforward to accommodate dependent types in the presence of effects such as references and exceptions.
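Schematically, the kind of refinement this allows can be illustrated with the classic length-indexed list type for append, written here in informal notation rather than the paper's concrete syntax; the indices m and n are drawn from the index language (for example, linear integer arithmetic) rather than from run-time code.

```latex
\mathtt{append} \;:\; \Pi m{:}\mathit{nat}.\; \Pi n{:}\mathit{nat}.\;
  \mathtt{list}(m) \times \mathtt{list}(n) \rightarrow \mathtt{list}(m+n)
```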
9

Coppens, Sam, Erik Mannens, and Rik Van de Walle. "Disseminating Heritage Records as Linked Open Data." International Journal of Virtual Reality 8, no. 3 (January 1, 2009): 39–44. http://dx.doi.org/10.20870/ijvr.2009.8.3.2740.

Abstract:
In Flanders, Belgium, many heritage institutions disseminate their metadata using the Open Archives Initiative Protocol for Metadata Harvesting. However, this protocol does not offer granular access to the metadata. This can be solved by exposing the metadata as Linked Data. For this, we developed a semantic metadata schema consisting of two layers: one layer gives a Dublin Core description and is responsible for searching the whole dataset; the other layer holds a reference to the original metadata record, e.g., a MARC record. Doing this, the user can still access the original record once he has found the data of interest using the Dublin Core description. These metadata records are then enriched in two stages: first, we enrich the records internally, interlinking all the harvested metadata from the Flemish heritage institutions; then, we enrich the records with other datasets like DBpedia, weaving the information into the Web of Data. For publishing the records as linked open data, we enhanced the OAI2LOD Server to import data coming from different OAI-PMH repositories, to expose the records as linked open data using our developed metadata schema, and to enrich the records using our metadata enrichment algorithm. This way, the data from Flemish heritage repositories are linked with each other and published as linked open data.
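A minimal sketch of the two-layer schema in Python with rdflib follows; this is not the authors' implementation, and the record URI, descriptive literals, and MARC location are invented placeholders.

```python
# A minimal sketch (not the authors' implementation) of the two-layer schema:
# a lightweight Dublin Core layer used for searching, plus a pointer back to the
# original institutional record, and one external enrichment link.
from rdflib import Graph, Namespace, URIRef, Literal
from rdflib.namespace import DC, DCTERMS

HERITAGE = Namespace("http://example.org/flanders-heritage/")
g = Graph()

record = HERITAGE["record/000123"]

# Layer 1: Dublin Core description, responsible for search over the whole dataset.
g.add((record, DC.title, Literal("View of a Flemish town square")))  # placeholder title
g.add((record, DC.type, Literal("painting")))

# Layer 2: reference to the original metadata record (e.g., a MARC record).
g.add((record, DCTERMS.source, URIRef("http://example.org/oai/marc/000123")))

# Second enrichment stage: weave the record into the Web of Data.
g.add((record, DCTERMS.spatial, URIRef("http://dbpedia.org/resource/Ghent")))

print(g.serialize(format="turtle"))
```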
10

Fedriani, Chiara, and Piera Molinelli. "Italian ma ‘but’ in deverbal pragmatic markers: Forms, functions, and productivity of a pragma-dyad." Cuadernos de Filología Italiana 26 (October 2, 2019): 29–56. http://dx.doi.org/10.5209/cfit.62864.

Abstract:
This paper focuses on the development, paradigmaticization, and productivity of a set of complex pragmatic markers in Italian, which are constituted by two distinct elements, namely the adversative conjunction ma ‘but’ and a deverbal pragmatic marker (e.g., ma dai ‘come on! really!’, literally: ‘but give’, or ma piantala ‘just stop! give it a rest!’, literally ‘but dump it’). The main idea we will develop is that such a complex pattern can be better described in terms of a pragma-dyad, i.e., a dyadic construction with a pragmatic meaning, featuring a fixed element which systematically combines with a set of preferential fillers. In our case, the fixed element is ma, which generally signals a contrast with the interlocutors’ point of view, thus shaping the pragmatic meaning of the resulting complex in terms of interactional contrast. Such a meaning is then functionally enriched through a variety of fillers compatible with the schema, which actualize it in conveying mock politeness, disagreement, counter-expectation, and pragmatically neighbouring values. By providing a corpus-based study of the development and productivity of these complex markers, we illustrate the empirical and theoretical advantages which a pragma-dyadic approach can offer in exploring processes of functional enrichment involving complex markers.

Dissertations / Theses on the topic "Schema enrichment"

1

Porrini, Riccardo. "Construction and Maintenance of Domain Specific Knowledge Graphs for Web Data Integration." Doctoral thesis, Università degli Studi di Milano-Bicocca, 2016. http://hdl.handle.net/10281/126789.

Abstract:
A Knowledge Graph (KG) is a semantically organized, machine readable collection of types, entities, and relations holding between them. A KG helps in mitigating semantic heterogeneity in scenarios that require the integration of data from independent sources into a so-called dataspace, realized through the establishment of mappings between the sources and the KG. Applications built on top of a dataspace provide advanced data access features to end-users based on the representation provided by the KG, obtained through the enrichment of the KG with domain specific facets. A facet is a specialized type of relation that models a salient characteristic of entities of particular domains (e.g., the vintage of wines) from an end-user perspective. In order to enrich a KG with a salient and meaningful representation of data, domain experts in charge of maintaining the dataspace must possess extensive knowledge about disparate domains (e.g., from wines to football players). From an end-user perspective, the difficulties in the definition of domain specific facets for dataspaces significantly reduce the user experience of data access features and thus the ability to fulfill the information needs of end-users. Remarkably, this problem has not been adequately studied in the literature, which mostly focuses on the enrichment of the KG with a generalist, coverage oriented, and not domain specific representation of data occurring in the dataspace. Motivated by this challenge, this dissertation introduces automatic techniques to support domain experts in the enrichment of a KG with facets that provide a domain specific representation of data. Since facets are a specialized type of relation, the techniques proposed in this dissertation aim at extracting salient domain specific relations. The fundamental components of a dataspace, namely the KG and the mappings between sources and KG elements, are leveraged to elicit such a domain specific representation from specialized data sources of the dataspace, and to support domain experts with valuable information for the supervision of the process. Facets are extracted by leveraging already established mappings between specialized sources and the KG. After extraction, a domain specific interpretation of facets is provided by re-using relations already defined in the KG, to ensure tight integration of data. This dissertation also introduces a framework to profile the status of the KG, to support the supervision of domain experts in the above tasks. Altogether, the contributions presented in this dissertation provide a set of automatic techniques to support domain experts in the evolution of the KG of a dataspace towards a domain specific, end-user oriented representation. Such techniques analyze and exploit the fundamental components of a dataspace (KG, mappings, and source data) with an effectiveness not achievable with state-of-the-art approaches, as shown by extensive evaluations conducted in both synthetic and real world scenarios.
2

Yarimagan, Yalin. "Semantic Enrichment for the Automated Customization and Interoperability of UBL Schemas." PhD thesis, METU, 2008. http://etd.lib.metu.edu.tr/upload/12609427/index.pdf.

Abstract:
Universal Business Language (UBL) is an initiative to develop common business document schemas to provide standardization in the electronic business domain. However, businesses operate in different industry, geopolitical, and regulatory contexts and consequently they have different rules and requirements for the information they exchange. In this thesis, we provide semantic enrichment mechanisms for UBL that (i) allow automated customization of document schemas in response to contextual needs and (ii) maintain interoperability among different schema versions. For this purpose, we develop ontologies to provide machine processable representations for context domains, annotate custom components using classes from those ontologies and show that using these semantic annotations, automated discovery of components and automated customization of schemas becomes possible. We then provide a UBL Component Ontology that represents the semantics of individual components and their structural relationships and show that when an ontology reasoner interprets the expressions from this ontology, it computes equivalence and class-subclass relationships between classes representing components with similar content. Finally we describe how these computed relationships are used by a translation mechanism to establish interoperability among schema versions customized for different business context values.
3

Yang, Mingming. "Development of the partition of unity finite element method for the numerical simulation of interior sound field." Thesis, Compiègne, 2016. http://www.theses.fr/2016COMP2282/document.

Abstract:
In this work, we have introduced the underlying concept of PUFEM and the basic formulation related to the Helmholtz equation in a bounded domain. The plane wave enrichment process of PUFEM variables was shown and explained in detail. The main idea is to include a priori knowledge about the local behavior of the solution into the finite element space by using a set of wave functions that are solutions to the partial differential equations. In this study, the use of plane waves propagating in various directions was favored as it leads to efficient computing algorithms. In addition, we showed that the number of plane wave directions depends on the size of the PUFEM element and the wave frequency both in 2D and 3D. The selection approaches for these plane waves were also illustrated. For 3D problems, we have investigated two distribution schemes of plane wave directions, which are the discretized cube method and the Coulomb force method. It has been shown that the latter allows us to obtain uniformly spaced wave directions and enables us to acquire an arbitrary number of plane waves attached to each node of the PUFEM element, making the method more flexible. In Chapter 3, we investigated the numerical simulation of propagating waves in two dimensions using PUFEM. The main priority of this chapter is to come up with an Exact Integration Scheme (EIS), resulting in a fast integration algorithm for computing system coefficient matrices with high accuracy. The 2D PUFEM element was then employed to solve an acoustic transmission problem involving porous materials. Results have been verified and validated through the comparison with analytical solutions. Comparisons between the Exact Integration Scheme (EIS) and Gaussian quadrature showed the substantial gain offered by the EIS in terms of CPU time. A 3D Exact Integration Scheme was presented in Chapter 4, in order to accelerate and accurately compute (up to machine precision) the highly oscillatory integrals arising from the PUFEM matrix coefficients associated with the 3D Helmholtz equation. Through convergence tests, a criterion for selecting the number of plane waves was proposed. It was shown that this number only grows quadratically with the frequency, thus giving rise to a drastic reduction in the total number of degrees of freedom in comparison to classical FEM. The method has been verified for two numerical examples. In both cases, the method is shown to converge to the exact solution. For the cavity problem with a monopole source located inside, we tested two numerical models to assess their relative performance. In this scenario, where the exact solution is singular, the number of wave directions has to be chosen sufficiently high to ensure that results have converged. In the last chapter, we have investigated the numerical performance of the PUFEM for solving 3D interior sound fields and wave transmission problems in which absorbing materials are present. For the specific case of a locally reacting material modeled by a surface impedance, a numerical error estimation criterion is proposed by simply considering a purely imaginary impedance which is known to produce real-valued solutions. Based on this error estimate, it has been shown that the PUFEM can achieve accurate solutions while maintaining a very low computational cost, and only around 2 degrees of freedom per wavelength were found to be sufficient.
We also extended the PUFEM to solve wave transmission problems between air and a porous material modeled as an equivalent homogeneous fluid. A simple 1D problem (a standing wave tube) was tested, and the PUFEM solutions were found to have an error of around 1%, which is sufficient for engineering purposes.
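For orientation, the plane-wave enrichment discussed above has the standard PUFEM form (schematic notation, not quoted from the thesis):

```latex
p_h(\mathbf{x}) \;=\; \sum_{j=1}^{N} N_j(\mathbf{x}) \sum_{q=1}^{Q} A_{jq}\,
  e^{\,\mathrm{i}\, k\, \mathbf{d}_q \cdot \mathbf{x}},
\qquad \lVert \mathbf{d}_q \rVert = 1,
```

where the N_j are the classical finite element shape functions forming the partition of unity, the d_q are the Q plane-wave directions attached to each node, and the A_jq are the unknown amplitudes solved for in place of nodal values.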
4

Su, Yu-Cheng, and 蘇育正. "Enrichment of Heavy Water in Countercurrent Thermal-Diffusion Columns of Frazier-Scheme with Fraction of Flow-Rate Varied." Thesis, 2002. http://ndltd.ncl.edu.tw/handle/90528208677904500485.

Abstract:
Master's thesis
Tamkang University
Department of Chemical Engineering
Academic year 90 of the ROC calendar (2001–2002)
The purpose of this work is to investigate the effect of flow-rate fractions on enriching heavy water in countercurrent thermal-diffusion columns of the Frazier scheme. The separation theory of the countercurrent-flow Frazier-scheme thermal-diffusion columns with various flow-rate fractions has been investigated analytically and experimentally. The equations for evaluating the degree of separation with flow-rate fraction variations of the vertical and inclined Frazier scheme in serial countercurrent flow, and for estimating the optimum plate-spacing and aspect ratio for maximum separation with flow-rate fraction variations, have been derived. Therefore, an improvement in separation efficiency is obtained when the thermal-diffusion column is operated at the optimum flow-rate fraction, best inclination angle, optimum plate-spacing and optimum aspect ratio under the consideration of fixed operating cost. Finally, the experimental results agree very well with the predictions of the separation theory developed in this work. It was shown that the undesirable remixing effect could be effectively reduced by suitable adjustment of the flow-rate fractions, leading to an improvement in separation efficiency.
5

Chen, Yung-Tsung, and 陳永宗. "The Study of Thermal-Diffusion Columns of Frazier-Scheme for the Enrichment of Heavy Water with Flow-Rate Fraction Variation." Thesis, 2000. http://ndltd.ncl.edu.tw/handle/70495708338299355021.

Abstract:
Master's thesis
Tamkang University
Department of Chemical Engineering
Academic year 88 of the ROC calendar (1999–2000)
The purpose of this work is to investigate the effect of flow-rate fraction on the performance of flat-plate thermal diffusion columns of the Frazier scheme. Considerable improvement in performance is obtained when a Frazier scheme is operated at the best corresponding constant ratio of column length, varied at the optimum plate-spacing, under the consideration of fixed operating expense. The separation theory of the Frazier-scheme thermal-diffusion columns with column length varied at a constant ratio has been investigated. The equations for estimating the best ratio of column length and the best column number, as well as the maximum degree of separation with flow-rate fraction variation, were developed in this work. The equations for estimating the optimum plate-spacing for maximum separation with flow-rate fraction variation have also been developed. Experimental work for various flow-rate fractions of the heavy water system was also conducted. The results quantitatively confirm the predictions of the theory. It was shown that the undesirable remixing effect could be effectively reduced by controlling the flow-rate fraction, and a substantial improvement in separation efficiency was attained.
6

Yan, Yu-Long, and 顏尤龍. "The Effect of Flow-Rate Ratio on the Separation Efficiency for Enrichment of Heavy Water in Thermal Diffusion Column of Frazier Scheme." Thesis, 1998. http://ndltd.ncl.edu.tw/handle/74531478391707240458.

Abstract:
Master's thesis
Tamkang University
Graduate Institute of Chemical Engineering
Academic year 86 of the ROC calendar (1997–1998)
There are many methods for separating materials in industry. However, most of these methods are not able to separate isotopes or other materials that cannot be separated by conventional means. Thermal diffusion is a well-established method for separating isotopes. The method of thermal diffusion is suited to the separation of highly valuable materials that are difficult or impossible to separate by other means. Recently, several improved thermal diffusion columns have been developed. The enrichments obtained from these columns are much better than those of the basic Clusius-Dickel column. Frazier proposed a scheme for connecting thermodiffusion columns in which the sampling streams do not pass through the columns but move outside them. This scheme has been employed for producing lubricating oil. The separation theory for the effect of flow-rate ratio on the enrichment of heavy water in the continuous-type, Frazier-scheme thermal diffusion column has been derived. Experimental work for various flow-rate ratios of the heavy water system has also been conducted, and the results quantitatively confirm the predictions of the theory. It was shown that the undesirable remixing effect could be effectively reduced by controlling the flow-rate ratio, and a substantial improvement in separation efficiency was attained.

Book chapters on the topic "Schema enrichment"

1

Chhatwal, Gurunameh Singh, and Gerard Deepak. "IEESWPR: An Integrative Entity Enrichment Scheme for Socially Aware Web Page Recommendation." In Data Science and Security, 239–49. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-2211-4_21.

2

Kusch, Harald, Robert Kossen, Markus Suhr, Luca Freckmann, Linus Weber, Christian Henke, Christoph Lehmann, et al. "Management of Metadata Types in Basic Cardiological Research." In Studies in Health Technology and Informatics. IOS Press, 2021. http://dx.doi.org/10.3233/shti210542.

Abstract:
Introduction: Ensuring scientific reproducibility and compliance with documentation guidelines of funding bodies and journals is a topic of greatly increasing importance in biomedical research. Failure to comply, or unawareness of documentation standards, can have adverse effects on the translation of research into patient treatments, as well as economic implications. In the context of the German Research Foundation-funded collaborative research center (CRC) 1002, an IT-infrastructure sub-project was designed. Its goal has been to establish standardized metadata documentation and information exchange benefitting the participating research groups with minimal additional documentation efforts. Methods: Implementation of the self-developed menoci-based research data platform (RDP) was driven by close communication and collaboration with researchers as early adopters and experts. Requirements analysis and concept development involved in-person observation of experimental procedures, interviews and collaboration with researchers and experts, as well as the investigation of available and applicable metadata standards and tools. The Drupal-based RDP features distinct modules for the different documented data and workflow types, and both the development and the types of collected metadata were continuously reviewed and evaluated with the early adopters. Results: The menoci-based RDP allows for standardized documentation, sharing and cross-referencing of different data types, workflows, and scientific publications. Different modules have been implemented for specific data types and workflows, allowing for the enrichment of entries with specific metadata and linking to further relevant entries in different modules. Discussion: Taking the workflows and datasets of the frequently involved experimental service projects as a starting point for (meta-)data types to overcome irreproducibility of research data results in increased benefits for researchers with minimized efforts. While the menoci-based RDP with its data models and metadata schema was originally developed in a cardiological context, it has been implemented and extended to other consortia at Göttingen Campus and beyond in different life science research areas.
3

Taylor, Richard, and Damian Taylor. "1. Introduction." In Contract Law Directions, 1–14. Oxford University Press, 2019. http://dx.doi.org/10.1093/he/9780198836599.003.0001.

Abstract:
Without assuming prior legal knowledge, books in the Directions series introduce and guide readers through key points of law and legal debate. Questions, diagrams and exercises help readers to engage fully with each subject and check their understanding as they progress. This introductory chapter explains how contract law is structured and how it fits into the overall scheme of the law of obligations and into English law more generally. It explains the boundaries between contract law, torts and unjust enrichment and restitution and also explains the wider range of situations covered by the law of contract and puts it into its social and economic context.
4

Taylor, Richard, and Damian Taylor. "1. Introduction." In Contract Law Directions. Oxford University Press, 2017. http://dx.doi.org/10.1093/he/9780198797739.003.0001.

Abstract:
Without assuming prior legal knowledge, books in the Directions series introduce and guide readers through key points of law and legal debate. Questions, diagrams and exercises help readers to engage fully with each subject and check their understanding as they progress. This introductory chapter explains how contract law is structured and how it fits into the overall scheme of the law of obligations and into English law more generally. It explains the boundaries between contract law, torts and unjust enrichment and restitution and also explains the wider range of situations covered by the law of contract and puts it into its social and economic context.
5

Taylor, Richard, and Damian Taylor. "1. Introduction." In Contract Law Directions, 1–14. Oxford University Press, 2021. http://dx.doi.org/10.1093/he/9780198870593.003.0001.

Abstract:
Without assuming prior legal knowledge, books in the Directions series introduce and guide readers through key points of law and legal debate. Questions, diagrams and exercises help readers to engage fully with each subject and check their understanding as they progress. This introductory chapter explains how contract law is structured and how it fits into the overall scheme of the law of obligations and into English law more generally. It explains the boundaries between contract law, torts and unjust enrichment and restitution. It also explains the wider range of situations covered by the law of contract, and puts the law of contract into its social and economic context.
6

Fu, Wai-Tat, and Thomas Kannampallil. "Harnessing Web 2.0 for Context-Aware Learning." In Social Computing, 1511–26. IGI Global, 2010. http://dx.doi.org/10.4018/978-1-60566-984-7.ch097.

Abstract:
We present an empirical study investigating how interactions with a popular social tagging system, called del.icio.us, may directly impact knowledge adaptation through the processes of concept assimilation and accommodation. We observed 4 undergraduate students over a period of 8 weeks and found that the quality of social tags and distributions of information content directly impact the formation and enrichment of concept schemas. A formal model based on a distributed cognition framework provides a good fit to the students' learning data, showing how learning occurs through the adaptive assimilation of concepts and categories of multiple users through the social tagging system. The results and the model have important implications for how Web 2.0 technologies can promote formal and informal learning through collaborative methods.
7

Si, Huayin, and Chang-Tsun Li. "Copyright Protection in Virtual Communities through Digital Watermarking." In Information Security and Ethics, 3788–93. IGI Global, 2008. http://dx.doi.org/10.4018/978-1-59904-937-3.ch257.

Abstract:
Although the development of multimedia processing techniques has facilitated the enrichment of information content, and the never-ending expansion of interconnected networks has constructed a solid infrastructure for information exchanges, meanwhile, the infrastructure and techniques have also smoothed the way for copyright piracy in virtual communities. As a result, the demand for intellectual property protection becomes apparent and exigent. In response to this challenge, digital watermarking has been proposed to serve this purpose. The idea of digital watermarking is to embed a small amount of secret information—the watermark—into the host digital productions, such as image and audio, so that it can be extracted later for the purposes of copyright assertion, authentication and content integrity verification, and so forth. Unlike traditional watermarks printed on paper, which are visible to human eyes, digital watermarks are usually invisible and can only be detected with the aid of a specially designed detector. One characteristic distinguishing digital watermarking from cryptography, which separates the digital signature from the raw data/content, is that digital watermarking embeds the signature in the content to be protected. The superiority of this characteristic is that while cryptography provides no protection after the content is decrypted, digital watermarking provides “intimate” protection, because the digital signature/secret information has become an inseparable constituent part of the content itself after embedding. Because of the very characteristic, digital watermarking requires no secret channel for communicating the digital signature that cryptography does. So in the last decade, digital watermarking has attracted numerous attention from researchers and is regarded as a promising technique in the field of information security. Various types of watermarking schemes have been developed for different applications. According to their natures, digital watermarking schemes could be classified into three categories: fragile watermarking, semi-fragile watermarking and robust watermarking. The schemes of the first two categories are developed for the purposes of multimedia authentication and content integrity verification, in which we expect the embedded watermark to be destroyed when attacks are mounted on its host media. More emphases of these schemes are placed on the capability of detecting and localizing forgeries and impersonations. The main difference between the two is that semi-fragile watermarking is tolerant to non-malicious operations, such as lossy compression within a certain compression ratio, while fragile watermarking is intolerant to any manipulations. Robust watermarking, on the other hand, is intended for the applications of copyright protection, wherein the watermarks should survive attacks aiming at weakening or erasing them provided the quality of the attacked content is still worth protecting. Therefore, the emphasis of robust watermarking schemes is placed on their survivability against attacks. This article is intended to focus on robust watermarking schemes for the application of copyright protection. See Li and Yang (2003) and Lin and Chang (2001) for more details about fragile and semi-fragile schemes.
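The embed-then-detect idea at the heart of robust watermarking can be illustrated with a toy additive (spread-spectrum style) example in Python. This is a didactic sketch rather than any scheme discussed in the chapter; practical schemes embed the signature in perceptually significant transform-domain coefficients.

```python
# A toy illustration (not a scheme from the chapter) of the core robust-watermarking
# idea: a secret pseudo-random signature is added to the host signal and later
# detected by correlation, so no separate channel is needed for the signature.
import numpy as np

rng = np.random.default_rng(seed=7)

host = rng.normal(0.0, 10.0, size=4096)          # stand-in for image/audio coefficients
watermark = rng.choice([-1.0, 1.0], size=4096)   # secret +/-1 signature (the key)
alpha = 1.0                                      # embedding strength: imperceptibility vs robustness

marked = host + alpha * watermark                     # embedding
attacked = marked + rng.normal(0.0, 2.0, size=4096)   # mild attack: additive noise

def detect(signal: np.ndarray, key: np.ndarray) -> float:
    """Normalized correlation of the signal with the secret signature."""
    return float(signal @ key) / len(key)

print("watermarked copy:", round(detect(attacked, watermark), 3))  # close to alpha
print("unmarked host   :", round(detect(host, watermark), 3))      # close to zero
```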
8

Si, Huayin, and Chang-Tsun Li. "Copyright Protection in Virtual Communities through Digital Watermarking." In Virtual Technologies, 1544–50. IGI Global, 2008. http://dx.doi.org/10.4018/978-1-59904-955-7.ch098.

Abstract:
Although the development of multimedia processing techniques has facilitated the enrichment of information content, and the never-ending expansion of interconnected networks has constructed a solid infrastructure for information exchanges, meanwhile, the infrastructure and techniques have also smoothed the way for copyright piracy in virtual communities. As a result, the demand for intellectual property protection becomes apparent and exigent. In response to this challenge, digital watermarking has been proposed to serve this purpose. The idea of digital watermarking is to embed a small amount of secret information—the watermark—into the host digital productions, such as image and audio, so that it can be extracted later for the purposes of copyright assertion, authentication and content integrity verification, and so forth. Unlike traditional watermarks printed on paper, which are visible to human eyes, digital watermarks are usually invisible and can only be detected with the aid of a specially designed detector. One characteristic distinguishing digital watermarking from cryptography, which separates the digital signature from the raw data/content, is that digital watermarking embeds the signature in the content to be protected. The superiority of this characteristic is that while cryptography provides no protection after the content is decrypted, digital watermarking provides “intimate” protection, because the digital signature/secret information has become an inseparable constituent part of the content itself after embedding. Because of the very characteristic, digital watermarking requires no secret channel for communicating the digital signature that cryptography does. So in the last decade, digital watermarking has attracted numerous attention from researchers and is regarded as a promising technique in the field of information security. Various types of watermarking schemes have been developed for different applications. According to their natures, digital watermarking schemes could be classified into three categories: fragile watermarking, semi-fragile watermarking and robust watermarking. The schemes of the first two categories are developed for the purposes of multimedia authentication and content integrity verification, in which we expect the embedded watermark to be destroyed when attacks are mounted on its host media. More emphases of these schemes are placed on the capability of detecting and localizing forgeries and impersonations. The main difference between the two is that semi-fragile watermarking is tolerant to non-malicious operations, such as lossy compression within a certain compression ratio, while fragile watermarking is intolerant to any manipulations. Robust watermarking, on the other hand, is intended for the applications of copyright protection, wherein the watermarks should survive attacks aiming at weakening or erasing them provided the quality of the attacked content is still worth protecting. Therefore, the emphasis of robust watermarking schemes is placed on their survivability against attacks. This article is intended to focus on robust watermarking schemes for the application of copyright protection. See Li and Yang (2003) and Lin and Chang (2001) for more details about fragile and semi-fragile schemes.

Conference papers on the topic "Schema enrichment"

1

Sekhavat, Y. A., and J. Parsons. "SESM: Semantic enrichment of schema mappings." In 2013 IEEE 29th International Conference on Data Engineering Workshops (ICDEW 2013). IEEE, 2013. http://dx.doi.org/10.1109/icdew.2013.6547415.

2

Shieh, Evan, Saul Simhon, Geetha Aluri, Giorgos Papachristoudis, Doa Yakut, and Dhanya Raghu. "Attribute Similarity and Relevance-Based Product Schema Matching for Targeted Catalog Enrichment." In 2021 IEEE International Conference on Big Knowledge (ICBK). IEEE, 2021. http://dx.doi.org/10.1109/ickg52313.2021.00043.

3

Wang, Hongxia. "Researches on the In-Core Reload Design for TianWan Nuclear Power Station." In 18th International Conference on Nuclear Engineering. ASMEDC, 2010. http://dx.doi.org/10.1115/icone18-29057.

Abstract:
The two Units of TianWan Nuclear Power Station (phase 1) are now in their third fuel cycle of operation. China Nuclear Power Engineering Co., Ltd. (CNPE) has finished the reload design of Unit 1 and the check of Russia’s computation for Unit 2 in the third cycle. From the fourth cycle on, the in-core reload design will be carried out by the Chinese side. This paper summarizes the problems met during our design process and the ways to solve them. In the first part, we discuss how to keep the related parameters, such as maximum burn-up and power peak factor, within reasonable ranges. In the last part, four design cases which can be used in cycle 5 for Unit 1 are recommended and evaluated briefly. The first two cases are partly low-leakage schemas; they satisfy the related restrictions very well. Case 3 is a completely low-leakage schema and case 4 is a long-life schema. For cases 3 and 4, a few new kinds of assemblies, which differ from the present assemblies in uranium enrichment or in the number of gadolinium oxide fuel rods, may be adopted to obtain better parameters.
4

Britton, Kyle Anthony, and Zeyun Wu. "A Neutronics Feasibility Study of the TRIGA LEU Fuel in the 20MWt NIST Research Reactor." In 2018 26th International Conference on Nuclear Engineering. American Society of Mechanical Engineers, 2018. http://dx.doi.org/10.1115/icone26-82433.

Abstract:
The National Bureau of Standards reactor (NBSR) at the National Institute of Standards and Technology (NIST) is under conversion from high enriched uranium (HEU) to the low enriched uranium (LEU) schema under the Reduced Enrichment for Research and Test Reactors program (RERTR) as a part of the Global Threat Reduction Initiative (GTRI). The conversion of high performance research reactors (HPRR) such as the NBSR is a challenging task due to the high flux need (2.5 × 10¹⁴ n/cm²-s for the NBSR), as well as other neutronics performance characteristic requirements, without significant changes to the external geometrical configuration. One fuel candidate, the General Atomics (GA) UZrH LEU fuel, has shown particular promise in this regard. The TRIGA LEU fuel was initially developed in the 1980s with particular considerations for fuel conversion for high power regimes such as high density research and test reactors. This study performs a neutronics feasibility study of the UZrH LEU fuel schema for the NBSR, examining the accountability and sustainability of the TRIGA fuel when applying it to the NBSR conversion. To identify the best option to deploy the TRIGA fuel to the NBSR in terms of key neutronic performance characteristics, the study is carried out with various considerations in the fuel dimensions, fuel rod layout configurations, and structure material selections. A Monte Carlo based computational model is used to assist and facilitate the research procedure. The research findings in this study will determine the viability of the TRIGA fuel type for the NBSR conversion, and provide supporting data for future investigations on this subject.
5

"Environmental information enrichment – The TaToo approach." In 19th International Congress on Modelling and Simulation. Modelling and Simulation Society of Australia and New Zealand (MSSANZ), Inc., 2011. http://dx.doi.org/10.36334/modsim.2011.c3.schimak.

6

Ivanov, G. V., and A. A. Kuranov. "Development of automatic process control schemes coal enrichment." In Наука России: Цели и задачи. НИЦ «Л-Журнал», 2018. http://dx.doi.org/10.18411/sr-10-12-2018-08.

7

Ghanem, Roger, and Debraj Ghosh. "An Enrichment Scheme for Polynomial Chaos Expansion Applied to Random Eigenvalue Problem." In ASME 2005 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2005. http://dx.doi.org/10.1115/detc2005-85450.

Abstract:
For a system with the parameters modeled as uncertain, polynomial approximations such as polynomial chaos expansion provide an effective way to estimate the statistical behavior of the eigenvalues and eigenvectors, provided the eigenvalues are widely spaced. For a system with a set of clustered eigenvalues, the corresponding eigenvalues and eigenvectors are very sensitive to perturbation of the system parameters. An enrichment scheme to the polynomial chaos expansion is proposed here in order to capture the behavior of such eigenvalues and eigenvectors. It is observed that for judiciously chosen enrichment functions, the enriched expansion provides better estimate of the statistical behavior of the eigenvalues and eigenvectors.
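For orientation, the expansion being enriched has the following schematic form (standard polynomial chaos notation; the enrichment functions phi_r stand for the judiciously chosen functions referred to above):

```latex
\lambda(\theta) \;\approx\; \sum_{i=0}^{P} \lambda_i\, \Psi_i\big(\boldsymbol{\xi}(\theta)\big)
  \;+\; \sum_{r=1}^{R} c_r\, \phi_r\big(\boldsymbol{\xi}(\theta)\big),
```

where the Psi_i are the orthogonal polynomial chaos basis functions in the standard random variables xi, the phi_r are the added enrichment functions, and the coefficients lambda_i and c_r are to be determined.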
8

Eerkens, Jeff W., Jay F. Kunze, and Leonard J. Bond. "Laser Isotope Enrichment for Medical and Industrial Applications." In 14th International Conference on Nuclear Engineering. ASMEDC, 2006. http://dx.doi.org/10.1115/icone14-89767.

Abstract:
Use of lasers for isotope separation has been considered for many decades. None of the proposed methods have attained sufficient proof-of-principle status to be economically attractive to pursue commercially. Some of the authors have succeeded in separating sulfur isotopes using a rather new and different method, known as condensation repression. In this scheme, a gas of the selected isotopes for enrichment is irradiated with a laser at a particular wavelength that would excite only one of the isotopes. The entire gas is subjected to low temperatures sufficient to cause condensation on a cold surface or coagulation in the gas. Those molecules in the gas that the laser excited are not as likely to condense or dimerize (coagulate into a double molecule, called a dimer) as unexcited molecules. Hence, in cold-wall condensation, the gas drawn out of the system is enriched in the isotope that was laser-excited. We have evaluated the relative energy required in this process if applied on a commercial scale. We estimate that the energy required for laser isotope enrichment is about 30% of that required in centrifuge separations, and 2% of that required by use of “calutrons”.
9

Bowman, Andrew, Andrew Mack, Wondwossen A. Gebreyes, and Julie A. Funk. "Comparison of enrichment schemes for the isolation of Yersinia enterocolitica." In Fifth International Symposium on the Epidemiology and Control of Foodborn Pathogens in Pork. Iowa State University, Digital Press, 2003. http://dx.doi.org/10.31274/safepork-180809-534.

10

Jamshidi, Soheil, and Mahmoud Reza Hashemi. "An efficient data enrichment scheme for fraud detection using social network analysis." In 2012 Sixth International Symposium on Telecommunications (IST). IEEE, 2012. http://dx.doi.org/10.1109/istel.2012.6483147.


Reports on the topic "Schema enrichment"

1

Van Rijn, Jaap, Harold Schreier, and Yossi Tal. Anaerobic ammonia oxidation as a novel approach for water treatment in marine and freshwater aquaculture recirculating systems. United States Department of Agriculture, December 2006. http://dx.doi.org/10.32747/2006.7696511.bard.

Abstract:
Ammonia waste removal in recirculating aquaculture systems is typically accomplished via the action of nitrifying bacteria in specially designed biofilters that oxidize ammonia to produce nitrate. In the majority of these systems, nitrate is discharged to the environment through frequent water exchanges. As environmental considerations have made it necessary to eliminate nitrate release, new strategies for nitrate consumption are being developed. In the funding period, we showed that ammonia removal from wastewater could take place by an anaerobic ammonia oxidation process carried out by bacterial Planctomycetes sp. Referred to as “anammox”, this process occurs in the absence of an organic source and in the presence of nitrite (or nitrate) as an electron acceptor as follows: NH₃ + HNO₂ → N₂ + 2H₂O. Anammox has been estimated to result in savings of up to 90% of the costs associated with wastewater treatment plants. Our objective was to study the applicability of the anammox process in a variety of recirculating aquaculture systems to determine optimal conditions necessary for efficient ammonia waste removal. Both seawater and freshwater systems operated with either conventional aerobic treatment of ammonia to nitrate (USA) or, in addition, denitrifying biofilters as well as anaerobic digestion of sludge (Israel) were tested. Molecular tools were used to screen and monitor different treatment compartments for the presence of Planctomycetes. Optimal conditions for the enrichment of the anammox bacteria were tested using laboratory scale biofilters as well as a semi-commercial system. Enrichment studies resulted in the isolation of some unique heterotrophic bacteria capable of plasmid-mediated autotrophic growth in the presence of ammonia and nitrite. Our studies have not only demonstrated the presence and viability of Planctomycetes spp. in the biofilter units of recirculating marine and freshwater systems but also demonstrated the applicability of the anammox process in these systems. Using our results, we have developed treatment schemes that have allowed for optimizing the anammox process and applying it to recirculating systems.