Academic literature on the topic 'Standard exchange formats'


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Standard exchange formats.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Standard exchange formats"

1

Egerton, R. F., D. S. Bright, S. D. Davilla, P. Ingram, E. J. Kirkland, M. Kundmann, C. E. Lyman, P. Rez, E. Steele, and N. J. Zaluzec. "Standard formats for the exchange and storage of image data." Proceedings, annual meeting, Electron Microscopy Society of America 51 (August 1, 1993): 220–21. http://dx.doi.org/10.1017/s0424820100146941.

Abstract:
In microscopy, there is an increasing need for images to be recorded electronically and stored digitally on disk or tape. This image data can be shared by mailing these magnetic media or by electronic transmission along telephone lines (e.g. modem transfer) or special networks, such as Bitnet and Internet. In each case, the format in which the image is stored or transmitted must be known to the recipient in order to correctly recover all the information. Because there are many image formats to choose from, it would undoubtedly save misunderstanding and frustration if a group of individuals with similar interests and needs could agree upon a common format. The MSA Standards Committee has surveyed several formats which could be of particular interest to microscopists, with a view to making a recommendation to our community. Our chief concern has been compatibility with existing software, combined with an adequate representation of the data, compactness of data storage (on disk) and a reasonable rate of data transfer.
2

Bergmann, Frank T., Nicolas Rodriguez, and Nicolas Le Novère. "COMBINE Archive Specification Version 1." Journal of Integrative Bioinformatics 12, no. 2 (June 1, 2015): 104–18. http://dx.doi.org/10.1515/jib-2015-261.

Abstract:
Several standard formats have been proposed that can be used to describe models, simulations, data or other essential information in a consistent fashion. These constitute various separate components required to reproduce a given published scientific result. The Open Modeling EXchange format (OMEX) supports the exchange of all the information necessary for a modeling and simulation experiment in biology. An OMEX file is a ZIP container that includes a manifest file, an optional metadata file, and the files describing the model. The manifest is an XML file listing all files included in the archive and their type. The metadata file provides additional information about the archive and its content. Although any format can be used, we recommend an XML serialization of the Resource Description Framework. Together with the other standard formats from the Computational Modeling in Biology Network (COMBINE), OMEX is the basis of the COMBINE Archive. The content of a COMBINE Archive consists of files encoded in COMBINE standards whenever possible, but may include additional files defined by an Internet Media Type. The COMBINE Archive facilitates the reproduction of modeling and simulation experiments in biology by embedding all the relevant information in one file. Having all the information stored and exchanged at once also helps in building activity logs and audit trails.
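To illustrate the OMEX layout summarized above (a ZIP container holding a manifest that lists every file and its type), the Python sketch below assembles a minimal archive. The file names, the model stub, and the exact format-identifier URIs are illustrative assumptions rather than values quoted from the specification.

```python
import zipfile

# Minimal sketch of an OMEX-style archive: a ZIP container whose manifest is an
# XML file listing every included file and its type. File names and format
# identifiers here are illustrative assumptions, not normative values.
manifest = """<?xml version="1.0" encoding="UTF-8"?>
<omexManifest xmlns="http://identifiers.org/combine.specifications/omex-manifest">
  <content location="./manifest.xml"
           format="http://identifiers.org/combine.specifications/omex-manifest"/>
  <content location="./model.xml"
           format="http://identifiers.org/combine.specifications/sbml"/>
</omexManifest>
"""

model = """<?xml version="1.0" encoding="UTF-8"?>
<sbml xmlns="http://www.sbml.org/sbml/level3/version1/core" level="3" version="1">
  <model id="example_model"/>
</sbml>
"""

with zipfile.ZipFile("example.omex", "w", zipfile.ZIP_DEFLATED) as archive:
    archive.writestr("manifest.xml", manifest)  # manifest listing the contents
    archive.writestr("model.xml", model)        # the model being exchanged
```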
3

Lenivtceva, Iuliia D., and Georgy Kopanitsa. "Evaluating Manual Mappings of Russian Proprietary Formats and Terminologies to FHIR." Methods of Information in Medicine 58, no. 04/05 (November 2019): 151–59. http://dx.doi.org/10.1055/s-0040-1702154.

Abstract:
Background: Evaluating potential data losses from mapping proprietary medical data formats to standards is essential for decision making. The article implements a method to evaluate the preliminary content overlap of proprietary medical formats, including national terminologies, and Fast Healthcare Interoperability Resources (FHIR), an international medical standard. Methods: Three types of mappings were evaluated in the article: the proprietary format matched to FHIR, national terminologies matched to the FHIR mappings, and concepts from national terminologies matched to the Systematized Nomenclature of Medicine–Clinical Terms (SNOMED CT). We matched attributes of the formats with FHIR definitions and calculated content overlap. Results: The article reports the results of a manual mapping between a proprietary medical format and the FHIR standard. The following results were obtained: 81% content overlap for the proprietary format to FHIR mapping, 88% content overlap for the national terminologies to FHIR mapping, and 98.6% concept matching for the national terminologies to SNOMED CT mapping. Twenty tables from the proprietary format and 20 dictionaries were matched with FHIR resources; nine dictionaries were matched with SNOMED CT concepts. Conclusion: Mapping medical formats is a challenge. The obtained overlaps are promising in comparison with the investigated results. The study showed that standardization of data exchange between proprietary formats and FHIR is possible in Russia, and national terminologies can be used in FHIR-based information systems.
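The overlap percentages reported above can be read as a simple share of matched attributes. The snippet below sketches that reading; the metric (matched attributes divided by total source attributes), the attribute names, and the FHIR element paths are assumptions for illustration, not values from the paper.

```python
def content_overlap(source_attributes, mapping):
    """Share of source attributes that found a FHIR counterpart (assumed metric)."""
    matched = [a for a in source_attributes if a in mapping]
    return 100.0 * len(matched) / len(source_attributes)

# Hypothetical attributes of a proprietary format and their FHIR targets.
source = ["patient_id", "birth_date", "diagnosis_code", "internal_flag"]
mapping = {
    "patient_id": "Patient.identifier",
    "birth_date": "Patient.birthDate",
    "diagnosis_code": "Condition.code",
}

print(f"{content_overlap(source, mapping):.1f}% content overlap")  # 75.0%
```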
4

Kim, Youngki, Hanra Lee, Mutahar Safdar, Tahir Abbas Jauhar, and Soonhung Han. "Exchange of parametric assembly models based on neutral assembly constraints." Concurrent Engineering 27, no. 4 (August 20, 2019): 285–94. http://dx.doi.org/10.1177/1063293x19869047.

Abstract:
It is difficult to exchange parametric assembly models using conventional neutral formats such as the Standard for the Exchange of Product Model Data or the Initial Graphics Exchange Specification. These formats only support boundary representation information, which leads to the inability to perform parametric re-evaluation once a model is exchanged. In order to exchange parametric information along with the design intent, a design history-based macro-parametric approach was proposed. This macro-parametric approach, however, supported only the exchange of individual part models. As most products are manufactured as assemblies, where several components are connected with multiple constraints, it is necessary to exchange the assembly model data as well. To overcome the issue of post-exchange editability, a collection of neutral assembly commands was introduced to extend the capabilities of the macro-parametric approach. A set of neutral assembly constraints was defined and a system for exchanging the parametric assembly models was implemented. An assembly model consisting of coaxial and incidence constraints was successfully exchanged between two commercial computer-aided design systems: CATIA and NX. It was possible to re-evaluate the assembly model parametrically after the exchange. The method can be further extended to exchange the remaining constraint types in different commercial computer-aided design systems.
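As a sketch of what a neutral assembly-constraint record of the kind described above might contain, the snippet below defines an illustrative data structure; the field names and constraint labels are assumptions, not the paper's actual neutral command set.

```python
from dataclasses import dataclass

@dataclass
class NeutralAssemblyConstraint:
    # Illustrative neutral constraint record for exchange between CAD systems;
    # the fields are assumptions, not the paper's neutral assembly commands.
    constraint_type: str   # e.g. "coaxial" or "incidence"
    component_a: str       # part occurrence in the assembly
    reference_a: str       # geometric reference on component_a
    component_b: str
    reference_b: str

assembly = [
    NeutralAssemblyConstraint("coaxial", "shaft-1", "axis:cylindrical_face_3",
                              "housing-1", "axis:bore_1"),
    NeutralAssemblyConstraint("incidence", "shaft-1", "face:shoulder",
                              "housing-1", "face:seat"),
]
for constraint in assembly:
    print(constraint)
```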
5

Azeroual, Otmane, and Nico Herbig. "Mapping and semantic interoperability of the German RCD data model with the Europe-wide accepted CERIF." Information Services & Use 40, no. 1-2 (October 23, 2020): 87–113. http://dx.doi.org/10.3233/isu-200076.

Abstract:
The provision, processing and distribution of research information are increasingly supported by the use of research information systems (RIS) at higher education institutions. National and international exchange formats or standards can support the validation and use of research information and increase its informative value and comparability through consistent semantics. The formats overlap considerably and represent different approaches to modeling. This paper presents the data model of the Research Core Dataset (RCD) and discusses its impact on data quality in RIS. It then compares the RCD with the Europe-wide accepted Common European Research Information Format (CERIF) standard in order to support a CERIF-compatible implementation of the RCD in RIS, so that institutions can integrate their research information from internal and external heterogeneous data sources and ultimately provide valuable information with a high level of data quality, which is fundamental to decision-making and knowledge generation as well as the presentation of research.
6

Taylor, P., S. Cox, G. Walker, D. Valentine, and P. Sheahan. "WaterML2.0: development of an open standard for hydrological time-series data exchange." Journal of Hydroinformatics 16, no. 2 (April 8, 2013): 425–46. http://dx.doi.org/10.2166/hydro.2013.174.

Abstract:
The increasing global demand on freshwater is resulting in nations improving their terrestrial water monitoring and reporting systems to better understand the availability, and quality, of this valuable resource. A barrier to this is the inability for stakeholders to share information relating to water observations data: traditional hydrological information systems have relied on internal custom data formats to exchange data, leading to issues in data integration and exchange. Organisations are looking to information standards to assist in data exchange, integration and interpretation to lower costs in use, and re-use, of monitoring data. The WaterML2.0 Standards Working Group (SWG), working within the Open Geospatial Consortium (OGC) and in cooperation with the joint OGC-World Meteorological Organisation (WMO) Hydrology Domain Working Group (HDWG), has developed an open standard for the exchange of water observation data. The focus of the standard is time-series data, commonly used for hydrological applications such as flood forecasting, environmental reporting and hydrological infrastructure, where a lack of standards inhibits efficient re-use and automation. This paper describes the development methodology and principles of WaterML2.0, key parts of its information model, implementation scenarios, evaluation and future work. WaterML2.0 was adopted by the OGC as an official standard in September 2012.
7

Frøystad, Christian, Inger Tøndel, and Martin Jaatun. "Security Incident Information Exchange for Cloud Service Provisioning Chains." Cryptography 2, no. 4 (December 11, 2018): 41. http://dx.doi.org/10.3390/cryptography2040041.

Abstract:
Online services are increasingly becoming a composition of different cloud services, making incident-handling difficult, as Cloud Service Providers (CSPs) with end-user customers need information from other providers about incidents that occur at upstream CSPs to inform their users. In this paper, we argue the need for commonly agreed-upon incident information exchanges between providers to improve accountability of CSPs, and present both such a format and a prototype implementing it. The solution can handle simple incident information natively as well as embed standard representation formats for incident-sharing, such as IODEF and STIX. Preliminary interviews show a desire for such a solution. The discussion considers both technical challenges and non-technical aspects related to improving the situation for incident response in cloud-computing scenarios. Our solution holds the potential of making incident-sharing more efficient.
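The embedding idea described above can be pictured as a provider-to-provider notice that carries a standard representation (here a minimal STIX 2-style bundle) as its payload. The envelope fields below are hypothetical and are not the exchange format proposed in the paper.

```python
import json
import uuid
from datetime import datetime, timezone

# Minimal STIX 2-style bundle used as the embedded standard representation.
stix_bundle = {
    "type": "bundle",
    "id": f"bundle--{uuid.uuid4()}",
    "objects": [
        {
            "type": "incident",
            "id": f"incident--{uuid.uuid4()}",
            "name": "Storage backend outage at upstream provider",
        }
    ],
}

# Hypothetical provider-to-provider envelope; field names are assumptions.
notice = {
    "sender": "upstream-csp.example",
    "recipients": ["downstream-csp.example"],
    "issued": datetime.now(timezone.utc).isoformat(),
    "summary": "Incident affecting object storage used by downstream services",
    "payload_format": "stix",
    "payload": stix_bundle,
}

print(json.dumps(notice, indent=2))
```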
8

Goncharov, M. V., and K. A. Kolosov. "On interoperability of metadata within RNPLS&T’s Single Open Information Archive." Scientific and Technical Libraries, no. 10 (November 12, 2021): 45–62. http://dx.doi.org/10.33186/1027-3689-2021-10-45-62.

Abstract:
The Russian National Public Library for Science and Technology has been developing the Single Open Information Archive (UOIA) to merge all digital full-text resources created or acquired by the Library. The authors examine the issues of interoperability when exchanging metadata between the UOIA, built on library automation software, and open archives using OAI-PMH technology for metadata acquisition. Interoperability in information exchange between different ALIS is provided, for example, through the SRU/SRW protocol and metadata scheme, while metadata exchange between OA repositories is provided mainly within the Dublin Core (DC) scheme. ALIS-to-OA metadata transmission with transformation into DC results in information loss and prevents unambiguous reverse transformation. For a long time, DSpace has been the most popular software for open digital repositories. This product enables OAI-PMH metadata acquisition in DC and Qualified DC (QDC) formats, and supports the Object Reuse and Exchange (ORE) standard, which makes it possible to describe aggregated resources. ORE in DSpace makes it possible to collect not only metadata but also connected files, and to receive other connected data provided by the importing source. DSpace uses a rather simple ORE format based on Atom XML that allows binding several files of different functionality with RDF triplets. The OAI-PMH software connector designed for the RNPLS&T SOIA can present metadata in DC, QDC, MARC21, and ORE formats, which supports interoperability in information exchange with OA repositories running DSpace software. Besides metadata transmission, transmission of various data types is possible, e.g. document text or license information. Further development will expand the format structure to represent associated data, in particular using RDF.
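To illustrate the OAI-PMH exchange described above, the sketch below issues a standard ListRecords request for simple Dublin Core metadata and prints the record titles. The repository URL is a placeholder; the verb, metadataPrefix, and namespace URIs are the usual OAI-PMH and Dublin Core values.

```python
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder endpoint; any OAI-PMH data provider exposes the same interface.
BASE_URL = "https://repository.example.org/oai"
url = f"{BASE_URL}?verb=ListRecords&metadataPrefix=oai_dc"

with urllib.request.urlopen(url) as response:
    tree = ET.parse(response)

ns = {
    "oai": "http://www.openarchives.org/OAI/2.0/",
    "dc": "http://purl.org/dc/elements/1.1/",
}
for record in tree.findall(".//oai:record", ns):
    title = record.find(".//dc:title", ns)
    print(title.text if title is not None else "(no title)")
```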
9

Safdar, Mutahar, Tahir Abbas Jauhar, Youngki Kim, Hanra Lee, Chiho Noh, Hyebin Kim, Inhwan Lee, Imgyu Kim, Soonjo Kwon, and Soonhung Han. "Feature-based translation of CAD models with macro-parametric approach: issues of feature mapping, persistent naming, and constraint translation." Journal of Computational Design and Engineering 7, no. 5 (April 9, 2020): 603–14. http://dx.doi.org/10.1093/jcde/qwaa043.

Abstract:
Feature-based translation of computer-aided design (CAD) models allows designers to preserve the modeling history as a series of modeling operations. Modeling operations or features contain information that is required to modify CAD models to create different variants. Conventional formats, including the Standard for the Exchange of Product Model Data or the Initial Graphics Exchange Specification, cannot preserve design intent, and only geometric models can be exchanged. As a result, it is not possible to modify these models after their exchange. The macro-parametric approach (MPA) is a method for exchanging feature-based CAD models among heterogeneous CAD systems. TransCAD, a CAD system for inter-CAD translation, is based on this approach. Translators based on MPA were implemented and tested for exchange between two commercial CAD systems. The issues found during the test rallies are reported and analyzed in this work. MPA can be further extended to remaining features and constraints for exchange between commercial CAD systems.
10

Halfawy, Mahmoud R., Dana J. Vanier, and Thomas M. Froese. "Standard data models for interoperability of municipal infrastructure asset management systems." Canadian Journal of Civil Engineering 33, no. 12 (December 1, 2006): 1459–69. http://dx.doi.org/10.1139/l05-098.

Abstract:
Efficient management of infrastructure assets depends largely on the ability to efficiently share, exchange, and manage asset life-cycle information. Although software tools are used to support almost every asset management process in municipalities, data exchange is mainly performed using paper-based or neutral file formats based on ad hoc proprietary data models. Interoperability of various asset management systems is crucial to support better management of infrastructure data and to improve the information flow between various work processes. Standard data models can be used to significantly improve the availability and consistency of asset data across different software systems, to integrate data across various disciplines, and to exchange information between various stakeholders. This paper surveys a number of data standards that might be used in implementing interoperable and integrated infrastructure asset management systems. The main requirements for standard data models are outlined, and the importance of interoperability from an asset management perspective is highlighted. The role that spatial data and geographic information systems (GIS) can play in enhancing the efficiency of managing asset life-cycle data is also discussed. An ongoing effort to develop a standard data model for sewer systems is presented, and an example implementation of interoperable GIS and hydraulic modeling software is discussed. Key words: data standards, municipal infrastructure, asset management, data models, interoperability.

Dissertations / Theses on the topic "Standard exchange formats"

1

Onyeako, Isidore. "Resolution-aware Slicing of CAD Data for 3D Printing." Thesis, Université d'Ottawa / University of Ottawa, 2016. http://hdl.handle.net/10393/34303.

Abstract:
3D printing applications have achieved increased success as an additive manufacturing (AM) process. The micro-structure of mechanical/biological materials presents design challenges owing to the resolution of 3D printers and material properties/composition. Biological materials are complex in structure and composition. Efforts have been made by 3D printer manufacturers to provide materials with varying physical, mechanical and chemical properties to handle simple to complex applications. As 3D printing is finding more medical applications, we expect future uses in areas such as hip replacement, where smoothness of the femoral head is important to reduce friction that can cause a lot of pain to a patient. The issue of print resolution plays a vital role due to the staircase effect. In some practical applications where 3D printing is intended to produce replacement parts with movable joints, low-resolution printing results in fused joints when the joint clearance is intended to be very small. Various 3D printers are capable of print resolutions of up to 600 dpi (dots per inch), as quoted in their datasheets. Although the above-quoted level of detail can satisfy the micro-structure needs of a large set of biological/mechanical models under investigation, it is important for a 3D slicing application to be able to check that the printer can properly produce the feature with the smallest detail in a model. One way to perform this check would be the physical measurement of printed parts and comparison to expected results. Our work includes a method for using ray casting to detect features in 3D CAD models whose sizes are below the minimum allowed by the printer resolution. The resolution validation method is tested using a few simple and complex 3D models. Our proposed method serves two purposes: (a) to assist CAD model designers in developing models whose printability is assured; this is achieved by warning or preventing the designer when they are about to perform shape operations that will lead to regions/features with sizes lower than the printer resolution; and (b) to validate slicing outputs before generation of G-codes to identify regions/features with sizes lower than the printer resolution.
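The resolution check described above reduces to simple arithmetic: at a quoted 600 dpi the smallest printable detail is roughly 25.4 mm / 600 ≈ 0.042 mm, so model features below that size can be flagged before G-code generation. The feature names and sizes in the sketch below are illustrative.

```python
MM_PER_INCH = 25.4

def min_feature_size_mm(printer_dpi: float) -> float:
    # Smallest detail the printer can resolve, derived from its quoted dpi.
    return MM_PER_INCH / printer_dpi

def flag_unprintable(features_mm, printer_dpi=600.0):
    # Return (name, size) pairs that fall below the printer resolution.
    limit = min_feature_size_mm(printer_dpi)
    return [(name, size) for name, size in features_mm if size < limit]

# Hypothetical feature sizes measured from a sliced CAD model, in millimetres.
features = [("joint_clearance", 0.03), ("wall", 0.8), ("pin_gap", 0.04)]
print(flag_unprintable(features))  # flags joint_clearance and pin_gap at ~0.042 mm
```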

Books on the topic "Standard exchange formats"

1

United States. Office of Federal Coordinator for Meteorological Services and Supporting Research., ed. Standard formats for weather data exchange among automated weather information systems. Washington, D.C: U.S. Dept. of Commerce, National Oceanic and Atmospheric Administration, Federal Coordinator for Meteorological Services and Supporting Research, 1987.

2

United States. Office of Federal Coordinator for Meteorological Services and Supporting Research., ed. Standard formats for weather data exchange among automated weather information systems. Silver Spring, Md: The Office, 1994.

3

United States. Office of Federal Coordinator for Meteorological Services and Supporting Research, ed. Standard formats for weather data exchange among automated weather information systems. Silver Spring, Md. (8455 Colesville Rd., Suite 1500, Silver Spring 20910): Office of the Federal Coordinator for Meteorological Services and Supporting Research, 1995.

4

United States. Office of Federal Coordinator for Meteorological Services and Supporting Research., ed. Standard formats for weather data exchange among automated weather information systems. Rockville, Md: U.S. Dept. of Commerce, National Oceanic and Atmospheric Administration, 1990.

5

United States. Office of Federal Coordinator for Meteorological Services and Supporting Research, ed. Standard formats for weather data exchange among automated weather information systems. Rockville, Md: U.S. Dept. of Commerce, National Oceanic and Atmospheric Administration, 1990.

6

United States. Office of Federal Coordinator for Meteorological Services and Supporting Research., ed. Standard formats for weather data exchange among automated weather information systems. Rockville, Md: U.S. Dept. of Commerce, National Oceanic and Atmospheric Administration, 1990.

7

McMorris, Annie, and HURIDOCS (Network), eds. HURIDOCS standard formats for the recording and exchange of information on human rights. Dordrecht: M. Nijhoff, 1985.

8

National Information Standards Organization (U.S.). Proposed American national standard record format for patron records. Bethesda, MD: National Information Standards Organization, 1990.

9

Canadian Council on Surveying and Mapping. Standard EDP file exchange format for digital topographic data. Sherbrooke: Energy, Mines and Resources, Surveys, Mapping and Remote Sensing Sector, 1989.

10

East, E. William. The standard data exchange format for critical path method scheduling. Champaign, Ill: US Army Corps of Engineers, Construction Engineering Research Laboratories, 1995.


Book chapters on the topic "Standard exchange formats"

1

Davies, Antony N. "Standard Exchange Formats for Spectral Data." In Handbook of Chemoinformatics, 446–65. Weinheim, Germany: Wiley-VCH Verlag GmbH, 2008. http://dx.doi.org/10.1002/9783527618279.ch15.

2

Margaria, Tiziana, Hafiz Ahmad Awais Chaudhary, Ivan Guevara, Stephen Ryan, and Alexander Schieweck. "The Interoperability Challenge: Building a Model-Driven Digital Thread Platform for CPS." In Lecture Notes in Computer Science, 393–413. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-89159-6_25.

Abstract:
With the heterogeneity of the Industry 4.0 world, and more generally of the Cyber-physical Systems realm, the quest towards a platform approach to solve the interoperability problem is front and centre to any system and system-of-systems project. Traditional approaches cover individual aspects, like data exchange formats and published interfaces. They may adhere to some standard; however, they hardly cover the production of the integration layer, which is implemented as bespoke glue code that is hard to produce and even harder to maintain. Therefore, the traditional integration approach often leads to poor code quality, further increasing the time and cost and reducing the agility, and a high reliance on individual development skills. We are instead tackling the interoperability challenge by building a model-driven/low-code Digital Thread platform that 1) systematizes the integration methodology, 2) provides methods and techniques for the individual integrations based on a layered Domain Specific Languages (DSL) approach, 3) through the DSLs covers the integration space domain by domain, technology by technology, and is thus highly generalizable and reusable, 4) showcases a first collection of examples from the domains of robotics, IoT, data analytics, AI/ML and web applications, 5) brings cohesiveness to the aforementioned heterogeneous platform, and 6) is easier to understand and maintain, even by non-specialized programmers. We showcase the power, versatility and potential of the Digital Thread platform on four interoperability case studies: the generic extension to REST services, to robotics through the UR family of robots, to the integration of various external databases (for data integration), and to the provision of data analytics capabilities in R.
3

Beyer, Dirk, and Karlheinz Friedberger. "Violation Witnesses and Result Validation for Multi-Threaded Programs." In Leveraging Applications of Formal Methods, Verification and Validation: Verification Principles, 449–70. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-61362-4_26.

Abstract:
Invariants and error traces are important results of a program analysis, and therefore, a standardized exchange format for verification witnesses is used by many program analyzers to store and share those results. This way, information about program traces and variable assignments can be shared across tools, e.g., to validate verification results, or provided to users, e.g., to visualize and explore the results in order to fix bugs or understand the reason for a program's correctness. The standard format for correctness and violation witnesses that was used by SV-COMP for several years was only applicable to sequential (single-threaded) programs. To enable the validation of results for multi-threaded programs, we extend the existing standard exchange format by adding information about thread management and thread interleaving. We contribute a reference implementation of a validator for violation witnesses in the new format, which we implemented as a component of our software-verification framework. We experimentally evaluate the format and validator on a large set of violation witnesses. The outcome is promising: several verification tools already produce violation witnesses that help validating the verification results, and our witness validator can re-verify most of the produced witnesses.
4

Beyer, Dirk, and Sudeep Kanav. "An Interface Theory for Program Verification." In Leveraging Applications of Formal Methods, Verification and Validation: Verification Principles, 168–86. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-61362-4_9.

Abstract:
Program verification is the problem, for a given program $P$ and a specification $\phi$, of constructing a proof of correctness for the statement "program $P$ satisfies specification $\phi$" ($P \models \phi$) or a proof of violation ($P \not\models \phi$). Usually, a correctness proof is based on inductive invariants, and a violation proof on a violating program trace. Verification engineers typically expect that a verification tool exports these proof artifacts. We propose to view the task of program verification as constructing a behavioral interface (represented e.g. by an automaton). We start with the interface $I_P$ of the program itself, which represents all traces of program executions. To prove correctness, we try to construct a more abstract interface $I_C$ of the program (overapproximation) that satisfies the specification. This interface, if found, represents more traces than $I_P$ that are all correct (satisfying the specification). Ultimately, we want a compact representation of the program behavior as a correctness interface $I_C$ in terms of inductive invariants. We can then extract a correctness witness, in standard exchange format, out of such a correctness interface. Symmetrically, to prove violation, we try to construct a more concrete interface $I_V$ of the program (underapproximation) that violates the specification. This interface, if found, represents fewer traces than $I_P$ that are all feasible (can be executed). Ultimately, we want a compact representation of the program behavior as a violation interface $I_V$ in terms of a violating program trace. We can then extract a violation witness, in standard exchange format, out of such a violation interface. This viewpoint exposes the duality of these two tasks: proving correctness and proving violation. It enables the decomposition of the verification process, and its tools, into (at least!) three components: interface synthesizers, refinement checkers, and specification checkers. We hope the reader finds this viewpoint useful, although the underlying ideas are not novel. We see it as a framework towards modular program verification.
5

Papakonstantinou, Mihalis, Manos Karvounis, Giannis Stoitsis, and Nikos Manouselis. "Deploying a Scalable Big Data Platform to Enable a Food Safety Data Space." In Data Spaces, 227–48. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-98636-0_11.

Abstract:
The main goal of this chapter is to share the technical details and best practices for setting up a scalable Big Data platform that addresses the data challenges of the food industry. The amount of data that is generated in our food supply chain is rapidly increasing. The data is published by hundreds of organizations on a daily basis, in many different languages and formats, making its aggregation, processing, and exchange a challenge. The efficient linking and mining of the global food data can enable the generation of insights and predictions that can help food safety experts to make critical decisions. All the food companies as well as national authorities and agencies may highly benefit from the data services of such a data platform. The chapter focuses on the architecture and software stack that was used to set up a data platform for a specific business use case. We describe how the platform was designed following data and technology standards to ensure the interoperability between systems and the interconnection of data. We share best practices on the deployment of data platforms such as identification of records, orchestrating pipelines, automating the aggregation workflow, and monitoring of a Big Data platform. The platform was developed in the context of the H2020 BigDataGrapes project, was awarded by communities such as Elasticsearch, and is further developed in the H2020 The Food Safety Market project in order to enable the setup of a data space for the food safety sector.
6

McGrath, Tim. "The Reality of Using Standards for Electronic Business Document Formats." In Handbook of Research on E-Business Standards and Protocols, 21–32. IGI Global, 2012. http://dx.doi.org/10.4018/978-1-4666-0146-8.ch002.

Abstract:
This chapter presents the challenges faced when developing and using standard formats for electronic business document exchange and tries to identify the real values and costs. As a reference it takes the OASIS Universal Business Language (UBL) and demonstrates how, despite the challenges, UBL can provide a common bridging format (sometimes called a “lingua franca”) for exchanging business information between different communities.
7

Yu, Shien-chiang, Hsueh-hua Chen, and Chao-chen Chen. "Dynamic Metadata Management System for Digital Archives." In Design and Usability of Digital Libraries, 55–75. IGI Global, 2005. http://dx.doi.org/10.4018/978-1-59140-441-5.ch004.

Abstract:
This chapter describes Metalogy, an XML/metadata framework that can handle several different metadata formats. Metalogy was developed under the Digital Museum Project funded by the National Science Council of Taiwan. It is common to have different data types and catalog formats even within one organization. In order to accommodate a variety of objects, it is often necessary to adopt several metadata formats. Thus, when designing a metadata management system, one needs to be able to handle heterogeneous metadata formats. XML, a standard gaining increasing popularity, is also often used as the data format so that data exchange can be done in a uniform way.
8

Xu, Xun. "Integration Based on STEP Standards." In Integrating Advanced Computer-Aided Design, Manufacturing, and Numerical Control, 246–65. IGI Global, 2009. http://dx.doi.org/10.4018/978-1-59904-714-0.ch011.

Abstract:
The integration model (Model B) discussed in the previous chapter makes use of exchangeable neutral data formats such as IGES (1980). Neutral data formats provide a middle tier to connect CAD and CAM systems. Thus, Model B can create a collaborative manufacturing environment and make design data exchange possible for large projects at the international level. Yet some problems still remain. IGES was designed to exchange geometrical information only, so additional design or manufacturing information (such as feature information) within a proprietary model is ignored. During data exchange, some information may go astray during data transfer; geometry stitching or model repair is often needed. In addition, IGES is not an international standard. As previously discussed, there are also problems common to both Models A and B (Figure 10.1). Different data formats (e.g. IGES and ISO 6983-1, 1982) are used in the design-to-manufacturing chain. Data loss occurs in the transition from design to manufacturing because only low-level, step-by-step sequential machining commands are passed on to the CNC controllers, leaving the complete product model behind. Of particular significance has been the endeavour made by the International Organization for Standardization to introduce the STEP standard (i.e. ISO 10303-1 [1994]). Major aerospace and automotive companies have proven the value of STEP through production implementations resulting in savings of US $150 million per year (Gallaher, O'Connor & Phelps, 2002; PDES, Inc., 2006). Moreover, STEP has recently been extended to cater to manufacturing data modelling and execution, with an aim to fill the information gap between CAD/CAPP/CAM and CNC. The standard is informally known as STEP-compliant Numerical Control, or STEP-NC for short. It was given the ISO name "ISO 14649: Data model for Computerized Numerical Controllers" (ISO 14649-1, 2003), which defines the STEP-NC Application Reference Model. With STEP being extended to model manufacturing information, a new paradigm of integrated CAD/CAPP/CAM/CNC is emerging, as illustrated in Figure 11.1. The key to this paradigm is that no data conversion is required and the data throughout the design and manufacturing chain are preserved. This chapter focuses on the use of STEP standards to support data exchange between CAD systems as well as to facilitate data flow between CAD, CAPP, CAM, and CNC systems. Also discussed are the specific integration issues between CAD and CAPP, CAPP and CAM, and CAM and CNC using STEP standards. The STEP-NC data model is a relatively new member of the STEP family, but it completes the entire suite of STEP standards from design to NC machining. Both the Physical File Implementation Method (ISO 10303-21, 1994) and the XML Implementation Method (ISO/TS 10303-28, 2004) are presented as the two popular ways of implementing STEP and STEP-NC.
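To illustrate the Part 21 physical file implementation method mentioned above, the snippet below writes a skeletal ISO 10303-21 exchange structure from Python; the header values, schema name, and the single entity instance are placeholders rather than content from a particular application protocol.

```python
# Skeletal ISO 10303-21 ("Part 21") exchange structure; all values are placeholders.
part21 = """ISO-10303-21;
HEADER;
FILE_DESCRIPTION(('example part'),'2;1');
FILE_NAME('example.stp','2024-01-01T00:00:00',('author'),('organization'),
  'preprocessor','originating_system','');
FILE_SCHEMA(('AUTOMOTIVE_DESIGN'));
ENDSEC;
DATA;
#1=CARTESIAN_POINT('origin',(0.,0.,0.));
ENDSEC;
END-ISO-10303-21;
"""

with open("example.stp", "w") as step_file:
    step_file.write(part21)
```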
9

Chiang, Chia-Chu. "Engineering Information Into Open Documents." In Open Information Management, 9–19. IGI Global, 2009. http://dx.doi.org/10.4018/978-1-60566-246-6.ch002.

Abstract:
Documents are perfectly suited for information exchange via the Internet. In order to ensure that there are no misunderstandings, information embedded in a document needs to be precise and unambiguous. Having a (de facto) standard data model and conceptual information model ensures that the involved parties will agree on what the information means. XML (eXtensible Markup Language) has become the de facto standard format for representing information in documents for document exchange. Many techniques have been proposed to create XML documents, including the validation and transformation of XML documents. However, very little is discussed when it comes to extracting information from non-XML documents and engineering the information into XML documents. The extraction process can be a highly labor-intensive task if it is done manually. The use of automated tools would make the process more efficient. In this chapter, the author will briefly survey document engineering techniques for XML documents. Then, the author will present two techniques to extract data from Windows documents into XML documents. These two techniques have been successfully applied in two industrial projects. He believes that techniques that automate the extraction of data from non-XML documents into XML formats will definitely enhance the use of XML documents.
10

Fomin, Vladis, and Kalle Lyytinen. "How to Distribute a Cake before Cutting It into Pieces." In Information Technology Standards and Standardization, 222–39. IGI Global, 2000. http://dx.doi.org/10.4018/978-1-878289-70-4.ch014.

Abstract:
This article analyses social networks by looking at standard making processes. As a framework for analysis, actor network theory is chosen. Standards are of particular interest for actor network theory because they provide mechanisms to align the interests of multiple social groups organized in networks that have a joint incentive in working with the standards and/or associated technologies. These social groups include scientific communities, government institutions and social movements (industrial groups, companies, and consumers) that are interested in regulating and innovating with new technologies. Standards provide the mechanisms to inscribe subsequent behaviors that are expected to become persistent over time. The standard making process is a social process: actors are involved in a continuous negotiation of their interests. Due to this fact, standards became an object of analysis for scholars within the social shaping of technology (SST) theory. Though scholars of this school usually take standards as material objects, they interpret technology as such, e.g., a bicycle or a steam engine. In the Information Technology (IT) domain, standards are intangible: electronic data exchange formats, communications protocols, signalling protocols, etc. Wireless and mobile communications in particular, being a large field of IT, represent an interesting case for analysis. Present in mobile telephony's domain are de jure (e.g., GSM) and de facto standards (e.g., NMT). The broad scope and large scale of standardization processes also suggest a non-unified pattern of standard making and a complex organizational structure. Making mobile telephony standards successful implies large networks and numerous mandatory passage points. In this paper we apply analysis based on actor network theory (ANT) to the development of the NMT wireless standards. Researchers interested in IT standardization, except for a few studies on electronic data interchange (EDI) by Hanseth (1997), have overlooked this approach. The acronym NMT stands for Nordisk MobilTelefon (Nordic Mobile Telephone), and it can be historically regarded as one of the best examples of Nordic cooperation in technology, as NMT systems have spread quite widely around the world and formed an important stepping stone for the evolution of the GSM standards. We chose an ANT analysis of the NMT standard making process to learn about the usefulness of the theoretical framework and to understand the standard making process of NMT as a social and institutional change. In our opinion this, more than anything else, explains the success of this interesting historical incident that changed the telecommunication industry radically and made Scandinavia a powerhouse of wireless technologies. Our approach is expected to bring more understanding of how the enthusiasm of a small number of actors fostered the successful development of the NMT cellular telephony standard. At the same time, the NMT standard was based on the concepts and visions of its developers. Yet it was these visions and engagements that led to distributing the big cake of the cellular world even before cutting it into pieces. The outline of the chapter is the following. In the next section, we discuss past theoretical analyses of the topic. Then we introduce new notions into ANT, such as a layer and a multilayered structure. Next we tell the story of the Nordic radio engineers' gang. We then analyze the NMT standard's development process as an instance of actor network mobilization. Some insights into future developments of cellular mobile communications, from both the technological and social perspectives, are also provided.

Conference papers on the topic "Standard exchange formats"

1

Ramachandran, Suresh. "Lab2Lab Data Exchange Using ATML." In NCSL International Workshop & Symposium. NCSL International, 2014. http://dx.doi.org/10.51843/wsproceedings.2014.46.

Abstract:
Electronic exchange of information between businesses (B2B) has reached a level of maturity over the years, establishing a backbone for electronic commerce built on top of standards for data exchange such as UN/EDIFACT, ebXML and cXML. Standardizing the electronic exchange of calibration information is becoming more and more relevant as assets are distributed across the world with limited centralization, even within organizations. There are a number of vendor-specific lab management tools and automated test equipment with a wide array of formats for capturing and storing calibration information. A protocol and standard that allows for the electronic exchange of laboratory information will reduce the need for paper-based calibration certificates and will help reduce the need for centralized asset tracking. This paper aims to cover the need for evolving a standard for the electronic exchange of calibration data across laboratories via the Internet in a secure fashion. This document covers the requirements and some of the use cases for real-time exchange of calibration information. The paper describes some of the approaches to building these standards using existing technology and infrastructure, such as ATML and web services. The paper addresses some of the challenges and known issues of sending information through the Internet.
2

Over, H. H., T. Ojala, P. Ha¨hner, and T. Austin. "XML Related MatDB Tools for Data Exchange and Interoperability." In ASME 2011 Pressure Vessels and Piping Conference. ASMEDC, 2011. http://dx.doi.org/10.1115/pvp2011-57152.

Abstract:
The web-enabled materials properties database MatDB of the European Commission Joint Research Centre (EC-JRC) is a database application for the storage, retrieval and evaluation of experimentally measured materials data coming from European R&D projects. Data exchange and interoperability are important database issues for reducing the costs of expensive material tests. Many organizations worldwide are participating in the development of GEN IV reactors. To reduce costs, the GEN IV International Forum has agreed to interoperate and exchange data for the screening and qualification of candidate materials. To simplify the complexity of data mapping between differently structured databases, adoption of a standardized XML schema is the favored option. The paper focuses on MatDB XML-related tools and items: the upgrade, extension and implementation of the MatDB XML schema within a planned US/EC cooperation; European standardization activities for data exchange, interoperability and the development of standard formats for engineering materials data; and MatDB DataCite participation.
3

Pan, Chunxia, and Shana Smith. "Extracting Geometrical Data From CAD STEP Files." In ASME 2003 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2003. http://dx.doi.org/10.1115/detc2003/cie-48224.

Abstract:
Most CAD tools currently do not have advanced capability built in for directly analyzing the feasibility of product assembly during the design stage. As a result, the feasibility of product assembly has to be analyzed using external assembly analysis tools. To integrate external assembly analysis tools with CAD tools, geometrical data from the CAD models must be extracted from CAD files and imported into the assembly analysis tools. To transfer geometrical data between CAD tools and assembly analysis tools, a neutral file format is needed. STEP (Standard for the Exchange of Product Model Data) is a neutral file format for convenient and reliable data exchange between different design and manufacturing systems, and is therefore considered one of the most popular formats for saving designs. As a result, extracting geometrical data from CAD STEP files is important for evaluating designs. Compared with some other currently available methods, first translating the STEP file into XML and then using the Java DOM to get geometrical information from the XML file is much better. This paper explores the process of extracting geometrical data from a CAD STEP file for assembly sequence planning. A STEPXML translator, Java XML parser and DTD (document type definition) generator are used on the JDK 1.4 platform. In this project, DOM (Document Object Model) is applied as the API (Application Programming Interface) for XML.
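The abstract above describes translating a STEP file into XML and walking the result with a DOM parser (Java DOM on JDK 1.4 in the paper). The snippet below sketches the same idea using Python's DOM module with a hypothetical element layout; the tag and attribute names are illustrative assumptions.

```python
from xml.dom.minidom import parseString

# Hypothetical XML translated from a STEP file; the schema is illustrative.
document = parseString("""
<step_model>
  <cartesian_point id="10" x="0.0" y="0.0" z="0.0"/>
  <cartesian_point id="11" x="25.0" y="0.0" z="5.0"/>
</step_model>
""")

# Walk the DOM and collect point coordinates for downstream assembly analysis.
points = []
for node in document.getElementsByTagName("cartesian_point"):
    points.append((float(node.getAttribute("x")),
                   float(node.getAttribute("y")),
                   float(node.getAttribute("z"))))
print(points)
```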
4

Banfi, Fabrizio, Jacopo Alberto Bonini, Alessandro Mandelli, and Stefano Marco De Gennaro. "BIM INTEROPERABILITY: OPEN BIM-BASED WORKFLOW FOR HERITAGE BUILDING INFORMATION MODELLING (HBIM). A MULTIDISCIPLINARY APPROACH BASED ON ADVANCED 3D TOOLS AND EXCHANGE FORMATS." In ARQUEOLÓGICA 2.0 - 9th International Congress & 3rd GEORES - GEOmatics and pREServation. Editorial Universitat Politécnica de Valéncia: Editorial Universitat Politécnica de Valéncia, 2021. http://dx.doi.org/10.4995/arqueologica9.2021.12104.

Abstract:
In recent years we have witnessed how technology applied to built heritage has exponentially changed the daily practices of the various experts involved in the life cycle of buildings. The techniques of representation of historical architecture have been able to make use of new 3D survey tools as well as research methods capable of managing a large amount of data while improving the level of information (LOI) and accuracy of the surveyed artefacts. On the other hand, professionals still have to make use of a large number of exchange formats in order to share their digital representations (3D, 2D) and analyses. For this reason, this paper describes the research approach followed to obtain "standard" architectural representations of a heritage building in the Cultural Heritage domain. The word "standard" is used in its original meaning: "something established by authority, custom, or general consent as a model or example" (Collins Dictionary). In this context, 3D models have a primary role in the workflow because their position is in between the 3D survey techniques that come first and the restoration/maintenance activities that follow. The authors' view is that the workflow should be as smooth and sustainable as possible to achieve effective standardization and collaboration among the disciplines, sectors and technicians working in the different study areas.
5

Lubell, Joshua, Russell S. Peak, Vijay Srinivasan, and Stephen C. Waterbury. "STEP, XML, and UML: Complementary Technologies." In ASME 2004 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2004. http://dx.doi.org/10.1115/detc2004-57743.

Abstract:
One important aspect of product lifecycle management (PLM) is the computer-sensible representation of product information. Over the past fifteen years or so, several languages and technologies have emerged that vary in their emphasis and applicability for such usage. ISO 10303, informally known as the Standard for the Exchange of Product Model Data (STEP), contains the high-quality product information models needed for electronic business solutions based on the Extensible Markup Language (XML). However, traditional STEP-based model information is represented using languages that are unfamiliar to most application developers. This paper discusses efforts underway to make STEP information models available in universal formats familiar to most business application developers: specifically XML and the Unified Modeling Language™ (UML®). We also present a vision and roadmap for future STEP integration with XML and UML to enable enhanced PLM interoperability.
6

Zhang, Jingsheng, and Shana Smith. "Shape Similarity Matching With Octree Representations." In ASME 2006 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2006. http://dx.doi.org/10.1115/detc2006-99397.

Abstract:
Shape matching is one of the fundamental problems in content-based 3D shape retrieval. Since there are typically a large number of possible matches in a shape database, there is a crucial need to perform shape matching efficiently. As a result, shapes must be reduced to a simpler shape representation, and computational complexity is one of the most important criteria for evaluating 3D shape representations. To meet this need, the investigators have implemented a new, effective and efficient approach for 3D shape matching, which uses a simplified octree representation of 3D mesh models. The simplified octree representation was developed to improve time and space efficiency over prior representations. In addition, octree representations are rapidly becoming the standard file format for delivering 3D content across the Internet. The proposed approach stores octree information in XML files, rather than using a new data file type, to facilitate comparing models over the Internet. New methods for normalizing models, generating octrees, and comparing models were developed. The proposed approach allows users to efficiently exchange shape information and compare models over the Internet, in standardized data and data file formats, without transferring exact model files. The proposed approach is the first step in a project which will build a complete 3D model database and data retrieval system, which can be incorporated with other data mining techniques.
7

Sim, Susan Elliott, Ric Holt, and Rainer Koschke. "Workshop on standard exchange format (WoSEF) (workshop session)." In the 22nd international conference. New York, New York, USA: ACM Press, 2000. http://dx.doi.org/10.1145/337180.337825.

8

Saele, H. "ODEL - standard format for exchange of metering and settlement information." In 16th International Conference and Exhibition on Electricity Distribution (CIRED 2001). IEE, 2001. http://dx.doi.org/10.1049/cp:20010929.

9

Ye, Peng, Yonggang Zhang, Yanglan Wang, Geying Huang, and Lianshui Guo. "A Method of Parametric Model Data Exchange and Reconstruction Based on Feature Script." In ASME 2021 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2021. http://dx.doi.org/10.1115/imece2021-69067.

Abstract:
In collaborative product development (CPD), iterative design with heterogeneous CAD systems is always completed by original equipment manufacturers (OEMs) and other suppliers around the world. Therefore, product data exchanges among different heterogeneous CAD systems are crucial in CPD. However, the major bottleneck is that parametric part models of complex products cannot be exchanged among heterogeneous CAD systems. Published by the International Organization for Standardization (ISO), the Standard for the Exchange of Product Model Data (STEP) is the most common solution for geometry-based data exchanges and product annotations in the manufacturing industry. However, exchanged STEP parts cannot be modified due to the loss of important information, including modeling history, parameters and constraints, which carry the design intent. Therefore, the challenge of modifying exchanged parts needs to be overcome for iterative design in heterogeneous CAD systems in the manufacturing industry. In this paper, a method of part parametric model data exchange and reconstruction based on feature script is proposed. The main idea is features, which include reference features (like datum-plane, datum-axis, and datum-point), shape features (like extrude and rotation), auxiliary features (like fillet and chamfer), and sketch features. These features are taken as the fundamental elements of a part in a CAD system, and each is defined by feature script and saved as an extensible markup language (XML) node. The feature modeling history of a part is regarded as a tree structure, which can be converted to an XML tree structure. In part model reconstruction, the proposed method utilizes a persistent naming mechanism and geometric data to identify topological entities, and then realizes the persistence of constraint information and parameter driving of the part model. The proposed method will be applied to part data exchange and parametric model reconstruction among heterogeneous CAD systems in the following process: (i) the feature tree and parameters of the parametric model are obtained in the source CAD system; (ii) the feature model data is translated into feature script and saved as an XML file; (iii) the feature script file is read, and the parametric model is reconstructed in the target CAD system according to the modeling history. In order to verify the accuracy of data exchange and model reconstruction between Creo and CATIA, a method of constraint equivalence analysis and feature comparison is also proposed for analyzing the similarities and differences between source and target parametric models. In addition, a case study is presented to demonstrate the potential of the proposed method for parametric-feature model exchanges among heterogeneous CAD systems in web-based collaborative product development.
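The feature-script idea described above (features serialized as XML nodes, with the ordered tree preserving the modeling history) can be pictured with the small sketch below; the element and attribute names are assumptions, not the paper's schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical feature-script serialization of a part's modeling history.
part = ET.Element("part", name="bracket")

sketch = ET.SubElement(part, "feature", type="sketch", id="f1", plane="datum_top")
ET.SubElement(sketch, "parameter", name="width", value="40")
ET.SubElement(sketch, "parameter", name="height", value="20")

extrude = ET.SubElement(part, "feature", type="extrude", id="f2", profile="f1")
ET.SubElement(extrude, "parameter", name="depth", value="10")

ET.SubElement(part, "feature", type="fillet", id="f3", edges="f2:top", radius="2")

# Replaying the ordered feature elements in a target CAD system keeps the model
# parametric and editable after the exchange.
print(ET.tostring(part, encoding="unicode"))
```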
10

Bender, Andreas, Karl Haesler, Claus Thomas, and Jaroslaw Grochowicz. "Development of Universal Brake Test Data Exchange Format and Evaluation Standard." In SAE 2010 Annual Brake Colloquium And Engineering Display. 400 Commonwealth Drive, Warrendale, PA, United States: SAE International, 2010. http://dx.doi.org/10.4271/2010-01-1698.


Reports on the topic "Standard exchange formats"

1

Grote, H., J. Holt, N. Malitsky, F. Pilat, R. Talman, and C. G. Trahern. SXF (Standard eXchange Format): definition, syntax, examples. Office of Scientific and Technical Information (OSTI), June 1998. http://dx.doi.org/10.2172/1119545.

2

East, E. W. The Standard Data Exchange Format for Critical Path Method Scheduling. Fort Belvoir, VA: Defense Technical Information Center, September 1995. http://dx.doi.org/10.21236/ada303566.
