
Journal articles on the topic 'Data element mapping'


Consult the top 50 journal articles for your research on the topic 'Data element mapping.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Chen, Ya-Ning. "A RDF-based approach to metadata crosswalk for semantic interoperability at the data element level." Library Hi Tech 33, no. 2 (June 15, 2015): 175–94. http://dx.doi.org/10.1108/lht-08-2014-0078.

Abstract:
Purpose – The purpose of this paper is to propose a Resource Description Framework (RDF)-based approach to transform metadata crosswalking from equivalent lexical element mapping into semantic mapping with various contextual relationships. RDF is used as a crosswalk model to represent the contextual relationships implicitly embedded between described objects and their elements, including semantic, hierarchical, granular, syntactic and multiple object relationships to achieve semantic metadata interoperability at the data element level.
Design/methodology/approach – This paper uses RDF to translate metadata elements and their relationships into semantic expressions, and also as a data model to define the syntax for element mapping. The feasibility of the proposed approach for semantic metadata crosswalking is examined based on two use cases – the Archives of Navy Ships Project and the Digital Artifacts Project of National Palace Museum in Taipei – both from the Taiwan e-Learning and Digital Archives Program.
Findings – As the model developed is based on RDF-based expressions, unsolved issues related to crosswalking, such as sets of shared terms, and contextual relationships embedded between described objects and their metadata elements could be manifested into a semantic representation. Corresponding element mapping and mapping rules can be specified without ambiguity to achieve semantic metadata interoperability.
Research limitations/implications – Five steps were developed to clarify the details of the RDF-based crosswalk. The RDF-based expressions can also serve as a basis from which to develop linked data and Semantic Web applications. More use cases including biodiversity artifacts of natural history museums and literary works of libraries, and conditions, constraints and cardinality of metadata data elements will be required to make revisions to fine tune the proposed RDF-based metadata crosswalk.
Originality/value – In addition to reviving contextual relationships embedded between described objects and their metadata elements, nine types of mapping rules were developed to achieve a semantic metadata crosswalk which will facilitate the design of related mapping software. Furthermore, the proposed approach complements existing crosswalking documents provided by authoritative organizations, and enriches mapping language developed by the CIDOC community.
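To make the crosswalk idea above concrete, here is a minimal sketch (not taken from the paper; the namespace, element names and relationship properties are illustrative assumptions) of how an element-level mapping with an explicit contextual relationship can be stated as RDF triples with rdflib and serialized as Turtle.

```python
# Minimal, hypothetical sketch of a metadata crosswalk expressed as RDF triples.
# Requires: pip install rdflib
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import RDF, RDFS, SKOS

EX = Namespace("http://example.org/crosswalk/")      # hypothetical local scheme
DC = Namespace("http://purl.org/dc/elements/1.1/")   # Dublin Core elements

g = Graph()
g.bind("ex", EX)
g.bind("dc", DC)

# A local element "objectTitle" is declared and mapped to dc:title; the kind of
# relationship (semantic equivalence vs. hierarchical/granular link) is recorded
# explicitly instead of assuming a bare one-to-one lexical match.
g.add((EX.objectTitle, RDF.type, EX.MetadataElement))
g.add((EX.objectTitle, RDFS.label, Literal("Object title")))
g.add((EX.objectTitle, SKOS.exactMatch, DC.title))        # semantic equivalence
g.add((EX.objectPart, SKOS.broadMatch, EX.objectWhole))   # granular relationship

print(g.serialize(format="turtle"))
```
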
2

Zhang, Yan. "Researching about Performance of Archiving History Data Based on XML under Hadoop." Applied Mechanics and Materials 380-384 (August 2013): 2482–85. http://dx.doi.org/10.4028/www.scientific.net/amm.380-384.2482.

Abstract:
This paper analyses the fundamental problem of archiving historical data in a data warehouse and proposes applying XML to the archiving process. An algorithm for constructing a level directed graph is provided, and an algorithm for mapping the database structure to an XML Schema based on the level directed graph is implemented. The ways to describe all level elements, including the base element, table element, record element and field element, and the relationships between them, are also provided. The experimental results show that the method is efficient.
3

Wright, Sue Ellen, and Gerhard Budin. "Data elements in terminological entries." Terminology 1, no. 1 (January 1, 1994): 41–59. http://dx.doi.org/10.1075/term.1.1.05wri.

Abstract:
Differing theoretical and methodological views and working-group needs have spawned a wide diversity in the content, layout and internal structure of terminological entries in database environments, which in turn complicates standardization and data interchange. Major criticisms lodged against the data element list provided in ISO 6156 (MATER) prompted the authors to conduct an empirical examination of over thirty existing databases to ascertain which data elements are truly used in practice (as opposed to those which are espoused or rejected in theory). Their results reveal that designations of data elements, like other terminological products, are subject to the vagaries of polysemy and synonymy. They conclude that, given the widespread differences in approach evidenced in existing databases, the most practical approach to data element concerns during interchange is to compile an open-ended dictionary of common data element types for use as a mapping device during the data preparation stage.
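As a rough illustration of the proposed "open-ended dictionary of common data element types", the sketch below (field names and categories are hypothetical, not taken from the surveyed databases) shows how such a dictionary could act as a mapping device during data preparation.

```python
# Hypothetical sketch: an open-ended dictionary of common data element types used
# to normalise divergent termbase field names before interchange.
COMMON_ELEMENT_TYPES = {
    "term": "term",
    "entryTerm": "term",           # synonymy: different field names, same element type
    "defin": "definition",
    "definition": "definition",
    "subjField": "subject field",
    "domain": "subject field",
}

def map_field(source_field: str) -> str:
    """Return the shared data element type for a database-specific field name."""
    try:
        return COMMON_ELEMENT_TYPES[source_field]
    except KeyError:
        # Open-ended: unknown names are flagged for review and added later.
        return f"UNMAPPED({source_field})"

record = {"entryTerm": "crosswalk", "domain": "information science"}
print({map_field(k): v for k, v in record.items()})
# {'term': 'crosswalk', 'subject field': 'information science'}
```
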
4

Yan, Ge, Heqin Cheng, Lizhi Teng, Wei Xu, Yuehua Jiang, Guoqiang Yang, and Quanping Zhou. "Analysis of the Use of Geomorphic Elements Mapping to Characterize Subaqueous Bedforms Using Multibeam Bathymetric Data in River System." Applied Sciences 10, no. 21 (October 30, 2020): 7692. http://dx.doi.org/10.3390/app10217692.

Abstract:
Riverbed micro-topographical features, such as crest and trough, flat bed, and scour pit, indicate the evolution of fluvial geomorphology, and have an influence on the stability of underwater structures and overall scour pits. Previous studies on bedform feature extraction have focused mainly on the rhythmic bed surface morphology and have extracted crest and trough, while flat bed and scour pit have been ignored. In this study, to extend the feature description of riverbeds, geomorphic elements mapping was used by employing three geomorphic element classification methods: Wood’s criteria, a self-organization map (SOM) technique, and geomorphons. The results showed that geomorphic element mapping can be controlled by adjusting the slope tolerance and curvature tolerance of Wood’s criteria, using the map unit number and combination of the SOM technique and the flatness of geomorphons. Relatively flat bed can be presented using “plane”, “flat planar”, and “flat” elements, while scour pit can be presented using a “pit” element. A comparison of the difference between parameter settings for landforms and bedforms showed that SOM using 8 or 10 map units is applicable for land and underwater surface and is thus preferentially recommended for use. Furthermore, the use of geomorphons is recommended as the optimal method for characterizing bedform features because it provides a simple element map in the absence of area loss.
5

Hudecová, Ľubica. "Mapping as a Spatial Data Source." Slovak Journal of Civil Engineering 21, no. 1 (March 1, 2013): 24–30. http://dx.doi.org/10.2478/sjce-2013-0004.

Abstract:
The basic database for a geographic information system (BD GIS) forms the core of a national spatial data infrastructure. Nowadays decisions are being made about the potential data sources for additional data updates and refinement of the BD GIS. Will the data from departmental or other information system administrators serve for this purpose? This paper gives an answer as to whether it is advisable to use “geodetic mapping” (the results realized in the process of land consolidation) or “cadastral mapping” (the results realized in the process of the renewal of cadastral documentation by new mapping) for additional data updates. In our analysis we focus on the quality parameters at the individual data element level, namely the positional accuracy, attribute accuracy, logical consistency, and data resolution. The results of the analysis are compared with the contents of the Object Class Catalog of BD GIS (OCC), which describes the group of objects managed by BD GIS and defines the data collection methods, types of geometry and its properties.
6

Aldaoud, Manar, Dawood Al-Abri, Medhat Awadalla, and Firdous Kausar. "Data Structure and Management Protocol to Enhance Name Resolving in Named Data Networking." Future Internet 16, no. 4 (March 30, 2024): 118. http://dx.doi.org/10.3390/fi16040118.

Abstract:
Named Data Networking (NDN) is a future Internet architecture that requires an Inter-Domain Routing (IDR) to route its traffic globally. Address resolution is a vital component of any IDR system that relies on a Domain Name System (DNS) resolver to translate domain names into their IP addresses in TCP/IP networks. This paper presents a novel two-element solution to enhance name-to-delivery location resolution in NDN networks, consisting of (1) a mapping table data structure and a searching mechanism and (2) a management protocol to automatically populate and modify the mapping table. The proposed solution is implemented and tested on the Peer Name Provider Server (PNPS) mapping table, and its performance is compared with two other algorithms: component and character tries. The findings show a notable enhancement in the operational speed of the mapping table when utilizing the proposed data structure. For instance, the insertion process is 37 times faster compared to previous algorithms.
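The PNPS mapping table and management protocol themselves are not reproduced here; as a generic illustration of name-to-delivery-location resolution over hierarchical names, the sketch below (hypothetical entries) keys a mapping table by name components and answers queries by longest-prefix match.

```python
# Generic sketch of a name-to-delivery-location mapping table for hierarchical
# names (e.g. /domain/app/content), resolved by longest-prefix match.
from typing import Optional

table = {}  # tuple of name components -> delivery location

def insert(name: str, location: str) -> None:
    table[tuple(name.strip("/").split("/"))] = location

def resolve(name: str) -> Optional[str]:
    parts = tuple(name.strip("/").split("/"))
    # Try the full name first, then progressively shorter prefixes.
    for length in range(len(parts), 0, -1):
        location = table.get(parts[:length])
        if location is not None:
            return location
    return None

insert("/example/videos", "gateway-A")          # hypothetical entries
insert("/example/videos/live", "gateway-B")
print(resolve("/example/videos/live/stream1"))  # gateway-B (longest matching prefix)
print(resolve("/example/docs"))                 # None (no mapping registered)
```
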
7

Shang, Jian Dong, Ya Peng Zhang, and Dong Fang Hu. "The Contrast Study on the Method of STEP/XML Data Transformation." Advanced Materials Research 403-408 (November 2011): 4103–7. http://dx.doi.org/10.4028/www.scientific.net/amr.403-408.4103.

Abstract:
Heterogeneous data sharing has always been a bottleneck in the development of collaborative design. The STEP/XML element mapping method achieves data transformation between STEP and XML by establishing a mapping relationship between the two data formats, but it is hard to ensure that all elements map one to one, and partial data loss is easily caused. Based on this research, this paper puts forward a new method using data encapsulation. It encapsulates the whole STEP data set into XML by using XML character data, and the data are then uploaded to the collaborative design platform. Another designer can download it and recover the original STEP data by using an XML parser. This method is not only easy to realize, but can also ensure the integrity of the product model data conversion, and it is more conducive to achieving collaborative design.
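A minimal sketch of the data-encapsulation idea described above (file names are hypothetical): the whole STEP file is wrapped in an XML character-data (CDATA) section instead of being mapped element by element, and is recovered verbatim with an XML parser.

```python
# Minimal sketch: encapsulate an entire STEP file as XML character data (CDATA)
# rather than mapping STEP entities to XML elements one by one.
from xml.dom import minidom

step_text = open("part_model.stp", encoding="utf-8").read()  # hypothetical STEP file

doc = minidom.Document()
root = doc.appendChild(doc.createElement("STEPPackage"))
payload = root.appendChild(doc.createElement("STEPData"))
# The STEP data is kept verbatim; note that CDATA cannot itself contain "]]>".
payload.appendChild(doc.createCDATASection(step_text))

with open("part_model.xml", "w", encoding="utf-8") as out:
    out.write(doc.toxml())

# A collaborator recovers the original STEP data losslessly with an XML parser.
recovered = minidom.parse("part_model.xml").getElementsByTagName("STEPData")[0]
original_step = recovered.firstChild.data
assert original_step == step_text
```
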
8

Ledford, Allison, Susan Askew, and Edward Huang. "DATA ELEMENT MAPPING AND ANALYSIS (DEMA) TO ENABLE SYSTEMATIC MODEL CREATION USING SYSML." INCOSE International Symposium 34, no. 1 (July 2024): 2373–83. http://dx.doi.org/10.1002/iis2.13275.

Abstract:
Data Element Mapping and Analysis (DEMA) represents a new and systematic methodology for the standardized capture, mapping, and analysis of data threads essential for comprehending digital systems and their architecture. This research studies the synergies between DEMA and Systems Modeling Language (SysML). The study aims to enable the systematic creation of digital artifacts using SysML. The results show that DEMA can serve as a complementary tool, enhancing the creation of SysML models by improving knowledge capture and verification processes. The prototype DEMA to SysML framework presented in this work provides the foundation for a systematic methodology to enable the systematic creation of models using SysML.
9

Fournelle, J., C. Davidson, F. Spear, M. Kohn, and H. Guo. "Trace Element Mapping of Minerals and Materials by Electron Microprobe." Microscopy and Microanalysis 5, S2 (August 1999): 630–31. http://dx.doi.org/10.1017/s1431927600016470.

Abstract:
A strength of the modern electron microprobe is its ability to provide 2D compositional information about materials. These images give the ability to observe features that might otherwise pass unseen. Elements at the trace element level are generally ignored due to the high detection limits imposed by mapping under “standard” EMP conditions. Trace element mapping requires beam regulation at high (e.g. 300 nA) to very high (e.g. 3 μA) faraday cup currents, reliable beam and stage control, and suitable samples and mounting media. The ability to operate at high accelerating voltage to maximize Pk2/Bkg is desirable (Robinson and Graham, 1992), although we have encountered column difficulties above 25-30 kV. We are mapping trace and minor elements including Y, Sc, P, Cr, Mn, Ca, in garnets. Fig. 1 shows Y, Sc and Cr maps (Spear and Kohn, 1996; Kohn, Spear and Valley, 1997), and Fig. 2 Y and Sc maps (Cameron, unpub. data), produced with a Cameca SX51.
10

Bektaş, Başak Aldemir. "Use of Recursive Partitioning to Predict National Bridge Inventory Condition Ratings from National Bridge Elements Condition Data." Transportation Research Record: Journal of the Transportation Research Board 2612, no. 1 (January 2017): 29–38. http://dx.doi.org/10.3141/2612-04.

Abstract:
In the United States, National Bridge Inventory (NBI) condition ratings, since the 1970s, and AASHTO’s commonly recognized (CoRe) element condition data, since the 1990s, have provided two major sources of bridge condition data. Although these separate systems of condition assessment had their individual uses, comparing the two, and mapping one from the other had uses for both state and federal agencies and the bridge management community. Alternative methods for this mapping have been proposed in the literature with varying predictive accuracy. With the publication of the new AASHTO Manual for Bridge Element Inspection in 2013, national bridge elements (NBEs) replace the CoRe element condition data as the comparable condition data for the NBI condition ratings. This paper investigates the use of the recursive partitioning method to develop classification trees that predict NBI condition ratings from NBE condition data. On the basis of data from a 2016 submission and 12 transportation agencies, classification trees were developed that presented the most likely NBI condition ratings for a set of logical conditions based on the relative element quantities and the percentage of element quantities in the condition states. The predictive accuracies for the trees are sufficient, and the percentages of exact matches and matches within one error term are better than other studies in the literature. Although the trees can be improved in the future with the availability of more NBE data submissions, the study presented preliminary decision trees with sufficient predictive accuracy that could be adopted by transportation agencies for a variety of bridge management functions.
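For orientation, here is a schematic example of the recursive-partitioning step on synthetic data (not the paper's 12-agency dataset or its published trees): a classification tree predicts an NBI condition rating from the share of element quantity in each NBE condition state.

```python
# Schematic example on synthetic data: a recursive-partitioning (decision tree)
# model that predicts an NBI condition rating from NBE condition-state shares.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
# Features: fraction of element quantity in condition states CS1..CS4.
X = rng.dirichlet(alpha=[4, 2, 1, 0.5], size=200)
# Toy stand-in for observed ratings: more quantity in CS3/CS4 -> lower rating.
y = np.clip(9 - np.round(8 * (X[:, 2] + X[:, 3])), 3, 8).astype(int)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["CS1", "CS2", "CS3", "CS4"]))
print("predicted NBI rating:", tree.predict([[0.70, 0.20, 0.08, 0.02]])[0])
```
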
11

Qi, Yang, Wang Zhulin, and Wang Hongyun. "Standard Description of Common Element Model Data Type Based On XML Schema." MATEC Web of Conferences 173 (2018): 01038. http://dx.doi.org/10.1051/matecconf/201817301038.

Abstract:
The AI-ESTATE standard uses the EXPRESS language to describe its diagnostic knowledge, but EXPRESS is not a programming language, which makes it difficult to implement diagnostics from an EXPRESS description and, in turn, difficult to share and reuse diagnostic knowledge. The XML language, with its good flexibility, readability and extensibility, has brought great convenience to information exchange, so mapping the EXPRESS language to XML Schema is of great significance for the sharing of diagnostic knowledge. First, the data types of the EXPRESS language and XML Schema are analysed; then, the mapping mechanisms from the simple, aggregate and structural data types of EXPRESS to XML data types are studied separately. Finally, based on an analysis of the data types of the AI-ESTATE common element model, the process of describing the diagnostic knowledge of the common element model using XML Schema in a standardized way is studied, so that the diagnostic knowledge can be made portable and reusable.
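As a toy illustration of the type-mapping step (the EXPRESS-to-XSD correspondences shown are one reasonable choice, not necessarily the paper's exact mapping), the sketch below emits an xs:element declaration for an entity attribute.

```python
# Toy sketch of mapping EXPRESS simple data types to XML Schema types and
# emitting an xs:element declaration for one attribute of an entity.
EXPRESS_TO_XSD = {            # one reasonable choice of target XSD types
    "INTEGER": "xs:integer",
    "REAL":    "xs:double",
    "NUMBER":  "xs:decimal",
    "STRING":  "xs:string",
    "BOOLEAN": "xs:boolean",
    "LOGICAL": "xs:string",    # TRUE/FALSE/UNKNOWN has no direct XSD counterpart
    "BINARY":  "xs:hexBinary",
}

def xsd_element(name: str, express_type: str, aggregate: bool = False) -> str:
    """Render an xs:element for an EXPRESS attribute; aggregates become repeatable."""
    occurs = ' maxOccurs="unbounded"' if aggregate else ""
    return f'<xs:element name="{name}" type="{EXPRESS_TO_XSD[express_type]}"{occurs}/>'

print(xsd_element("testResult", "REAL"))
print(xsd_element("outcomeList", "STRING", aggregate=True))   # e.g. LIST OF STRING
```
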
12

Zhang, Youqi, Zhenkun Li, Rui Hao, Weiwei Lin, Lingfang Li, and Di Su. "High-fidelity time-series data synthesis based on finite element simulation and data space mapping." Mechanical Systems and Signal Processing 200 (October 2023): 110630. http://dx.doi.org/10.1016/j.ymssp.2023.110630.

13

Treccani, D., and A. Adami. "SINGLE BUILDING POINT CLOUD SEGMENTATION: TOWARDS URBAN DATA MODELING AND MANAGEMENT." International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLVIII-1/W1-2023 (May 25, 2023): 511–16. http://dx.doi.org/10.5194/isprs-archives-xlviii-1-w1-2023-511-2023.

Abstract:
To manage urban areas, a key step is the development of a geometric survey and its subsequent analysis and processing in order to provide useful information, and to become a good basis for urban modeling. Surveys of urban areas can be developed with various technologies, such as Aerial Laser Scanning, Unmanned Aerial Systems photogrammetry, and Mobile Mapping Systems. To make the resulting point clouds useful for subsequent steps, it is necessary to segment them into classes representing urban elements. On the other hand, there are 2D land representations that provide a variety of information related to the elements in the urban environment, which are linked to databases that have information content related to them. In this context, the element identified as interesting for urban management of the built heritage is the individual building unit. This paper presents an automated method for using map datasets to segment individual building units on a point cloud of an urban area. A unique number is then assigned to the segmented points, linking them directly to the corresponding element in the map database. The resulting point cloud thus becomes a container of the information in the map database, and a basis for possible city modeling. The method was successfully tested on the historic city of Sabbioneta (northern Italy), using two point clouds, one obtained through the use of a Mobile Mapping System and one obtained with Unmanned Aerial System photogrammetry. Two cartographic databases were used, one open source (OpenStreetMap) and one provided by the regional authorities (regional cartographic database).
14

Susilo, B., and A. Cahyono. "Utilization of online geospatial data sources for oikonym study: mapping and analysis of housing name in capital area of kulon progo regency." IOP Conference Series: Earth and Environmental Science 1089, no. 1 (November 1, 2022): 012030. http://dx.doi.org/10.1088/1755-1315/1089/1/012030.

Abstract:
Oikonymy is the branch of toponymy that focuses on the names given to inhabited places. In the past, the availability of data was one of the obstacles in the study of toponyms, including oikonyms. Nowadays, the development of digital mapping and information technology, particularly the internet, enables oikonym data to be obtained from a variety of sources. This study aimed to explore the typology of housing names as well as the geographical characteristics of housing locations by means of mapping and spatial analysis. The data required for mapping and analysis were obtained via the internet and are therefore referred to as online geospatial data sources. Housing names were analyzed based on their generic and specific elements and the language of origin used for naming. Spatial analyses, i.e., 3D analysis and network analysis, were performed to obtain the geographical characteristics of the housing locations. This study shows that about 57% of housing developments with a generic name use an indigenous element, i.e., the local language. In addition, about 80% of housing developments use an indigenous element for their specific names. The housing developments are mostly located on low land and gentle slopes. On average, the distance from the housing developments to the centre of the capital area is 3.3 km, and the road density is 6.6 km/km2.
15

Kholit, Noviar Jamaal, and Muhamad Nastain. "Mapping of data communication networks on social media." INJECT (Interdisciplinary Journal of Communication) 5, no. 2 (January 27, 2021): 143–62. http://dx.doi.org/10.18326/inject.v5i2.143-162.

Abstract:
Information technology is developing very fast, and this has a real impact on every element of life. In addition to affecting the information media industry, developments in information technology have brought changes to public spaces, with easier access and an increasingly massive pattern of information distribution. Ease of access does not always present a positive side; there is also a negative side, namely shifting communication patterns through the spread of false information or disinformation that invites public upheaval. This research uses the case study method, which is one way to investigate contemporary phenomena in the context of real life, where the boundaries between the phenomenon and the context are not clearly visible. Through the Social Network Analyzer (SNA) theoretical approach, this research identifies three communication network patterns, namely a centralized network, a decentralized network and a distributed network.
16

Zu, Hai Ying, Mi Tian, and Jia Xuan Han. "Reverse Modeling Analysis and Design on Rubber Sealing Element in Spherical BOP." Advanced Materials Research 655-657 (January 2013): 300–304. http://dx.doi.org/10.4028/www.scientific.net/amr.655-657.300.

Abstract:
The spherical BOP is one of the crucial pieces of equipment for snubbing operations in the petroleum industry, and its core component is the rubber sealing element. Because the connection between the spherical surface of the metal skeleton and the rubber is an irregular surface, it is difficult to determine the model by manual surveying and mapping. This paper puts forward a design method for the spherical BOP rubber sealing element based on reverse engineering technology. Through analysis of the basic requirements for reverse surveying and mapping of the spherical BOP rubber sealing element, its data acquisition path and process are determined, and solutions are put forward for data merging, data transfer and the method of pixel processing.
17

Yang, Han, Bin Wang, Stephen Grigg, Ling Zhu, Dandan Liu, and Ryan Marks. "Acoustic Emission Source Location Using Finite Element Generated Delta-T Mapping." Sensors 22, no. 7 (March 24, 2022): 2493. http://dx.doi.org/10.3390/s22072493.

Abstract:
One of the most significant benefits of Acoustic Emission (AE) testing over other Non-Destructive Evaluation (NDE) techniques lies in its damage location capability over a wide area. The delta-T mapping technique developed by researchers has been shown to enable AE source location to a high level of accuracy in complex structures. However, the time-consuming and laborious data training process of the delta-T mapping technique has prevented this technique from large-scale application on large complex structures. In order to solve this problem, a Finite Element (FE) method was applied to model training data for localization of experimental AE events on a complex plate. Firstly, the FE model was validated through demonstrating consistency between simulated data and the experimental data in the study of Hsu-Nielsen (H-N) sources on a simple plate. Then, the FE model with the same parameters was applied to a planar location problem on a complex plate. It has been demonstrated that FE generated delta-T mapping data can achieve a reasonable degree of source location accuracy with an average error of 3.88 mm whilst decreasing the time and effort required for manually collecting and processing the training data.
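A toy sketch of the final location step only (the grid, sensor pairs and delta-T values are synthetic; in the paper they come from the FE model or from manual training): the AE source is placed at the training-grid node whose mapped delta-T vector best matches the measured one.

```python
# Toy sketch of the delta-T mapping location step on synthetic values: compare a
# measured vector of arrival-time differences against a precomputed delta-T map.
import numpy as np

rng = np.random.default_rng(4)
n_grid, n_pairs = 2500, 6            # 50 x 50 training grid, 6 sensor pairs
grid_xy = np.stack(np.meshgrid(np.linspace(0, 0.5, 50),
                               np.linspace(0, 0.5, 50)), axis=-1).reshape(-1, 2)
delta_t_map = rng.normal(0.0, 20e-6, (n_grid, n_pairs))   # FE-generated in practice

true_node = 1234
measured = delta_t_map[true_node] + rng.normal(0.0, 1e-6, n_pairs)  # noisy AE event

# Locate the source at the grid node whose mapped delta-T best matches the event.
errors = np.sum((delta_t_map - measured) ** 2, axis=1)
best = int(np.argmin(errors))
print("estimated source position (m):", grid_xy[best], "correct node:", best == true_node)
```
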
18

Kesteren, A. R. van. "Forest type distribution on a calcareous terrain in western Newfoundland." Forestry Chronicle 72, no. 2 (April 1, 1996): 185–92. http://dx.doi.org/10.5558/tfc72185-2.

Abstract:
Terrain factors influencing forest type distribution on a calcareous terrain in western Newfoundland were investigated. Landform elements were mapped at a scale of 1:12,500 utilizing air photo interpretation. Minimum and maximum elevation data along with dominant forest type occurrence were determined in the field. Frequencies of landform element and forest type correspondence were tested using a log-linear G2 analysis. Additionally, elevational differences of both landform elements and forest types were analyzed using the Kruskal-Wallis test. Null hypotheses of no significant landform influence on forest type distribution and no significant elevational differentiation of landform elements were rejected. However, no significant direct elevational differentiation of forest types was detected. Results are supportive of the observations of Damman (1967), indicating a primary toposequence control on forest type distribution. Verified forest type–landform associations could aid the development of a statistically based phytogeomorphic mapping system for forest land use management in Newfoundland. Key words: forest type, landform element, phytogeomorphic mapping, air photo interpretation
19

Lee, Wan-Ping, Jiantao Wu, and Gabor T. Marth. "Toolbox for Mobile-Element Insertion Detection on Cancer Genomes." Cancer Informatics 13s4 (January 2014): CIN.S13979. http://dx.doi.org/10.4137/cin.s13979.

Abstract:
Mobile elements constitute greater than 45% of the human genome as a result of repeated insertion events during human genome evolution. Although most of mobile elements are fixed within the human population, some elements (including ALU, long interspersed elements (LINE) 1 (L1), and SVA) are still actively duplicating and may result in life-threatening human diseases such as cancer, motivating the need for accurate mobile-element insertion (MEI) detection tools. We developed a software package, TANGRAM, for MEI detection in next-generation sequencing data, currently serving as the primary MEI detection tool in the 1000 Genomes Project. TANGRAM takes advantage of valuable mapping information provided by our own MOSAIK mapper, and until recently required MOSAIK mappings as its input. In this study, we report a new feature that enables TANGRAM to be used on alignments generated by any mainstream short-read mapper, making it accessible for many genomic users. To demonstrate its utility for cancer genome analysis, we have applied TANGRAM to the TCGA (The Cancer Genome Atlas) mutation calling benchmark 4 dataset. TANGRAM is fast, accurate, easy to use, and open source on https://github.com/jiantao/Tangram .
20

Lee, Wan-Ping, Jiantao Wu, and Gabor T. Marth. "Toolbox for Mobile-Element Insertion Detection on Cancer Genomes." Cancer Informatics 14s1 (January 2015): CIN.S24657. http://dx.doi.org/10.4137/cin.s24657.

Abstract:
Mobile elements constitute greater than 45% of the human genome as a result of repeated insertion events during human genome evolution. Although most of mobile elements are fixed within the human population, some elements (including ALU, long interspersed elements (LINE) 1 (L1), and SVA) are still actively duplicating and may result in life-threatening human diseases such as cancer, motivating the need for accurate mobile-element insertion (MEI) detection tools. We developed a software package, TANGRAM, for MEI detection in next-generation sequencing data, currently serving as the primary MEI detection tool in the 1000 Genomes Project. TANGRAM takes advantage of valuable mapping information provided by our own MOSAIK mapper, and until recently required MOSAIK mappings as its input. In this study, we report a new feature that enables TANGRAM to be used on alignments generated by any mainstream short-read mapper, making it accessible for many genomic users. To demonstrate its utility for cancer genome analysis, we have applied TANGRAM to the TCGA (The Cancer Genome Atlas) mutation calling benchmark 4 dataset. TANGRAM is fast, accurate, easy to use, and open source on https://github.com/jiantao/Tangram .
21

Herbomel, P., A. Rollier, F. Tronche, M. O. Ott, M. Yaniv, and M. C. Weiss. "The rat albumin promoter is composed of six distinct positive elements within 130 nucleotides." Molecular and Cellular Biology 9, no. 11 (November 1989): 4750–58. http://dx.doi.org/10.1128/mcb.9.11.4750-4758.1989.

Abstract:
No fewer than six different positive regulatory elements concentrated within 130 base pairs constitute the rat albumin promoter, which drives highly tissue specific transcription in rat hepatoma cells in culture. Inactivation of each element led to a decrease in transcriptional efficiency: from upstream to downstream, 3- to 4-fold for distal elements III and II, 15-fold for distal element I, and 50-fold for the CCAAT box and the proximal element (PE). Three of these elements, distal elements III and II and, more crucially, the PE, were found to be involved in the tissue-specific character of transcription, with an additional negative regulation possibly superimposed at the level of the PE. Finally, our mapping of these regulatory elements in vivo entirely coincided with footprint data obtained in vitro, thereby allowing the tentative assignment of specific factors to the effects observed in vivo.
22

Herbomel, P., A. Rollier, F. Tronche, M. O. Ott, M. Yaniv, and M. C. Weiss. "The rat albumin promoter is composed of six distinct positive elements within 130 nucleotides." Molecular and Cellular Biology 9, no. 11 (November 1989): 4750–58. http://dx.doi.org/10.1128/mcb.9.11.4750.

Abstract:
No fewer than six different positive regulatory elements concentrated within 130 base pairs constitute the rat albumin promoter, which drives highly tissue specific transcription in rat hepatoma cells in culture. Inactivation of each element led to a decrease in transcriptional efficiency: from upstream to downstream, 3- to 4-fold for distal elements III and II, 15-fold for distal element I, and 50-fold for the CCAAT box and the proximal element (PE). Three of these elements, distal elements III and II and, more crucially, the PE, were found to be involved in the tissue-specific character of transcription, with an additional negative regulation possibly superimposed at the level of the PE. Finally, our mapping of these regulatory elements in vivo entirely coincided with footprint data obtained in vitro, thereby allowing the tentative assignment of specific factors to the effects observed in vivo.
23

Auenhammer, Robert M., Niels Jeppesen, Lars P. Mikkelsen, Vedrana A. Dahl, and Leif E. Asp. "X-ray computed tomography data structure tensor orientation mapping for finite element models — STXAE." Software Impacts 11 (February 2022): 100216. http://dx.doi.org/10.1016/j.simpa.2021.100216.

24

Zheng, Hewei, Xueying Zhao, Hong Wang, Yu Ding, Xiaoyan Lu, Guosi Zhang, Jiaxin Yang, et al. "Location deviations of DNA functional elements affected SNP mapping in the published databases and references." Briefings in Bioinformatics 21, no. 4 (August 2, 2019): 1293–301. http://dx.doi.org/10.1093/bib/bbz073.

Abstract:
The recent extensive application of next-generation sequencing has led to the rapid accumulation of multiple types of data for functional DNA elements. With the advent of precision medicine, the fine-mapping of risk loci based on these elements has become of paramount importance. In this study, we obtained the human reference genome (GRCh38) and the main DNA sequence elements, including protein-coding genes, miRNAs, lncRNAs and single nucleotide polymorphism flanking sequences, from different repositories. We then realigned these elements to identify their exact locations on the genome. Overall, 5%–20% of all sequence element locations deviated among databases, on the scale of kilobase-pair to megabase-pair. These deviations even affected the selection of genome-wide association study risk-associated genes. Our results implied that the location information for functional DNA elements may deviate among public databases. Researchers should take care when using cross-database sources and should perform pilot sequence alignments before element location-based studies.
25

Gezgin, H., and R. M. Alkan. "DETECTION AND RECOGNITION OF TRAFFIC SIGNS FROM DATA COLLECTED BY THE MOBILE MAPPING SYSTEM." International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLVIII-4/W9-2024 (March 8, 2024): 183–88. http://dx.doi.org/10.5194/isprs-archives-xlviii-4-w9-2024-183-2024.

Abstract:
Autonomous vehicles and high-resolution maps are key elements of future transport systems. Detection and recognition of traffic signs is an important element for the safe driving of autonomous vehicles and the development of high-resolution maps. In this study, the aim is to accurately detect and identify traffic signs based on the data collected by the mobile mapping system in order to ensure the safe movement of autonomous vehicles in traffic. A low-cost method is proposed with the ResNet-50 model for an autonomous vehicle to automatically detect and recognise traffic signs while moving on the road. As a result of the model training, 0.99 accuracy and 0.016 loss were obtained. The success of the method was first observed on images randomly selected from the dataset. Then, a real-time test was performed on a low-cost webcam. The tests showed that the proposed method detects and identifies traffic signs quickly and accurately.
26

Sukarno, Bangkit Rambu, and Muhamad Ahsan. "Implementasi Strategi Pengembangan Bisnis Dengan Business Model Canvas." Jurnal Manajemen dan Inovasi (MANOVA) 4, no. 2 (August 25, 2021): 51–61. http://dx.doi.org/10.15642/manova.v4i2.456.

Abstract:
This study aims to determine the mapping of existing business strategies and develop new business strategies into a business model canvas. The method used is descriptive qualitative. Data were collected through observation, documentation, and interviews. The collected data were analyzed using SWOT analysis and the results were elaborated into the nine elements of the Business Model Canvas. The results showed that the business strategy obtained from the Business Model Canvas mapping is good enough because each element supports the others to increase revenue. The practical implication is that several strategies must be improvised to increase revenue, including by developing key resources, increasing the cost structure for advertising, and persuading customers to become resellers as key partnerships, customer relationships, and channels.
27

Luo, Fujun, and Yousong Zhao. "Design of cooperative update mechanism of national geographic conditions monitoring and basic surveying and mapping." Abstracts of the ICA 1 (July 15, 2019): 1–2. http://dx.doi.org/10.5194/ica-abs-1-231-2019.

Abstract:
National geographic conditions monitoring and basic surveying and mapping are two important tasks of the surveying and mapping department, and they are similar in production organization and technology realization. In the process of operation, both of them need to carry out internal collection, base map production, field verification and so on. It is operationally feasible to carry out cooperative production of national geographic conditions monitoring and basic surveying and mapping. From the perspective of technical process and method, both of them are carried out by a combination of internal and field work. Firstly, based on remote sensing images and thematic geographic data, the internal work will perform image interpretation and obtain staged results data. Then, the field verification will be carried out to make judgments and adjustments. Finally, the results of the field verification will be transferred back to the internal work, and the data will be further edited and organized in the internal work to obtain the final data.

Basic surveying and mapping focuses on abstract representation of the real world, but lacks comprehensive integration of information and in-depth knowledge mining. National geographic condition monitoring focuses on the spatial distribution, characteristics and interrelations of natural and human geographical elements on the surface. There are many differences between basic surveying and mapping and national geographic conditions monitoring in the content and index of data collection, data stratification and element attributes. But basic surveying and mapping results are the basic data for national geographic conditions monitoring, and national geographic conditions monitoring data is an important update data source for basic surveying and mapping.

On the one hand, part of the geographic information can be updated on the basis of extracting relevant basic geographic information element data and attribute information. On the other hand, timely basic geographic information data can be used as the direct basis for the collection of geographic information.

This paper designs the technical methods and workflow of the cooperative update mechanism based on the relevant technical documents of national geographic conditions monitoring and basic surveying and mapping. It will enable one-time acquisition of the data needed for national geographic conditions monitoring and basic surveying and mapping: "one-time collection, classification and utilization". It will save a lot of time and effort, reduce workload and improve productivity.
28

Zhang, Li Hua. "Data Integration of CAD, CAE with CAM for Composite Structures." Advanced Materials Research 139-141 (October 2010): 1294–98. http://dx.doi.org/10.4028/www.scientific.net/amr.139-141.1294.

Abstract:
Digital composite structures definition is the basis for the data integration of CAD, CAE and CAM for composite structures. The key of digital composite structures definition is the modeling of material structures. In this paper the procedure of material structures modeling and contents of laminate lay-up definition data have been summarized briefly. Today composite structures can not be analyzed with true fiber orientations. True fiber orientations of discrete triangle elements have been used to approximate the final state of ply fibers and a XML file has been used to describe laminate lay-up definition data. Furthermore, automatic mapping of fiber orientation data to the finite element mesh based on user specified tolerances has been used to obtain the automatic data transfer from CAD software to CAE software. Finally, the data integration of the CAD software with two manufacturing systems has been presented.
29

Schwarzer, R. A., and M. Wehrhahn. "X-Ray Scanning Apparatus for Mapping Texture and Element Distributions." Textures and Microstructures 29, no. 1-2 (January 1, 1997): 65–76. http://dx.doi.org/10.1155/tsm.29.65.

Abstract:
The past decade has seen some remarkable progress in spatially resolved texture analysis due to new computer-aided techniques of electron diffraction with the SEM and TEM. To avoid, however, some limitations specific for electron microscopy, an x-ray scanning apparatus has been developed for the mapping of texture and element distributions on bulk samples. The set-up consists of a “white” x-ray source, a collimator system to produce a fine primary beam spot, an x-y sample stage operated by stepper motors, and an EDX detecting system for peak separation. Energy dispersive x-ray diffraction and x-ray fluorescence analysis are used for data acquisition. The density distributions of selected crystallographic directions or of element concentrations in the sample surface are acquired spot by spot, and represented by pseudocolour or grey shade maps. Several texture distribution as well as element composition maps can be obtained simultaneously. Spatial resolution is presently limited to 50 μm by the low level of primary beam intensity.
30

Urban, Marcel, Sören Hese, Martin Herold, Stefan Pöcking, and Christiane Schmullius. "Pan-Arctic Land Cover Mapping and Fire Assessment for the ESA Data User Element Permafrost." Photogrammetrie - Fernerkundung - Geoinformation 2010, no. 4 (August 1, 2010): 283–93. http://dx.doi.org/10.1127/1432-8364/2010/0056.

31

Rolph, W. D. "Requirements for finite element model generation from CAD data—An approach using numerical conformal mapping." Computers & Structures 56, no. 2-3 (July 1995): 515–22. http://dx.doi.org/10.1016/0045-7949(95)00041-e.

32

Steentoft, H., and W. Rabbel. "Seismic mapping of the marmousi data set with the common reflecting element method (CRE method)." Tectonophysics 232, no. 1-4 (April 1994): 355–63. http://dx.doi.org/10.1016/0040-1951(94)90096-5.

33

Muhammad Fadhil, Aditya Prayoga, Andi Eriawan, Erwin Sulaeman, and Ari Legowo. "Aerodynamic Pressure Mapping Technique from CFD to FEM Model of N219 Winglet." CFD Letters 13, no. 12 (December 17, 2021): 90–99. http://dx.doi.org/10.37934/cfdl.13.12.9099.

Abstract:
Due to the relatively complex geometry of the N219 winglet, CFD simulations have to be conducted to predict the aerodynamic load carried by the structure in some critical flight conditions. Since the aerodynamic CFD model is not the same as the finite element model of the structure, there is a need to accurately transform the load data between the two models. This paper discusses a simple alternative technique to map the pressure distribution from the mesh or face zone of a CFD simulation to an FEM model using a Matlab-based in-house code. The technique focuses on giving an FEM shell element the same pressure value as its nearest CFD element. Although the cumulative forces sometimes give a different result, the pressure distribution is highly accurate, particularly when the FEM model has smoother elements. Validation has been conducted by comparing with the pressure mapping technique of the commercial software Patran. The results show a good agreement, and the present technique provides a more accurate result, especially for the critical largest load among the cumulative forces in the three-dimensional directions. The proposed technique is currently suitable for evaluating the loading characteristics of semi-monocoque structures. A further treatment of the technique for other types of structure is currently under development.
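A generic sketch of the nearest-element idea with synthetic centroids (this is not the authors' Matlab in-house code): each FEM shell element takes the pressure of the closest CFD face centroid.

```python
# Generic sketch of nearest-element pressure mapping on synthetic data: each FEM
# shell element is assigned the pressure of the closest CFD face centroid.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)
cfd_centroids = rng.random((5000, 3))                # CFD face-zone centroids (x, y, z)
cfd_pressure = rng.normal(101325.0, 500.0, 5000)     # pressure per CFD face [Pa]
fem_centroids = rng.random((800, 3))                 # FEM shell-element centroids

tree = cKDTree(cfd_centroids)
_, nearest = tree.query(fem_centroids)               # nearest CFD face per FEM element
fem_pressure = cfd_pressure[nearest]                 # mapped pressure load per element

# A consistency check would compare cumulative forces sum(p_i * A_i) on both
# meshes once element areas are known; only the mapped pressures are shown here.
print(fem_pressure[:5])
```
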
34

Kong, Yunhui, Guodong Chen, Bingli Liu, Miao Xie, Zhengbo Yu, Cheng Li, Yixiao Wu, et al. "3D Mineral Prospectivity Mapping of Zaozigou Gold Deposit, West Qinling, China: Machine Learning-Based Mineral Prediction." Minerals 12, no. 11 (October 26, 2022): 1361. http://dx.doi.org/10.3390/min12111361.

Abstract:
This paper focuses on researching the scientific problem of deep extraction and inference of favorable geological and geochemical information about mineralization at depth, based on which a deep mineral resources prediction model is established and machine learning approaches are used to carry out deep quantitative mineral resources prediction. The main contents include: (i) discussing the method of 3D geochemical anomaly extraction under the multi-fractal content-volume (C-V) models, extracting the 12 element anomalies and constructing a 3D geochemical anomaly data volume model for laying the data foundation for researching geochemical element distribution and association; (ii) extracting the element association characteristics of primary geochemical halos and inferring deep metallogenic factors based on compositional data analysis (CoDA), including quantitatively extracting the geochemical element associations corresponding to ore-bearing structures (Sb-Hg) based on a data-driven CoDA framework, quantitatively identifying the front halo element association (As-Sb-Hg), near-ore halo element association (Au-Ag-Cu-Pb-Zn) and tail halo element association (W-Mo-Co-Bi), which provide quantitative indicators for the primary haloes’ structural analysis at depth; (iii) establishing a deep geological and geochemical mineral resources prediction model, which is constructed by five quantitative mineralization indicators as input variables: fracture buffer zone, element association (Sb-Hg) of ore-bearing structures, metallogenic element Au anomaly, near-ore halo element association Au-Ag-Cu-Pb-Zn and the ratio of front halo to tail halo (As-Sb-Hg)/(W-Mo-Bi); and (iv) three-dimensional MPM based on the maximum entropy model (MaxEnt) and Gaussian mixture model (GMM), and delineating exploration targets at depth. The results show that the C-V model can identify the geological element distribution and the CoDA method can extract geochemical element associations in 3D space reliably, and the machine learning methods of MaxEnt and GMM have high performance in 3D MPM.
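As a schematic illustration of the Gaussian mixture part of step (iv) only, run on synthetic indicator values rather than the Zaozigou data, the sketch below fits a mixture model to voxel feature vectors and ranks voxels by model density as a crude prospectivity score.

```python
# Schematic sketch on synthetic data: scoring 3D model voxels with a Gaussian
# mixture model fitted to the quantitative mineralisation indicators.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
# Five indicator values per voxel (all synthetic), standing in for the fault
# buffer, Sb-Hg association, Au anomaly, near-ore halo and front/tail halo ratio.
voxels = rng.random((10_000, 5))

gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
gmm.fit(voxels)

# Ranking by overall model density is a crude proxy for prospectivity; a real
# workflow would relate components to known ore-bearing voxels first.
scores = gmm.score_samples(voxels)
targets = np.argsort(scores)[-100:]          # 100 highest-scoring voxels
print("candidate voxel indices:", targets[:10])
```
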
35

Tapuria, Archana, Philipp Bruland, Brendan Delaney, Dipak Kalra, and Vasa Curcin. "Comparison and transformation between CDISC ODM and EN13606 EHR standards in connecting EHR data with clinical trial research data." DIGITAL HEALTH 4 (January 2018): 205520761877767. http://dx.doi.org/10.1177/2055207618777676.

Abstract:
Objectives: Integrating Electronic Health Record (EHR) systems into the field of clinical trials still contains several challenges and obstacles. Heterogeneous standards and specifications are used to represent healthcare and clinical trial information. Therefore, this work investigates the mapping and data interoperability between healthcare and research standards: EN13606 used for the EHRs and the Clinical Data Interchange Standards Consortium Operational Data Model (CDISC ODM) used for clinical research.
Methods: Based on the specifications of CDISC ODM 1.3.2 and EN13606, a mapping between the structure and components of both standards has been performed. Archetype Definition Language (ADL) forms built with the EN13606 editor were transformed to ODM XML and reviewed. As a proof of concept, clinical sample data has been transformed into ODM and imported into an electronic data capture system. Reverse transformation from ODM to ADL has also been performed and finally reviewed concerning map-ability.
Results: The mapping between EN13606 and CDISC ODM shows the similarities and differences between the components and overall record structure of the two standards. An EN13606 archetype corresponds with a group of items within CDISC ODM. Transformations of element names, descriptions, different languages, datatypes, cardinality, optionality, units, value range and terminology codes are possible from EN13606 to CDISC ODM and vice versa.
Conclusion: It is feasible to map data elements between EN13606 and CDISC ODM and transformation of forms between ADL and ODM XML format is possible with only minor limitations. EN13606 can accommodate clinical information in a more structured manner with more constraints, whereas CDISC ODM is more suitable and specific for clinical trials and studies. It is feasible to transform EHR data in the EN13606 form to ODM to transfer it into research database. The attempt to use EN13606 to build a study protocol (that was already built with CDISC ODM) also suggests the possibility of using EN13606 standard in place of CDISC ODM if needed to avoid transformations.
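A toy sketch of the element-level transformation direction discussed above, rendering an EN13606-style element description as a CDISC ODM ItemDef; the attribute values and type mapping are illustrative and do not constitute a complete or validated ODM document.

```python
# Toy sketch: map one EHR data element description (EN13606-style) to a CDISC
# ODM ItemDef. Illustrative only; not a complete or validated ODM file.
import xml.etree.ElementTree as ET

ehr_element = {                 # simplified stand-in for an archetype ELEMENT node
    "code": "systolic_bp",
    "name": "Systolic blood pressure",
    "datatype": "QUANTITY",
    "units": "mmHg",
}

DATATYPE_MAP = {"QUANTITY": "float", "TEXT": "text", "DATE": "date"}  # partial mapping

item = ET.Element("ItemDef", {
    "OID": f"IT.{ehr_element['code']}",
    "Name": ehr_element["name"],
    "DataType": DATATYPE_MAP[ehr_element["datatype"]],
})
ET.SubElement(item, "Question")  # question text and translations would go here
ET.SubElement(item, "MeasurementUnitRef",
              {"MeasurementUnitOID": ehr_element["units"]})

print(ET.tostring(item, encoding="unicode"))
```
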
36

Higson, Theodore S., June E. Tessiatore, Sean A. Bennett, Raymond C. Derk, and Michael A. Kotarski. "The molecular organization of the Star/asteroid region, a region necessary for proper eye development in Drosophila melanogaster." Genome 36, no. 2 (April 1, 1993): 356–66. http://dx.doi.org/10.1139/g93-049.

Abstract:
The Star/asteroid (S/ast) region of Drosophila melanogaster has been cloned by P element transposon tagging using the snw chromosome as a source of defective P elements. In each mutation examined, the element integrated into the region was a 0.5-kb element from a region proximal to sn and not one of the head-to-head elements from the sn locus. Previously described spontaneous and X-ray induced mutations of S and ast were located on the molecular map by Southern analysis and restriction endonuclease mapping of genomic clones. S mutations are either large deletions of the cloned region or DNA breaks located near the P element insertions that cause ast mutations. Both S and ast mutations reduce the steady-state amounts of a 3.4-kb RNA. The molecular data, together with the phenotypic interactions observed for S and ast alleles, are consistent with the interpretation that S and ast mutations are lesions within the same gene or within genes that are functionally related.Key words: Drosophila, Star, asteroid, P elements.
37

Bocci, Giuliano, Valentina Bianchi, and Silvio Cruschina. "Mapping focus to prosody in Italian." Isogloss. Open Journal of Romance Linguistics 10, no. 7 (December 14, 2024): 1–25. https://doi.org/10.5565/rev/isogloss.422.

Abstract:
Italian wh-questions with bare wh-elements are characterized by an exceptional prosodic pattern, whereby the nuclear pitch accent (NPA) is assigned neither to the wh-element nor to the default rightmost position, but it rather falls on the lexical verb, even though this is not semantically interpreted as a focus. Based on evidence from production, in previous work we argue that the NPA assignment is a reflex of the cyclic syntactic derivation, being sensitive to a syntactic [focus] feature borne by the wh-phrase. The striking dissociation between the NPA and focal interpretation that emerges from the production data raises the question of whether Italian hearers are sensitive to this marked prosodic pattern in understanding a question. To address this question, we carried out a comprehension experiment, where we manipulated the position of the NPA in biclausal wh-questions including two verbs: the verb of the matrix clause and the embedded verb. The results of this experiment confirm the psychological reality of our theoretical analysis, suggesting that hearers exploit prosodic cues to parse the sentence and to assign the correct interpretation to structures that only differ at the surface level with respect to the position of the NPA.
38

Fiorillo, Graziano, and Hani Nassif. "Application of Machine Learning Techniques for the Analysis of National Bridge Inventory and Bridge Element Data." Transportation Research Record: Journal of the Transportation Research Board 2673, no. 7 (June 6, 2019): 99–110. http://dx.doi.org/10.1177/0361198119853568.

Abstract:
The MAP-21 Act requires information on bridge assets to be at the element level for management operations in the U.S.A. This approach has the objective of improving future predictions of the performance of bridge assets for a more precise evaluation of condition and correct allocation of management funds to keep bridges in a good state of repair. Although bridge element conditions were introduced in the 1990s, the application of such data had never been mandatory for bridge asset management until 2014, therefore, the amount of historical data on bridge element (BE) condition is still limited. On the other hand, National Bridge Inventory (NBI) ratings have been collected since the 1970s and a wide range of data are available. Therefore, it is natural to ask whether BE condition can be predicted using NBI data. In the past, researchers statistically related BE and NBI data, but little has been done to revert NBI to BE. This paper addresses both challenges of mapping BE–NBI condition data using several machine learning techniques. The results of the analysis of these techniques applied to a sample of about 9,000 bridges from northeastern states of the U.S.A. shows that between 79.8% and 100% of the NBI ratings for deck, superstructure, and substructure can be predicted within a rating error of ± 1. The back-mapping operation of NBI time-dependent ratings to BE deterioration profiles for deck, superstructure, and substructure can also be predicted accurately with a probability greater than 50% at the 95% confidence level.
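The two headline measures, exact matches and matches within one rating point, can be computed as in the small sketch below (synthetic ratings, not the study's data).

```python
# Small sketch of the two accuracy measures used when mapping element data to
# NBI ratings: exact matches and matches within one rating point.
import numpy as np

observed = np.array([7, 6, 5, 7, 8, 4, 6, 5, 7, 6])   # inspector NBI ratings (synthetic)
predicted = np.array([7, 5, 5, 6, 8, 5, 6, 5, 8, 6])  # ratings predicted from elements

exact = np.mean(predicted == observed)
within_one = np.mean(np.abs(predicted - observed) <= 1)
print(f"exact match: {exact:.0%}, within one rating point: {within_one:.0%}")
```
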
39

Szpakowski, Sebastian, James McCusker, and Michael Krauthammer. "Using Semantic Web Technologies to Annotate and Align Microarray Designs." Cancer Informatics 8 (January 2009): CIN.S2335. http://dx.doi.org/10.4137/cin.s2335.

Abstract:
In this paper, we annotate and align two different gene expression microarray designs using the Genomic ELement Ontology (GELO). GELO is a new ontology that leverages an existing community resource, Sequence Ontology (SO), to create views of genomically-aligned data in a semantic web environment. We start the process by mapping array probes to genomic coordinates. The coordinates represent an implicit link between the probes and multiple genomic elements, such as genes, transcripts, miRNA, and repetitive elements, which are represented using concepts in SO. We then use the RDF Query Language (SPARQL) to create explicit links between the probes and the elements. We show how the approach allows us to easily determine the element coverage and genomic overlap of the two array designs. We believe that the method will ultimately be useful for integration of cancer data across multiple omic studies. The ontology and other materials described in this paper are available at http://krauthammerlab.med.yale.edu/wiki/Gelo .
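In the same spirit, but with a deliberately simplified and hypothetical RDF layout rather than the GELO/SO modelling used in the paper, the sketch below runs a SPARQL query in rdflib to link probes to the genomic elements whose coordinates contain them.

```python
# Simplified, hypothetical sketch: link array probes to genomic elements whose
# coordinates contain them, using a SPARQL query over an rdflib graph.
from rdflib import Graph, Namespace, Literal, RDF

EX = Namespace("http://example.org/genome/")   # hypothetical namespace
g = Graph()

def add_feature(node, kind, chrom, start, end):
    """Record a genomic feature with chromosome and coordinate properties."""
    g.add((node, RDF.type, kind))
    g.add((node, EX.chrom, Literal(chrom)))
    g.add((node, EX.start, Literal(start)))
    g.add((node, EX.end, Literal(end)))

add_feature(EX.probe_001, EX.Probe, "chr1", 1150, 1210)
add_feature(EX.geneA, EX.Gene, "chr1", 1000, 2000)
add_feature(EX.geneB, EX.Gene, "chr2", 500, 900)

results = g.query("""
    PREFIX ex: <http://example.org/genome/>
    SELECT ?probe ?element WHERE {
        ?probe   a ex:Probe ; ex:chrom ?c ; ex:start ?ps ; ex:end ?pe .
        ?element a ex:Gene  ; ex:chrom ?c ; ex:start ?es ; ex:end ?ee .
        FILTER (?ps >= ?es && ?pe <= ?ee)
    }""")
for probe, element in results:
    print(probe, "lies within", element)
```
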
40

Ahmed, A. "GIS and Remote Sensing for Malaria Risk Mapping, Ethiopia." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XL-8 (November 27, 2014): 155–61. http://dx.doi.org/10.5194/isprsarchives-xl-8-155-2014.

Abstract:
Integrating malaria data into a decision support system (DSS) using Geographic Information System (GIS) and remote sensing tools can provide timely information, so that decision makers are prepared to make better and faster decisions that reduce damage and minimize loss. This paper attempts to assess and map malaria-prone areas, taking the most important natural factors into account. The input data were based on geospatial factors, including climatic, social and topographic aspects, drawn from secondary data. The objective of the study is to prepare malaria hazard, vulnerability and element-at-risk maps, which together give the final output, a malaria risk map. The malaria hazard analysis was computed using multi-criteria evaluation (MCE): environmental factors such as topographic factors (elevation, slope and flow distance to stream), land use/land cover and breeding sites were developed and weighted, and the weighted overlay technique was then applied in ArcGIS software to generate the malaria hazard map. The resulting malaria hazard map shows that 19.2%, 30.8%, 25.1%, 16.6% and 8.3% of the District fall into very high, high, moderate, low and very low malaria hazard areas, respectively. For the vulnerability analysis, health station locations and a speed constant in the Spatial Analyst module were used to generate factor maps. For the element at risk, the land use/land cover map was used to generate the element-at-risk map. Finally, the malaria risk map of the District was generated by overlaying the land use/land cover map (the element at risk in the District), the vulnerability map and the hazard map. The final output of this approach is a malaria risk map classified into five classes: very high-risk, high-risk, moderate-risk, low-risk and very low-risk areas. The risk map produced from the overlay analysis showed that 20.5%, 11.6%, 23.8%, 34.1% and 26.4% of the District were subject to very high, high, moderate, low and very low malaria risk, respectively. This helps in planning valuable measures for early warning and for monitoring, controlling and preventing malaria epidemics.
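A schematic sketch of the weighted-overlay step on synthetic rasters (the weights and class breaks are illustrative, not those used in the study): reclassified factor layers are combined into a five-class hazard map.

```python
# Schematic sketch of a weighted overlay on synthetic rasters: combine
# reclassified factor layers into a five-class hazard map.
import numpy as np

rng = np.random.default_rng(3)
shape = (100, 100)
# Factor layers already reclassified to a common 1-5 suitability scale.
elevation      = rng.integers(1, 6, shape)
slope          = rng.integers(1, 6, shape)
dist_to_stream = rng.integers(1, 6, shape)
land_cover     = rng.integers(1, 6, shape)
breeding_site  = rng.integers(1, 6, shape)

layers = np.stack([elevation, slope, dist_to_stream, land_cover, breeding_site])
weights = np.array([0.15, 0.15, 0.30, 0.20, 0.20])     # illustrative; must sum to 1

hazard_score = np.tensordot(weights, layers, axes=1)   # weighted overlay
# Slice the continuous score into five classes: very low ... very high.
hazard_class = np.digitize(hazard_score, bins=[1.8, 2.6, 3.4, 4.2]) + 1
print(np.unique(hazard_class, return_counts=True))
```
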
APA, Harvard, Vancouver, ISO, and other styles
41

Wu, Jiahao, Wei Han, Jia Chen, and Sheng Wang. "Improving Geological Remote Sensing Interpretation via Optimal Transport-Based Point–Surface Data Fusion." Remote Sensing 16, no. 1 (December 22, 2023): 53. http://dx.doi.org/10.3390/rs16010053.

Full text
Abstract:
High-quality geological remote sensing interpretation (GRSI) products play a vital role in a wide range of fields, including the military, meteorology, agriculture, the environment, mapping, etc. Due to the importance of GRSI products, this research aimed to improve their accuracy. Although deep-learning (DL)-based GRSI has reduced dependence on manual interpretation, the limited accuracy of multiple geological element interpretation still poses a challenge. This issue can be attributed to small inter-class differences, the uneven distribution of geological elements, sensor limitations, and the complexity of the environment. Therefore, this paper proposes a point–surface data optimal fusion method (PSDOF) to improve the accuracy of GRSI products based on optimal transport (OT) theory. PSDOF combines geological survey data (which has spatial location and geological element information called point data) with a geological remote sensing DL interpretation product (which has limited accuracy and is called surface data) to improve the quality of the resulting output. The method performs several steps to enhance accuracy. First, it calculates the gray-scale correlation feature information for the pixels adjacent to the geological survey points. Next, it determines the distribution of the feature information for geological elements in the vicinity of the point data. Finally, it incorporates complementary information from the survey points into the geological elements’ interpretation boundary, as well as calculates the optimal energy loss for point–surface fusion, thus resulting in an optimal boundary. The experiments conducted in this study demonstrated the superiority of the proposed model in addressing the problem of the limited accuracy of GRSI products.
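The core of PSDOF is an optimal-transport coupling between features sampled around geological survey points and the interpreted surface data. As a hedged, generic illustration of how such a coupling can be computed, the numpy sketch below runs plain Sinkhorn iterations between two small discrete distributions; it is a textbook entropy-regularized OT solver, not the authors' PSDOF fusion code, and the feature values are invented.

```python
import numpy as np

def sinkhorn(a, b, cost, reg=0.05, n_iter=200):
    """Entropy-regularized optimal transport between histograms a and b."""
    K = np.exp(-cost / reg)            # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)              # alternating scaling updates
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]  # transport plan

# Toy example: features at 3 survey points vs. 4 interpreted pixels.
point_feats = np.array([[0.1], [0.5], [0.9]])
pixel_feats = np.array([[0.0], [0.4], [0.6], [1.0]])
cost = np.abs(point_feats - pixel_feats.T)  # pairwise |x - y| cost matrix

a = np.full(3, 1 / 3)                  # uniform weights on survey points
b = np.full(4, 1 / 4)                  # uniform weights on pixels
plan = sinkhorn(a, b, cost)
print(plan.round(3))                   # soft point-to-pixel assignment
```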
APA, Harvard, Vancouver, ISO, and other styles
42

Dennis, Jack B. "Static Mapping of Functional Programs: An Example in Signal Processing." Scientific Programming 5, no. 2 (1996): 121–35. http://dx.doi.org/10.1155/1996/360960.

Full text
Abstract:
Complex signal-processing problems are naturally described by compositions of program modules that process streams of data. In this article we discuss how such compositions may be analyzed and mapped onto multiprocessor computers to effectively exploit the massive parallelism of these applications. The methods are illustrated with an example of signal processing for an optical surveillance problem. Program transformation and analysis are used to construct a program description tree that represents the given computation as an acyclic interconnection of stream-processing modules. Each module may be mapped to a set of threads run on a group of processing elements of a target multiprocessor. Performance is considered for two forms of multiprocessor architecture, one based on conventional DSP technology and the other on a multithreaded-processing element design.
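As a loose, language-agnostic analogue of mapping an acyclic composition of stream-processing modules onto threads, the Python sketch below runs each stage of a two-stage pipeline in its own thread, connected by queues. It illustrates only the module-to-thread idea; it is not the article's program description tree, its static mapping analysis, or a DSP implementation.

```python
import queue
import threading

def stage(fn, inq, outq):
    """Run one stream-processing module: apply fn to each item until None."""
    while (item := inq.get()) is not None:
        outq.put(fn(item))
    outq.put(None)  # propagate end-of-stream marker downstream

# Acyclic pipeline: input stream -> scale -> offset -> output stream.
q1, q2, q3 = queue.Queue(), queue.Queue(), queue.Queue()
threads = [
    threading.Thread(target=stage, args=(lambda x: 2 * x, q1, q2)),
    threading.Thread(target=stage, args=(lambda x: x + 1, q2, q3)),
]
for t in threads:
    t.start()

for sample in [1, 2, 3]:   # feed the input stream
    q1.put(sample)
q1.put(None)

while (out := q3.get()) is not None:
    print(out)             # prints 3, 5, 7
for t in threads:
    t.join()
```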
APA, Harvard, Vancouver, ISO, and other styles
43

Mackay, Trudy F. C. "Transposable elements and fitness in Drosophila melanogaster." Genome 31, no. 1 (January 1, 1989): 284–95. http://dx.doi.org/10.1139/g89-046.

Full text
Abstract:
Transposable elements constitute a significant fraction of the Drosophila melanogaster genome. The five families of moderately repeated transposable elements identified to date occupy dispersed and variable genomic locations, but have relatively constant copy numbers per individual. What effect do these elements have on the fitness of the individuals harboring them? Experimental evidence relating to this question is reviewed. The relevant data fall into two broad categories. The first involves the determination of the distribution of transposable elements in natural populations, by restriction mapping or in situ hybridization, and the comparison of the observed distribution with different theoretical expectations. The second approach is to study directly the effects of new transposable element-induced mutations on fitness. The P family of transposable elements is a particularly efficient mutagen, and the results of experiments in which initially P-free chromosomes are contaminated with P elements are discussed with regard to P-induced fitness mutations. Key words: transposable elements, Drosophila melanogaster, insertional mutagenesis, fitness, P element mutagenesis, hybrid dysgenesis.
APA, Harvard, Vancouver, ISO, and other styles
44

Pei, Yongjie, and Xiangyang Cui. "A Novel Triangular Prism Element Based on Smoothed Finite Element Method." International Journal of Computational Methods 15, no. 07 (October 12, 2018): 1850058. http://dx.doi.org/10.1142/s0219876218500585.

Full text
Abstract:
In this paper, a novel triangular prism element based on the smoothed finite element method (SFEM) is proposed for three-dimensional static and dynamic mechanics problems. The accuracy of the proposed element is comparable to that of the hexahedral element, while it keeps adaptability as good as that of the tetrahedral element in the surface dimension. In the process of constructing the proposed element, one triangular prism element is further divided into two smoothing cells. Very simple shape functions and a constant smoothing function are used in the construction of the smoothed strains and the smoothed nominal stresses. The divergence theorem is applied to convert the volume integral to integrals over all the surrounding surfaces of a smoothing cell. Thus, no gradient of the shape functions and no mapping or coordinate transformation are involved in the process of creating the discretized system equations. Afterwards, several numerical examples, including elastostatic and free vibration problems, are provided to demonstrate the accuracy and efficiency of the proposed element. Meanwhile, an explicit scheme of the proposed element is given for dynamic large-deformation analysis of elastic-plastic materials, and the numerical results show good agreement with experimental data.
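The central device in any smoothed finite element formulation, including the prism element above, is the divergence-theorem replacement of shape-function gradients by boundary integrals over each smoothing cell, i.e. grad N_I ≈ (1/V_c) Σ_f N_I(x_f) n_f A_f over the cell faces. The numpy sketch below evaluates that formula for a generic polyhedral smoothing cell; the unit-cube cell and its face data are chosen only so the sketch is self-checking (it reproduces the exact trilinear gradients at the cube centre), and the code is not the authors' prism implementation.

```python
import numpy as np

def smoothed_gradient(faces, volume):
    """Smoothed shape-function gradients for one smoothing cell.

    faces  : list of (n_vals, normal, area) tuples, where n_vals holds the
             shape-function values of all element nodes at the face centroid,
             normal is the outward unit normal and area the face area.
    volume : volume of the smoothing cell.
    Returns an (n_nodes, 3) array: grad_N_I = (1/V) * sum_f N_I(x_f) n_f A_f.
    """
    grad = np.zeros((len(faces[0][0]), 3))
    for n_vals, normal, area in faces:
        grad += np.outer(np.asarray(n_vals), np.asarray(normal)) * area
    return grad / volume

# Toy cell: a unit cube treated as one smoothing cell of an 8-node element
# (nodes ordered bottom 1-4, top 5-8); the n_vals are the trilinear
# shape-function values at each face centroid.
faces = [
    ([0.25, 0.25, 0.25, 0.25, 0, 0, 0, 0], [0, 0, -1], 1.0),  # z = 0
    ([0, 0, 0, 0, 0.25, 0.25, 0.25, 0.25], [0, 0, 1], 1.0),   # z = 1
    ([0.25, 0.25, 0, 0, 0.25, 0.25, 0, 0], [0, -1, 0], 1.0),  # y = 0
    ([0, 0, 0.25, 0.25, 0, 0, 0.25, 0.25], [0, 1, 0], 1.0),   # y = 1
    ([0.25, 0, 0, 0.25, 0.25, 0, 0, 0.25], [-1, 0, 0], 1.0),  # x = 0
    ([0, 0.25, 0.25, 0, 0, 0.25, 0.25, 0], [1, 0, 0], 1.0),   # x = 1
]
print(smoothed_gradient(faces, volume=1.0))  # row 0 -> (-0.25, -0.25, -0.25)
```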
APA, Harvard, Vancouver, ISO, and other styles
45

Zhou, Qianwen, Changqing Zhu, and Na Ren. "Robust Watermarking Algorithm for Building Information Modeling Based on Element Perturbation and Invisible Characters." Applied Sciences 13, no. 23 (December 4, 2023): 12957. http://dx.doi.org/10.3390/app132312957.

Full text
Abstract:
With the increasing ease of building information modeling data usage, digital watermarking technology has become increasingly crucial for BIM data copyright protection. In response to the problem that existing robust watermarking methods mainly focus on BIM exchange formats and cannot adapt to BIM data, a novel watermarking algorithm specifically designed for BIM data, which combines element perturbation and invisible character embedding, is proposed. The proposed algorithm first calculates the centroid of the enclosing box to locate the elements, and establishes a synchronous relationship between the element coordinates and the watermarked bits using a mapping mechanism, by which the watermarking robustness is effectively enhanced. Taking into consideration both data availability and the need for watermark invisibility, the algorithm classifies the BIM elements based on their mobility, and perturbs the movable elements while embedding invisible characters within the attributes of the immovable elements. Then, the watermark information after dislocation is embedded into the data. We use building model and structural model BIM data to carry out the experiments, and the results demonstrate that the signal-to-noise ratio and peak signal-to-noise ratio before and after watermark embedding are both greater than 100 dB. In addition, the increased information redundancy accounts for less than 0.15% of the original data, which means watermark embedding has very little impact on the original data. Additionally, the NC coefficient of watermark extraction is higher than 0.85 when facing attacks such as translation, element addition, element deletion, and geometry–property separation. These findings indicate a high level of imperceptibility and robustness offered by the algorithm. In conclusion, the robust watermarking algorithm for BIM data fulfills the practical requirements and provides a feasible solution for protecting the copyright of BIM data.
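The synchronization mechanism described in the abstract ties each element's bounding-box centroid to a position in the watermark bit string. The Python sketch below shows one hypothetical way such a coordinate-to-bit mapping could look, using coarse quantization plus hashing; the quantization step and hash-based index are assumptions for illustration, not the mapping actually used in the paper.

```python
import hashlib

def watermark_bit_index(centroid, n_bits, step=0.01):
    """Map an element's bounding-box centroid to a watermark bit index.

    The centroid is coarsely quantized and hashed; the hash selects which
    watermark bit this element carries.  Purely illustrative - not the
    paper's synchronization mechanism.
    """
    quantized = tuple(round(c / step) for c in centroid)
    digest = hashlib.sha256(repr(quantized).encode()).hexdigest()
    return int(digest, 16) % n_bits

watermark = [1, 0, 1, 1, 0, 0, 1, 0]          # example 8-bit watermark
centroid = (12.3456, 7.8912, 3.1400)           # centroid of an element's box
idx = watermark_bit_index(centroid, len(watermark))
print(f"element carries watermark bit {idx} = {watermark[idx]}")
```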
APA, Harvard, Vancouver, ISO, and other styles
46

Gong, Jian Chun, and Deng Rong Zhou. "Manned Cableway Tower Operating Mode of the Finite Element Analysis." Advanced Materials Research 791-793 (September 2013): 831–34. http://dx.doi.org/10.4028/www.scientific.net/amr.791-793.831.

Full text
Abstract:
A calculation model of the tower is built according to the requirements, and this paper performs finite element analysis on it using i-deas software. Under the various given working conditions of the tower, the displacement field and stress field are analyzed; the analysis adopts mapped meshing with hexahedral and tetrahedral solid elements, and the results are processed and displayed graphically in the post-processing module of the i-deas software.
APA, Harvard, Vancouver, ISO, and other styles
47

O'Brochta, David A., William D. Warren, Kenneth J. Saville, and Peter W. Atkinson. "Hermes, a Functional Non-Drosophilid Insect Gene Vector From Musca domestica." Genetics 142, no. 3 (March 1, 1996): 907–14. http://dx.doi.org/10.1093/genetics/142.3.907.

Full text
Abstract:
Hermes is a short inverted repeat-type transposable element from the house fly, Musca domestica. Using an extra-chromosomal transpositional recombination assay, we show that Hermes elements can accurately transpose in M. domestica embryos. To test the ability of Hermes to function in species distantly related to M. domestica we used a nonautonomous Hermes element containing the Drosophila melanogaster white (w+) gene and created D. melanogaster germline transformants. Transgenic G1 insects were recovered from 34.6% of the fertile G0 adults developing from microinjected w− embryos. This transformation rate is comparable with that observed using P or hobo vectors in D. melanogaster; however, many instances of multiple-element insertions and large clusters were observed. Genetic mapping, Southern blotting, polytene chromosome in situ hybridization and DNA sequence analyses confirmed that Hermes elements were chromosomally integrated in transgenic insects. Our data demonstrate that Hermes elements transpose at high rates in D. melanogaster and may be an effective gene vector and gene-tagging agent in this species and distantly related species of medical and agricultural importance.
APA, Harvard, Vancouver, ISO, and other styles
48

Kumbhar, Pramod Y., A. Francis, N. Swaminathan, R. K. Annabattula, and S. Natarajan. "Development of User Element Routine (UEL) for Cell-Based Smoothed Finite Element Method (CSFEM) in Abaqus." International Journal of Computational Methods 17, no. 02 (October 24, 2019): 1850128. http://dx.doi.org/10.1142/s0219876218501281.

Full text
Abstract:
In this paper, we discuss the implementation of a cell-based smoothed finite element method (CSFEM) within the commercial finite element software Abaqus. The salient feature of the CSFEM is that it does not require an explicit form of the derivative of the shape functions and there is no need for isoparametric mapping. This implementation is accomplished by employing the user element subroutine (UEL) feature in Abaqus. Details of the input data format, together with the proposed user element subroutine, which forms the core of the finite element analysis, are given. A few benchmark problems from linear elastostatics in both two and three dimensions are solved to validate the proposed implementation. The developed UELs and the associated input files can be downloaded from https://github.com/nsundar/SFEM_in_Abaqus.
APA, Harvard, Vancouver, ISO, and other styles
49

KUMAR, V. "SMOOTHED FINITE ELEMENT METHODS FOR THERMO-MECHANICAL IMPACT PROBLEMS." International Journal of Computational Methods 10, no. 01 (February 2013): 1340010. http://dx.doi.org/10.1142/s0219876213400100.

Full text
Abstract:
We present a smoothed finite element method (SFEM) for thermo-mechanical impact problems. The smoothing is applied to the strains, and the standard finite element approach is used for the temperature field. The SFEM allows for highly accurate results and large deformations. No isoparametric mapping is needed; the shape functions are computed in the physical domain. Moreover, no derivatives of the shape functions need to be computed. We implement a visco-plastic constitutive model and validate the method by comparing numerical results to experimental data.
APA, Harvard, Vancouver, ISO, and other styles
50

Hennigan, A. N., and A. Jacobson. "Functional mapping of the translation-dependent instability element of yeast MATalpha1 mRNA." Molecular and Cellular Biology 16, no. 7 (July 1996): 3833–43. http://dx.doi.org/10.1128/mcb.16.7.3833.

Full text
Abstract:
The determinants of mRNA stability include specific cis-acting destabilizing sequences located within mRNA coding and noncoding regions. We have developed an approach for mapping coding-region instability sequences in unstable yeast mRNAs that exploits the link between mRNA translation and turnover and the dependence of nonsense-mediated mRNA decay on the activity of the UPF1 gene product. This approach, which involves the systematic insertion of in-frame translational termination codons into the coding sequence of a gene of interest in a upf1delta strain, differs significantly from conventional methods for mapping cis-acting elements in that it causes minimal perturbations to overall mRNA structure. Using the previously characterized MATalpha1 mRNA as a model, we have accurately localized its 65-nucleotide instability element (IE) within the protein coding region. Termination of translation 5' to this element stabilized the MATalpha1 mRNA two- to threefold relative to wild-type transcripts. Translation through the element was sufficient to restore an unstable decay phenotype, while internal termination resulted in different extents of mRNA stabilization dependent on the precise location of ribosome stalling. Detailed mutagenesis of the element's rare-codon/AU-rich sequence boundary revealed that the destabilizing activity of the MATalpha1 IE is observed when the terminal codon of the element's rare-codon interval is translated. This region of stability transition corresponds precisely to a MATalpha1 IE sequence previously shown to be complementary to 18S rRNA. Deletion of three nucleotides 3' to this sequence shifted the stability boundary one codon 5' to its wild-type location. Conversely, constructs containing an additional three nucleotides at this same location shifted the transition downstream by an equivalent sequence distance. Our results suggest a model in which the triggering of MATalpha1 mRNA destabilization results from establishment of an interaction between translating ribosomes and a downstream sequence element. Furthermore, our data provide direct molecular evidence for a relationship between mRNA turnover and mRNA translation.
APA, Harvard, Vancouver, ISO, and other styles