Journal articles on the topic 'Out-of-sample Embedding'


Consult the top 50 journal articles for your research on the topic 'Out-of-sample Embedding.'


1

Wang, Jianzhong. "Mathematical analysis on out-of-sample extensions." International Journal of Wavelets, Multiresolution and Information Processing 16, no. 05 (September 2018): 1850042. http://dx.doi.org/10.1142/s021969131850042x.

Abstract:
Let [Formula: see text] be a data set in [Formula: see text], where [Formula: see text] is the training set and [Formula: see text] is the test set. Many unsupervised learning algorithms based on kernel methods have been developed to provide a dimensionality reduction (DR) embedding for a given training set [Formula: see text] ([Formula: see text]) that maps the high-dimensional data [Formula: see text] to its low-dimensional feature representation [Formula: see text]. However, these algorithms do not straightforwardly produce a DR of the test set [Formula: see text]. An out-of-sample extension method provides a DR of [Formula: see text] by extending the existing embedding [Formula: see text], instead of re-computing the DR embedding for the whole set [Formula: see text]. Among the various out-of-sample DR extension methods, those based on Nyström approximation are very attractive. Many papers have developed such out-of-sample extension algorithms and shown their validity by numerical experiments. However, the mathematical theory of the DR extension still needs further development. Utilizing reproducing kernel Hilbert space (RKHS) theory, this paper develops a preliminary mathematical analysis of out-of-sample DR extension operators. It treats an out-of-sample DR extension operator as an extension of the identity on the RKHS defined on [Formula: see text]; the Nyström-type DR extension then turns out to be an orthogonal projection. The paper also presents conditions for exact DR extension and gives an estimate of the extension error.
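The Nyström-type extension analyzed in this abstract can be illustrated in a few lines. The sketch below (an illustrative example with invented data and parameters, not the paper's implementation) builds a kernel-PCA-style embedding of a training set and extends it to new points; for a training point the extension reproduces the training embedding exactly, which is the exact-extension case the paper characterizes.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # pairwise Gaussian kernel between the rows of A and the rows of B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))            # training set
K = gaussian_kernel(X, X)
lam, V = np.linalg.eigh(K)
lam, V = lam[::-1], V[:, ::-1]          # sort eigenpairs in descending order
d = 2                                   # target dimension
Y_train = V[:, :d] * np.sqrt(lam[:d])   # kernel-PCA-style embedding of X

def nystrom_extend(x_new):
    # project k(x_new, .) onto the top-d eigenvectors -- an orthogonal
    # projection in the RKHS, which is the structure the paper analyzes
    k = gaussian_kernel(x_new[None, :], X)[0]
    return (k @ V[:, :d]) / np.sqrt(lam[:d])

# exact-extension case: extending a training point reproduces its embedding
print(np.allclose(nystrom_extend(X[0]), Y_train[0]))
```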
2

Strange, Harry, and Reyer Zwiggelaar. "A Generalised Solution to the Out-of-Sample Extension Problem in Manifold Learning." Proceedings of the AAAI Conference on Artificial Intelligence 25, no. 1 (August 4, 2011): 471–76. http://dx.doi.org/10.1609/aaai.v25i1.7908.

Abstract:
Manifold learning is a powerful tool for reducing the dimensionality of a dataset by finding a low-dimensional embedding that retains important geometric and topological features. In many applications it is desirable to add new samples to a previously learnt embedding; this process of adding new samples is known as the out-of-sample extension problem. Since many manifold learning algorithms do not naturally allow new samples to be added, we present an easy-to-implement generalized solution to the problem that can be used with any existing manifold learning algorithm. Our algorithm is based on simple geometric intuition about the local structure of a manifold, and our results show that it can be effectively used to add new samples to a previously learnt embedding. We test our algorithm on both artificial and real-world image data and show that our method significantly outperforms existing out-of-sample extension strategies.
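The local-geometry idea behind such algorithm-agnostic extensions can be sketched very simply: place the new sample according to the embeddings of its nearest high-dimensional neighbours. The toy example below (a minimal interpolation sketch in the same spirit, not the authors' exact method; the data and weighting scheme are invented for illustration) works with any embedding of the training points.

```python
import numpy as np

def extend(x_new, X_train, Y_train, k=5):
    # place a new sample in a learnt embedding by interpolating the
    # embeddings of its k nearest high-dimensional neighbours
    d = np.linalg.norm(X_train - x_new, axis=1)
    nn = np.argsort(d)[:k]
    w = 1.0 / (d[nn] + 1e-12)           # inverse-distance weights
    w /= w.sum()
    return w @ Y_train[nn]

rng = np.random.default_rng(2)
X = rng.uniform(size=(100, 3))
Y = X[:, :2]                            # stand-in for a learnt 2-D embedding
x_new = X[0] + 1e-6                     # a point essentially on a training sample
print(extend(x_new, X, Y))              # lands next to Y[0]
```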
3

Fanuel, Michaël, Antoine Aspeel, Jean-Charles Delvenne, and Johan A. K. Suykens. "Positive Semi-definite Embedding for Dimensionality Reduction and Out-of-Sample Extensions." SIAM Journal on Mathematics of Data Science 4, no. 1 (February 10, 2022): 153–78. http://dx.doi.org/10.1137/20m1370653.

4

Bengio, Yoshua, Olivier Delalleau, Nicolas Le Roux, Jean-François Paiement, Pascal Vincent, and Marie Ouimet. "Learning Eigenfunctions Links Spectral Embedding and Kernel PCA." Neural Computation 16, no. 10 (October 1, 2004): 2197–219. http://dx.doi.org/10.1162/0899766041732396.

Abstract:
In this letter, we show a direct relation between spectral embedding methods and kernel principal components analysis and how both are special cases of a more general learning problem: learning the principal eigenfunctions of an operator defined from a kernel and the unknown data-generating density. Whereas spectral embedding methods provided only coordinates for the training points, the analysis justifies a simple extension to out-of-sample examples (the Nyström formula) for multidimensional scaling (MDS), spectral clustering, Laplacian eigenmaps, locally linear embedding (LLE), and Isomap. The analysis provides, for all such spectral embedding methods, the definition of a loss function, whose empirical average is minimized by the traditional algorithms. The asymptotic expected value of that loss defines a generalization performance and clarifies what these algorithms are trying to learn. Experiments with LLE, Isomap, spectral clustering, and MDS show that this out-of-sample embedding formula generalizes well, with a level of error comparable to the effect of small perturbations of the training set on the embedding.
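The Nyström out-of-sample formula discussed here applies to data-dependent kernels such as the degree-normalized affinity used in spectral clustering. The sketch below (illustrative only; data, bandwidth and the simple renormalization of the new point's affinities are assumptions of the example) extends the embedding coordinates to a new point, and recovers the training coordinates when applied to a training point.

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (20, 2)),    # two well-separated blobs
               rng.normal(3, 0.3, (20, 2))])

def affinity(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

K = affinity(X, X)
D = K.sum(1)
M = K / np.sqrt(np.outer(D, D))     # normalized kernel, as in spectral clustering
lam, V = np.linalg.eigh(M)
lam, V = lam[::-1], V[:, ::-1]
Y_train = V[:, 1:3]                 # embedding coordinates (skip the trivial top vector)

def embed_new(x):
    # Nystrom formula: coordinates of x from its affinities to the training
    # set, with the same degree normalization applied to the new point
    k = affinity(x[None, :], X)[0]
    k = k / np.sqrt(k.sum() * D)
    return (k @ V[:, 1:3]) / lam[1:3]

# applying the formula to a training point recovers its training coordinates
print(np.allclose(embed_new(X[0]), Y_train[0]))
```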
5

Yang, Yi, Feiping Nie, Shiming Xiang, Yueting Zhuang, and Wenhua Wang. "Local and Global Regressive Mapping for Manifold Learning with Out-of-Sample Extrapolation." Proceedings of the AAAI Conference on Artificial Intelligence 24, no. 1 (July 3, 2010): 649–54. http://dx.doi.org/10.1609/aaai.v24i1.7696.

Abstract:
Over the past few years, a large family of manifold learning algorithms has been proposed and applied to various problems. While designing new manifold learning algorithms has attracted much research attention, fewer research efforts have been focused on out-of-sample extrapolation of the learned manifold. In this paper, we propose a novel manifold learning algorithm. The proposed algorithm, namely Local and Global Regressive Mapping (LGRM), employs local regression models to capture the manifold structure. We additionally impose a global regression term as regularization to learn a model for out-of-sample data extrapolation. Based on the algorithm, we propose a new manifold learning framework. Our framework can be applied to any manifold learning algorithm to simultaneously learn the low-dimensional embedding of the training data and a model that provides an explicit mapping of out-of-sample data to the learned manifold. Experiments demonstrate that the proposed framework uncovers the manifold structure precisely and can be freely applied to unseen data.
6

Tong, Ying, Jiachao Zhang, and Rui Chen. "Discriminative Sparsity Graph Embedding for Unconstrained Face Recognition." Electronics 8, no. 5 (May 7, 2019): 503. http://dx.doi.org/10.3390/electronics8050503.

Abstract:
In this paper, we propose a new dimensionality reduction method named Discriminative Sparsity Graph Embedding (DSGE), which considers local structure information and global distribution information simultaneously. First, we adopt an intra-class compactness constraint to automatically construct the intrinsic adjacency graph, which enhances the reconstruction relationship between a given sample and non-neighbor samples of the same class. Meanwhile, an inter-class compactness constraint is exploited to construct the penalty adjacency graph, which reduces the reconstruction influence between a given sample and pseudo-neighbor samples from different classes. Global distribution constraints are then introduced into the projection objective function to seek the optimal subspace, which compacts intra-class samples and separates inter-class samples at the same time. Extensive experiments are carried out on the AR, Extended Yale B, LFW and PubFig databases, four representative face datasets, and the corresponding experimental results illustrate the effectiveness of our proposed method.
7

Liu, Bing, Shixiong Xia, and Yong Zhou. "Multiple Kernel Spectral Regression for Dimensionality Reduction." Journal of Applied Mathematics 2013 (2013): 1–8. http://dx.doi.org/10.1155/2013/427462.

Abstract:
Traditional manifold learning algorithms, such as locally linear embedding, Isomap, and Laplacian eigenmaps, only provide embedding results for the training samples. To solve the out-of-sample extension problem, spectral regression (SR) learns an embedding function within a regression framework, which avoids eigen-decomposition of dense matrices. Motivated by the effectiveness of SR, we incorporate multiple kernel learning (MKL) into SR for dimensionality reduction. The proposed approach (termed MKL-SR) seeks an embedding function in the Reproducing Kernel Hilbert Space (RKHS) induced by the multiple base kernels, and is designed to further improve the performance of kernel-based SR (KSR). Furthermore, the proposed MKL-SR algorithm can be performed in supervised, unsupervised, and semi-supervised settings. Experimental results on supervised and semi-supervised classification demonstrate the effectiveness and efficiency of our algorithm.
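The regression step that gives SR its out-of-sample ability can be sketched in its simplest, linear special case (the paper works in an RKHS over multiple kernels; everything below, including the stand-in embedding, is an invented illustration): ridge-regress the embedding onto the inputs, and the fitted map handles any new sample with a matrix product.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(50, 10))           # training data
Y = X @ rng.normal(size=(10, 2))        # stand-in for a spectral embedding of X

# regression step of SR: ridge-regress the embedding onto the inputs, which
# yields an explicit mapping for out-of-sample points and avoids any
# eigendecomposition at test time
ridge = 1e-3
W = np.linalg.solve(X.T @ X + ridge * np.eye(10), X.T @ Y)

def embed(x_new):
    return x_new @ W                    # works for any new sample

print(np.allclose(embed(X), Y, atol=1e-2))
```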
8

DABBAGH, LANJA A., and Ismael M. Saeed. "Narrative Embedding in The Postmodern American Novel with Reference to Stephen King’s Desperation." Humanities Journal of University of Zakho 1, no. 1 (June 30, 2013): 7–11. http://dx.doi.org/10.26436/hjuoz.2013.1.1.87.

Abstract:
The inclusion of a story within another story is a device which implies that there is an embedding of at least one additional narrative subordinated to the one that seems to frame it. This invariably signifies shifts back and forth between enunciative levels and/or text-world dimensions. Stephen King’s Desperation (1996) is an American postmodern novel which embodies the multiple framing of fictional discourse by including stretches and story-telling accounts based on chronicles, reproduced in chapter three, part III of the novel. The embedded narrative is also given an autonomous subtitle, “The American West: Legendary Shadows”, dealing with how the Chinese mine workers were treated in 1858-1859, while the novel’s discourse narrates events that took place in the second half of the twentieth century. The paper provides a sample analysis of the embeddings to reveal the intricate connection between the text-world polarities, their multiverse interaction, and the narrator’s (or narrators’) presentational modes. Stephen King, as the inventor of the narrative levels, works out a complex set of diegesis vs. mimesis, as well as management of reporting distance in the representation of polyphony, voice, and voice-related issues in the embeddings. Despite the complexity and idiosyncrasies of the structure, Desperation manages to hold the reader’s attention to the end, owing to the symbiotic relations between the frame and the embedding and, moreover, to the interchangeable positions of these two in the course of the events, as the paper shows.
9

Chen, Xu, Xiaoli Qi, Zhenya Wang, Chuangchuang Cui, Baolin Wu, and Yan Yang. "Fault diagnosis of rolling bearing using marine predators algorithm-based support vector machine and topology learning and out-of-sample embedding." Measurement 176 (May 2021): 109116. http://dx.doi.org/10.1016/j.measurement.2021.109116.

10

SOOFI, ABDOL S., and LIANGYUE CAO. "PREDICTION AND VOLATILITY OF BLACK MARKET CURRENCIES: EVIDENCE FROM RENMINBI AND RIAL EXCHANGE RATES." International Journal of Theoretical and Applied Finance 05, no. 06 (September 2002): 659–66. http://dx.doi.org/10.1142/s0219024902001638.

Abstract:
We perform out-of-sample prediction on both the fixed and black-market Chinese renminbi/US dollar exchange rates, and on the black-market rial/US dollar exchange rate, using the time-delay embedding technique and the local linear prediction method. We also predict an artificially generated chaotic time series with and without noise to validate the methods used in this study. In all examples tested, our prediction results significantly outperform those of the benchmark mean-value predictor, based on a statistic defined by Harvey et al. [11]. Another interesting result of this paper is that one may use the embedding dimension as a measure of the volatility of a financial asset.
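The two techniques named in the abstract, time-delay embedding and local prediction, fit in a few lines. The sketch below (an invented, simplified example on a clean deterministic series; the paper's local linear regression is replaced here by a plain nearest-neighbour average) forecasts the next value of a series from the successors of similar delay vectors.

```python
import numpy as np

def delay_embed(x, m, tau=1):
    # map a scalar series into m-dimensional delay vectors
    n = len(x) - (m - 1) * tau
    return np.stack([x[i:i + n] for i in range(0, m * tau, tau)], axis=1)

def predict_next(x, m=3, k=5):
    # local prediction: average the successors of the k delay vectors
    # closest to the most recent one
    E = delay_embed(x, m)
    query, hist = E[-1], E[:-1]
    d = np.linalg.norm(hist - query, axis=1)
    nn = np.argsort(d)[:k]
    return x[nn + m].mean()             # successor of each neighbour vector

t = np.arange(400)
x = np.sin(0.3 * t)                     # clean deterministic test series
print(abs(predict_next(x) - np.sin(0.3 * 400)))   # small one-step error
```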
11

Yilmaz, Cumali, Fuat Güzel, and Gülbahar Akkaya Sayğılı. "Production of A New Magnetic Nanocomposite From Weeds Biochar." Academic Perspective Procedia 2, no. 3 (November 22, 2019): 1229–32. http://dx.doi.org/10.33793/acperpro.02.03.137.

Abstract:
In this study, the synthesis and characterization of a novel composite material were carried out by the co-precipitation method. Weeds biochar was successfully converted into a magnetic material by embedding MnFe2O4 nanoparticles into its structure. The synthesized spinel ferrite composite was characterized with scanning electron microscopy and vibrating-sample magnetometry, and the analysis results for the non-magnetic and magnetic materials were compared.
12

Chicote, Beatriz, Unai Irusta, Elisabete Aramendi, Raúl Alcaraz, José Rieta, Iraia Isasi, Daniel Alonso, María Baqueriza, and Karlos Ibarguren. "Fuzzy and Sample Entropies as Predictors of Patient Survival Using Short Ventricular Fibrillation Recordings during out of Hospital Cardiac Arrest." Entropy 20, no. 8 (August 9, 2018): 591. http://dx.doi.org/10.3390/e20080591.

Abstract:
Optimal defibrillation timing guided by ventricular fibrillation (VF) waveform analysis would contribute to improved survival of out-of-hospital cardiac arrest (OHCA) patients by minimizing myocardial damage caused by futile defibrillation shocks and minimizing interruptions to cardiopulmonary resuscitation. Recently, fuzzy entropy (FuzzyEn) tailored to jointly measure VF amplitude and regularity has been shown to be an efficient defibrillation success predictor. In this study, 734 shocks from 296 OHCA patients (50 survivors) were analyzed, and the embedding dimension (m) and matching tolerance (r) for FuzzyEn and sample entropy (SampEn) were adjusted to predict defibrillation success and patient survival. Entropies were significantly larger in successful shocks and in survivors, and when compared to the available methods, FuzzyEn presented the best prediction results, marginally outperforming SampEn. The sensitivity and specificity of FuzzyEn were 83.3% and 76.7% when predicting defibrillation success, and 83.7% and 73.5% for patient survival. Sensitivities and specificities were two points above those of the best available methods, and the prediction accuracy was kept even for VF intervals as short as 2s. These results suggest that FuzzyEn and SampEn may be promising tools for optimizing the defibrillation time and predicting patient survival in OHCA patients presenting VF.
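Sample entropy, the baseline against which FuzzyEn is compared here, measures signal regularity as the negative log-ratio of template matches at two embedding dimensions. The function below is a compact illustrative variant (its template counts differ slightly from the canonical definition, and it is not the paper's tuned implementation); regular signals score lower than irregular ones.

```python
import numpy as np

def sampen(x, m=2, r=0.2):
    # sample entropy: negative log of the ratio of (m+1)-point to m-point
    # template matches within tolerance r (given as a fraction of the SD)
    x = np.asarray(x, float)
    tol = r * x.std()
    def matches(mm):
        E = np.stack([x[i:len(x) - mm + i + 1] for i in range(mm)], axis=1)
        d = np.abs(E[:, None, :] - E[None, :, :]).max(-1)   # Chebyshev distance
        return (d[np.triu_indices(len(E), 1)] <= tol).sum()
    return -np.log(matches(m + 1) / matches(m))

rng = np.random.default_rng(4)
regular = np.sin(np.linspace(0, 20 * np.pi, 500))
irregular = rng.normal(size=500)
print(sampen(regular), sampen(irregular))   # the regular signal scores lower
```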
13

Bronold, F. X., and H. Fehske. "Invariant embedding approach to secondary electron emission from metals." Journal of Applied Physics 131, no. 11 (March 21, 2022): 113302. http://dx.doi.org/10.1063/5.0082468.

Abstract:
Based on an invariant embedding principle for the backscattering function, we calculate the electron emission yield of metal surfaces at very low electron impact energies. Solving the embedding equation within a quasi-isotropic approximation and using the effective-mass model for the solid, experimental data are fairly well reproduced, provided that (i) incoherent scattering on ion cores is allowed to contribute to the scattering cascades inside the solid and (ii) the transmission through the surface potential takes into account Bragg gaps due to coherent scattering on crystal planes parallel to the surface, as well as randomization of the electron’s lateral momentum due to elastic scattering on surface defects. Our results suggest that in order to get secondary electrons out of metals, the large energy loss due to inelastic electron–electron scattering has to be compensated for by incoherent elastic electron–ion-core scattering, irrespective of the crystallinity of the sample.
14

Teymur, Yekbun Avşar, Fuat Güzel, and Gülbahar Akkaya Sayğılı. "Fabrication And Characterization of Biomagnetic Composite Material." Academic Perspective Procedia 2, no. 3 (November 22, 2019): 1233–37. http://dx.doi.org/10.33793/acperpro.02.03.138.

Abstract:
In this study, the synthesis and characterization of a biomagnetic composite material were achieved by a simple and cost-effective method. Tomato processing waste was successfully converted into a magnetic material by embedding Fe3O4 nanoparticles into its structure. Due to its low cost and ease of application, the co-precipitation method was used for loading the magnetite nanoparticles. Characterization studies were carried out with Fourier-transform infrared spectroscopy, scanning electron microscopy and vibrating-sample magnetometry, and the outcomes for the non-magnetic and magnetic materials were compared.
15

Liao, Jiayu, Xiaolan Liu, and Mengying Xie. "Inductive Latent Space Sparse and Low-rank Subspace Clustering Algorithm." Journal of Physics: Conference Series 2224, no. 1 (April 1, 2022): 012124. http://dx.doi.org/10.1088/1742-6596/2224/1/012124.

Abstract:
Sparse subspace clustering (SSC) and low-rank representation (LRR) are the most popular algorithms for subspace clustering. However, SSC and LRR are transductive methods and cannot deal with new data not included in the training set. When a new sample arrives, SSC and LRR must be recomputed over all the data, which is time-consuming. Moreover, for high-dimensional data, dimensionality reduction is performed before running SSC or LRR, which isolates the dimensionality reduction from the subsequent subspace clustering. To overcome these shortcomings, this paper proposes two sparse and low-rank subspace clustering algorithms that perform dimensionality reduction and subspace clustering simultaneously and can deal with out-of-sample data. The proposed algorithms divide the whole data set into in-sample and out-of-sample data. The in-sample data are used to learn the projection matrix and the sparse or low-rank representation matrix in the low-dimensional space, and the membership of the in-sample data is obtained by spectral clustering. In the low-dimensional embedding space, the membership of the out-of-sample data is obtained by collaborative representation classification (CRC). Experimental results on a variety of data sets verify that our proposed algorithms can handle new data in an efficient way.
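The CRC step that assigns out-of-sample points can be sketched on its own (an illustrative example in the original feature space rather than the learned low-dimensional embedding, with invented data and ridge parameter): code the new sample over all in-sample points with a ridge penalty, then assign the class whose points best reconstruct it.

```python
import numpy as np

def crc_label(x, X_in, y_in, lam=0.1):
    # collaborative representation classification: ridge-code x over all
    # in-sample points, then pick the class with the smallest
    # class-restricted reconstruction residual
    A = X_in.T                           # columns are in-sample points
    c = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ x)
    residuals = {cls: np.linalg.norm(x - A[:, y_in == cls] @ c[y_in == cls])
                 for cls in np.unique(y_in)}
    return min(residuals, key=residuals.get)

rng = np.random.default_rng(7)
X_in = np.vstack([rng.normal(0, 0.3, (20, 4)),   # class 0 around the origin
                  rng.normal(2, 0.3, (20, 4))])  # class 1 around (2, 2, 2, 2)
y_in = np.array([0] * 20 + [1] * 20)

print(crc_label(np.full(4, 2.0), X_in, y_in))    # assigned to class 1
```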
16

Fan, Lim Chu, Sallimah Salleh, and Kumar Laxman. "Embedding video technology in enhancing the understanding of the biology concept of breathing: A Brunei perspective." E-Learning and Digital Media 15, no. 5 (September 2018): 217–34. http://dx.doi.org/10.1177/2042753018797260.

Abstract:
This study was carried out to investigate the impact of embedding video technology into classroom lessons designed using the technological pedagogical content knowledge (TPACK) framework on students’ conceptual understanding, focused on the concept of breathing. The study hypothesized that embedding video technology into classroom teaching would assist students in visualizing dynamic biological processes while improving their conceptual understanding of the biology concept of breathing. It sought to answer two research questions: (1) What are the students’ misconceptions about breathing? (2) Does the integration of technology in lessons improve students’ understanding of the concept? Participants underwent four cycles of interventions, reflecting the four knowledge dimensions of the TPACK framework (declarative, procedural, schematic and strategic). A mixed research method was employed. A drawing–writing technique, pre- and post-tests and student interviews were used to collect data. The quantitative data derived from the students’ pre- and post-test scores were analysed using an SPSS paired-sample t-test, while the qualitative data obtained from the drawing–writing technique and student interviews were thematically analysed based on content. The results indicated a significantly greater improvement in students’ conceptual understanding of the biology concept of breathing after the interventions, demonstrating the positive impact of embedding video technology into classroom lessons planned using the TPACK framework.
17

Raffo, Antonio, Gustavo Avolio, Dominique M. M. P. Schreurs, Sergio Di Falco, Valeria Vadalà, Francesco Scappaviva, Giovanni Crupi, Bart Nauwelaers, and Giorgio Vannini. "On the evaluation of the high-frequency load line in active devices." International Journal of Microwave and Wireless Technologies 3, no. 1 (January 18, 2011): 19–24. http://dx.doi.org/10.1017/s1759078710000838.

Abstract:
In this work a de-embedding technique oriented to the evaluation of the load line at the intrinsic resistive core of microwave FET devices is presented. The approach combines vector high-frequency nonlinear load-pull measurements with an accurate description of the reactive nonlinearities, thus allowing one to determine the actual load line of the drain–source current generator under realistic conditions. Thanks to the proposed approach, the dispersive behavior of the resistive core and the compatibility of the voltage and current waveforms with reliability requirements can be directly monitored. Different experiments carried out on a gallium nitride HEMT sample are reported.
18

Liu, Mingming, Bing Liu, Chen Zhang, and Wei Sun. "Spectral Nonlinearly Embedded Clustering Algorithm." Mathematical Problems in Engineering 2016 (2016): 1–9. http://dx.doi.org/10.1155/2016/9264561.

Abstract:
As is well known, traditional spectral clustering (SC) methods are developed based on the manifold assumption, namely, that two nearby data points in a high-density region of a low-dimensional data manifold have the same cluster label. But for some high-dimensional and sparse data, such an assumption might be invalid, and the clustering performance of SC will be degraded sharply in this case. To solve this problem, in this paper we propose a general spectral embedded framework, which embeds the true cluster assignment matrix for high-dimensional data into a nonlinear space by a predefined embedding function. Based on this framework, several algorithms are presented using different embedding functions, which aim at learning the final cluster assignment matrix and a transformation into a low-dimensional space simultaneously. More importantly, the proposed method can naturally handle the out-of-sample extension problem. The experimental results on benchmark datasets demonstrate that the proposed method significantly outperforms existing clustering methods.
19

Espadoto, Mateus, Nina Sumiko Tomita Hirata, and Alexandru C. Telea. "Deep learning multidimensional projections." Information Visualization 19, no. 3 (May 18, 2020): 247–69. http://dx.doi.org/10.1177/1473871620909485.

Abstract:
Dimensionality reduction methods, also known as projections, are often used to explore multidimensional data in machine learning, data science, and information visualization. However, several such methods, such as the well-known t-distributed stochastic neighbor embedding and its variants, are computationally expensive for large datasets, suffer from stability problems, and cannot directly handle out-of-sample data. We propose a learning approach to construct any such projection. We train a deep neural network on a sample set drawn from a given data universe and its corresponding two-dimensional projections, computed with any user-chosen technique. Next, we use the network to infer projections of any dataset from the same universe. Our approach generates projections with characteristics similar to the learned ones, is computationally two to four orders of magnitude faster than existing projection methods, has no complex-to-set user parameters, handles out-of-sample data in a stable manner, and can be used to learn any projection technique. We demonstrate our proposal on several real-world high-dimensional datasets from machine learning.
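The train-a-network-to-imitate-a-projection idea can be sketched at toy scale (an invented example: a tiny numpy MLP and a PCA "teacher" standing in for any user-chosen projection; the paper uses a much larger network): after training, the network maps out-of-sample points with a single forward pass.

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(200, 10))
Xc = X - X.mean(0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Y = Xc @ Vt[:2].T                        # "teacher" projection (PCA here)

# a tiny MLP trained to imitate the projection
W1 = rng.normal(0, 0.1, (10, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.1, (32, 2)); b2 = np.zeros(2)

def forward(A):
    return np.tanh(A @ W1 + b1) @ W2 + b2

mse0 = ((forward(X) - Y) ** 2).mean()    # imitation error before training
lr = 0.05
for _ in range(500):
    H = np.tanh(X @ W1 + b1)
    G = 2 * (H @ W2 + b2 - Y) / len(X)   # gradient of the mean squared error
    gW2, gb2 = H.T @ G, G.sum(0)
    GH = (G @ W2.T) * (1 - H ** 2)       # backprop through tanh
    W1 -= lr * (X.T @ GH); b1 -= lr * GH.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2

mse = ((forward(X) - Y) ** 2).mean()
print(mse0, mse)                         # training shrinks the imitation error
```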
20

HUANG, HONG, JIANWEI LI, and HAILIANG FENG. "SUBSPACES VERSUS SUBMANIFOLDS: A COMPARATIVE STUDY IN SMALL SAMPLE SIZE PROBLEM." International Journal of Pattern Recognition and Artificial Intelligence 23, no. 03 (May 2009): 463–90. http://dx.doi.org/10.1142/s0218001409007168.

Abstract:
Automatic face recognition is a challenging problem in biometrics, where the dimension of the sample space is typically larger than the number of samples in the training set and consequently the so-called small sample size problem exists. Recently, neuroscientists emphasized the manifold ways of perception and showed that face images may reside on a nonlinear submanifold hidden in the image space. Many manifold learning methods, such as Isometric Feature Mapping, Locally Linear Embedding, and Locally Linear Coordination, have been proposed. These methods recover the submanifold by collectively analyzing overlapping local neighborhoods, and all claim to be superior to subspace methods such as Eigenfaces and Fisherfaces in terms of classification accuracy. However, no systematic comparative study for face recognition has been performed among them. In this paper, we carry out such a comparative study, considering theoretical aspects as well as simulations performed using the CMU PIE and FERET face databases.
21

LEUNG, H. Y., L. M. CHENG, and L. L. CHENG. "A ROBUST WATERMARKING SCHEME USING SELECTIVE CURVELET COEFFICIENTS." International Journal of Wavelets, Multiresolution and Information Processing 07, no. 02 (March 2009): 163–81. http://dx.doi.org/10.1142/s0219691309002830.

Abstract:
In this paper, a selective curvelet-coefficient digital watermarking algorithm is proposed. Traditionally, curvelet watermarks are embedded into all sample frequency bands; however, the behavior of individual bands and the use of a single band for watermarking have not been studied. Band selectivity provides an additional security feature against physical tampering. This paper gives an intensive study of the robustness of watermarking using selective curvelet coefficients from a single band, and finds the best band for embedding the watermark. Wrapping of specially selected Fourier samples is employed to implement the Fast Discrete Curvelet Transform (FDCT), which transforms the digital image to the curvelet domain.
22

Schubert-Bischoff, Peter, and Thomas Krist. "Fast Cross-Sectioning Technique for Thin Films by Ultramicrotomy." Microscopy and Microanalysis 3, S2 (August 1997): 359–60. http://dx.doi.org/10.1017/s1431927600008680.

Abstract:
The usual method of preparing specimens for transmission electron microscopy (TEM) with an ultramicrotome (UM) is to embed a small piece of the sample in a resin, which needs between hours and days to harden. The resin is then trimmed, and finally the thin sections are cut. We developed a method to produce cross sections with a UM without embedding the sample, which leads to first TEM pictures less than 30 minutes after the end of thin-film production. We produce neutron mirrors which consist of metallic monolayers (fig. 1) or multilayers (fig. 2). The materials we use are mostly cobalt–iron alloys and silicon. The quality of the mirrors critically depends on the structural, and hence also the magnetic, properties, which are strongly influenced by the settings of the sputter parameters (plasma current, pressure of the working gas, substrate temperature, sputtering high voltage). To find the best parameters for each purpose, a quick analysing method is needed for fast feedback.
23

Öztürk, Ümit, and Atınç Yılmaz. "An Optimization Technique for Linear Manifold Learning-Based Dimensionality Reduction: Evaluations on Hyperspectral Images." Applied Sciences 11, no. 19 (September 28, 2021): 9063. http://dx.doi.org/10.3390/app11199063.

Abstract:
Manifold learning tries to find low-dimensional manifolds in high-dimensional data and is useful for omitting redundant input features. Linear manifold learning algorithms apply to out-of-sample data and are fast and practical, especially for classification purposes. Locality preserving projection (LPP) and orthogonal locality preserving projection (OLPP) are two well-known linear manifold learning algorithms. In this study, the scatter information of a distance matrix is used to construct a weight matrix with a supervised approach for the LPP and OLPP algorithms to improve classification accuracy. Low-dimensional data are classified with an SVM, and the results of the proposed method are compared with other important existing linear manifold learning methods. Class-based enhancements and the coefficients proposed for the formulation are reported visually. Furthermore, the changes in the weight matrices, band information, and correlation matrices with p-values are extracted and visualized to understand the effect of the proposed method. Experiments are conducted on hyperspectral imaging (HSI) with two different datasets. According to the experimental results, the proposed method applied with the LPP or OLPP algorithms outperformed the traditional LPP, OLPP, neighborhood preserving embedding (NPE) and orthogonal neighborhood preserving embedding (ONPE) algorithms. Furthermore, the analytical findings in the visualizations are consistent with the obtained classification accuracy enhancements.
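The reason linear methods like LPP handle out-of-sample data is visible in a short sketch: the learned object is a projection matrix, so unseen samples are mapped by a matrix product. The example below is a plain unsupervised LPP with heat-kernel weights on invented data (not the paper's supervised, scatter-based weighting scheme).

```python
import numpy as np

rng = np.random.default_rng(6)
X = rng.normal(size=(100, 5))

# heat-kernel weights over a symmetrized k-nearest-neighbour graph
d2 = ((X[:, None] - X[None]) ** 2).sum(-1)
W = np.exp(-d2 / d2.mean())
mask = np.zeros_like(W, dtype=bool)
idx = np.argsort(d2, axis=1)[:, 1:9]          # 8 nearest neighbours of each point
for i, js in enumerate(idx):
    mask[i, js] = mask[js, i] = True
W = W * mask
D = np.diag(W.sum(1))
L = D - W                                     # graph Laplacian

# LPP: smallest generalized eigenvectors of (X^T L X, X^T D X) give a
# linear projection, so out-of-sample points need only a matrix product
A, B = X.T @ L @ X, X.T @ D @ X
vals, vecs = np.linalg.eig(np.linalg.solve(B, A))
order = np.argsort(vals.real)
P = vecs[:, order[:2]].real                   # projection to 2-D

Y_train = X @ P
def project(x_new):
    return x_new @ P                          # same map for unseen samples

print(Y_train.shape, project(X[0]).shape)
```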
24

Thomas, Charles R. "Field Independence and Technology Students' Achievement." Perceptual and Motor Skills 62, no. 3 (June 1986): 859–62. http://dx.doi.org/10.2466/pms.1986.62.3.859.

Abstract:
The Hidden-figures Test was used as a measure of field independence-dependence for a sample of 256 mechanical engineering technology students. Small but significant Pearson product-moment correlations resulted for 12 out of 39 comparisons of field-independence with course achievement, for a comparison of field-independence with grade point average, and for comparisons of field-independence with Scholastic Aptitude Test scores. Significant correlations with field-independence particularly occurred for courses in technical drawing, mechanics courses with a strong emphasis on drawing free-body diagrams, a course with a strong diagramming emphasis, and for several courses with a strong structural emphasis. The small but significant course comparisons clearly indicated an embedding phenomenon which is dependent on underlying structural complexities and which field-dependent individuals find troublesome.
APA, Harvard, Vancouver, ISO, and other styles
25

Choi, Pyuck-Pa, Tala'at Al-Kassab, Young-Soon Kwon, Ji-Soon Kim, and Reiner Kirchheim. "Application of Focused Ion Beam to Atom Probe Tomography Specimen Preparation from Mechanically Alloyed Powders." Microscopy and Microanalysis 13, no. 5 (September 28, 2007): 347–53. http://dx.doi.org/10.1017/s1431927607070717.

Full text
Abstract:
Focused ion-beam milling has been applied to prepare needle-shaped atom probe tomography specimens from mechanically alloyed powders without the use of embedding media. The lift-out technique known from transmission electron microscopy specimen preparation was modified to cut micron-sized square cross-sectional blanks out of single powder particles. A sequence of rectangular cuts and annular milling showed the highest efficiency for sharpening the blanks to tips. First atom probe results on a Fe95Cu5 powder mechanically alloyed in a high-energy planetary ball mill for 20 h have been obtained. Concentration profiles taken from this powder sample showed that the Cu distribution is inhomogeneous on a nanoscale and that the mechanical alloying process has not been completed yet. In addition, small clusters of oxygen, stemming from the ball milling process, have been detected. Annular milling with 30 keV Ga ions and beam currents ≥50 pA was found to cause the formation of an amorphous surface layer, whereas no structural changes could be observed for beam currents ≤10 pA.
APA, Harvard, Vancouver, ISO, and other styles
26

Li, Mingzhen, Jianming Liu, Tao Zhu, Wenjun Zhou, and Chengke Zhou. "A Novel Traveling-Wave-Based Method Improved by Unsupervised Learning for Fault Location of Power Cables via Sheath Current Monitoring." Sensors 19, no. 9 (May 5, 2019): 2083. http://dx.doi.org/10.3390/s19092083.

Full text
Abstract:
In order to improve maintenance practice for power cables, this paper proposes a novel traveling-wave-based fault location method improved by unsupervised learning. The improvement mainly lies in the identification of the arrival time of the traveling wave. The proposed approach consists of four steps: (1) the traveling waves associated with the sheath currents of the cables are grouped in a matrix; (2) dimensionality reduction by t-SNE (t-distributed Stochastic Neighbor Embedding) is used to reconstruct the matrix features in a low dimension; (3) DBSCAN (density-based spatial clustering of applications with noise) clustering is applied to cluster the sample points by the closeness of the sample distribution; (4) the arrival time of the traveling wave is identified by searching for the maximum-slope point of the non-noise cluster with the fewest samples. Simulations and calculations have been carried out for both HV (high voltage) and MV (medium voltage) cables. Results indicate that the arrival time of the traveling wave can be identified for both HV and MV cables with or without noise, and the method remains suitable under small random time errors in the recorded data. A lab-based experiment validated the proposed method and helped prove the effectiveness of the clustering and the fault location.
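Step (4) of the pipeline above can be sketched in isolation: given cluster labels such as DBSCAN would produce on the t-SNE embedding (noise marked -1), pick the smallest non-noise cluster and locate the maximum-slope sample inside it. The function name and toy data are illustrative assumptions:

```python
import numpy as np

def arrival_index(signal, cluster_labels):
    """Among non-noise clusters (label != -1), pick the one with the
    fewest samples, then return the index of the maximum slope of the
    signal within that cluster."""
    labels = np.asarray(cluster_labels)
    non_noise = [c for c in np.unique(labels) if c != -1]
    smallest = min(non_noise, key=lambda c: int((labels == c).sum()))
    idx = np.flatnonzero(labels == smallest)
    slopes = np.gradient(np.asarray(signal, dtype=float))
    return int(idx[np.argmax(slopes[idx])])

# Toy signal: flat, then a rising wavefront, then flat again.
signal = np.concatenate([np.zeros(10), np.linspace(0.0, 1.0, 5), np.ones(10)])
labels = np.array([0] * 10 + [1] * 5 + [0] * 10)  # pretend DBSCAN output
print(arrival_index(signal, labels))  # -> 11, inside the wavefront
```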
APA, Harvard, Vancouver, ISO, and other styles
27

Sherman, Debby. "Dissolving Osmium Tetroxide the Easy Way." Microscopy Today 14, no. 4 (July 2006): 65. http://dx.doi.org/10.1017/s1551929500050355.

Full text
Abstract:
Osmium crystals tend to cling to the glass ampoule walls, making them difficult to dissolve. Smashing the vials results in glass shards that may get into sample vials and can be dangerous, or carry into embedding media and damage diamond knives. A simple trick to eliminate smashing osmium vials is to dip the sealed ampoule into liquid nitrogen. This releases the osmium crystals from the glass walls. Then simply break the vial using an ampoule cracker (available through EM supply houses) and pour the osmium crystals into a bottle containing the water for the OsO4 solution. Wait a few minutes to let the ampoule warm up, and it is easy to see whether all of the crystals have been emptied out of the ampoule. Let the solution sit overnight at room temperature in a hood, and the next day the crystals will be totally dissolved.
APA, Harvard, Vancouver, ISO, and other styles
28

Wang, Haiyan, Renhong Wan, Shuai Chen, Haiyan Qin, Wei Cao, Luqiang Sun, Yunzhou Shi, Qianhua Zheng, and Ying Li. "Comparison of Efficacy of Acupuncture-Related Therapy in the Treatment of Postherpetic Neuralgia: A Network Meta-Analysis of Randomized Controlled Trials." Evidence-Based Complementary and Alternative Medicine 2022 (October 14, 2022): 1–19. http://dx.doi.org/10.1155/2022/3975389.

Full text
Abstract:
Background. Postherpetic neuralgia (PHN) is the most common sequela of herpes zoster, and the efficacy of the treatment regimens recommended in the guidelines is not entirely reliable. Acupuncture and moxibustion are widely used complementary alternative therapies that have a positive effect on the treatment of PHN. However, acupuncture and moxibustion take various forms, and efficacy differs between the forms. Methods. Retrieval of randomised controlled trials (RCTs) of acupuncture for PHN in English databases (PubMed, Cochrane Library, Embase, Web of Science) and Chinese databases (China National Knowledge Infrastructure (CNKI), WeiPu database, WanFang database, and China Biomedical Literature Database) was conducted from the time of database creation to June 2022. Literature screening, data extraction, and evaluation of risk of bias for the included studies were carried out independently by two researchers, and data analysis was performed using Stata 14.2 software. Results. A total of 30 RCTs including 2138 patients with PHN were included. In terms of pain improvement, the acupoint embedding + Western medicine, bloodletting-cupping, and bloodletting-cupping + Western medicine groups ranked highest. In terms of total efficiency, the acupuncture + Western medicine, bloodletting-cupping + Western medicine, and acupoint embedding groups ranked highest. There were no statistically significant differences in the incidence of adverse events between treatment regimens. Conclusions. In a comprehensive comparison of the outcome indicators of 14 different treatment regimens, we considered acupoint injection + Western medicine and bloodletting-cupping + Western medicine to be the best combinations for the treatment of PHN. Due to the limitations of the study, the above conclusions still need to be validated in further multi-centre, large-sample prospective randomised controlled clinical trials.
APA, Harvard, Vancouver, ISO, and other styles
29

Deng, Zhongmin, Xinjie Zhang, and Yanlin Zhao. "Transfer Learning Based Method for Frequency Response Model Updating with Insufficient Data." Sensors 20, no. 19 (October 1, 2020): 5615. http://dx.doi.org/10.3390/s20195615.

Full text
Abstract:
Finite element model updating precision depends heavily on sufficient vibration feature extraction. However, collecting an adequate number of samples is generally time-consuming in frequency response (FR) model updating, so accurate vibration feature extraction with insufficient data has become a significant challenge. To update the finite element model with a small dataset, a novel approach based on transfer learning is first proposed in this paper. A readily available fault diagnosis dataset is selected as ancillary knowledge to train a high-precision mapping from FR data to updating parameters. The proposed transfer learning network is constructed with two branches: a source-domain and a target-domain feature extractor. Considering the cross-domain feature discrepancy, a domain adaptation method is designed by embedding the extracted features into a shared feature space to train a reliable model updating framework. The proposed method is verified on a simulated satellite example. The comparison results show that this method markedly lessens the dependence on sample amount, and with the small dataset the updated model outperforms the one obtained without transfer learning in accuracy. Furthermore, the updated model is validated on dynamic responses outside the training set.
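Domain adaptation of the kind described needs a quantitative measure of the cross-domain feature discrepancy to shrink during training. The abstract does not name the criterion; maximum mean discrepancy (MMD) with an RBF kernel is one common choice, sketched below as an illustrative assumption (a biased estimator, for brevity):

```python
import numpy as np

def rbf_mmd2(X, Y, gamma=1.0):
    """Squared maximum mean discrepancy between two feature samples
    under an RBF kernel; near zero when the distributions match."""
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-gamma * d2)
    return k(X, X).mean() + k(Y, Y).mean() - 2.0 * k(X, Y).mean()

# Toy features: identical domains give ~0; well-separated domains do not.
source = np.zeros((5, 2))
target = np.full((5, 2), 5.0)
print(rbf_mmd2(source, source))  # ~0.0
print(rbf_mmd2(source, target))  # ~2.0, large discrepancy
```

In a shared-feature-space setup this quantity would be added to the task loss, so the two extractor branches are pushed toward matching feature distributions.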
APA, Harvard, Vancouver, ISO, and other styles
30

Kelliher, Felicity, Monica Murphy, and Denis Harrington. "Exploring the role of goal setting and external accountability mechanisms in embedding strategic learning plans in small firms." Journal of Small Business and Enterprise Development 27, no. 3 (May 18, 2020): 449–69. http://dx.doi.org/10.1108/jsbed-07-2019-0253.

Full text
Abstract:
Purpose: This paper explores the role of goal setting and external accountability mechanisms in embedding strategic learning plans in small firms. The research question asks, how are strategic learning plans embedded in small firms? Design/methodology/approach: Insights from in-depth action research carried out with three small firm owner-managers (OMs) inform the study. Findings: Findings present valuable insights into how small firms learn strategically, and the link between OM goal setting and external accountability mechanisms in pursuit of embedded learning. A framework for embedding strategic learning plans in small firms is presented. Research limitations/implications: This study offers a contribution to knowledge in the areas of small firm learning, strategic planning and social learning theory. While the sample size is small, data and case protocols are in place which allow for replication of the study. As the research is embedded in social learning theory, alternative theoretical frameworks may shed a different light on the research question. Practical implications: The study may be of interest to practitioners working in the design, development, delivery and evaluation of learning interventions for small service firms. Given the importance of the small firm sector to the global economy, the research may also be of interest to government agencies, who strive to protect the survival and growth of small firms generally and who set aside resource amounts each year to fund training programmes for small firm OMs. Originality/value: The research contributes to the body of existing knowledge in the small firm setting concerning social learning theory and small firm learning strategies. It has identified a link between OM goal setting and external accountability mechanisms in pursuit of sustainable organisational learning in small firms and offers a framework for embedding strategic learning plans in small firms. The study answers calls for a more robust framework to advance understanding of how OMs learn and whether that learning is consequently embedded in the organisation. The proposed framework can be used as a guideline for support organisations in assisting small firms in reaching their learning potential. It can also be used by small firms in the attainment of strategy learning capability.
APA, Harvard, Vancouver, ISO, and other styles
31

Kelliher, Felicity, Monica Murphy, and Denis Harrington. "Exploring the role of goal setting and external accountability mechanisms in embedding strategic learning plans in small firms." Journal of Small Business and Enterprise Development 27, no. 5 (April 8, 2020): 705–25. http://dx.doi.org/10.1108/jsbed-12-2019-0411.

Full text
Abstract:
Purpose: This paper explores the role of goal setting and external accountability mechanisms in embedding strategic learning plans in small firms. The research question asks, does an external learning intervention influence how strategic learning plans are embedded in small firms? Design/methodology/approach: Insights from in-depth action research carried out with three small firm owner-managers (OMs) inform the study. Findings: Findings present valuable insights into how small firms learn strategically, and the link between OM goal setting and external accountability mechanisms in pursuit of embedded learning. A framework for embedding strategic learning plans in small firms is presented. Research limitations/implications: This study offers a contribution to knowledge in the areas of small firm learning, strategic planning and social learning theory. While the sample size is small, data and case protocols are in place which allow for replication of the study. As the research is embedded in social learning theory, alternative theoretical frameworks may shed a different light on the research question. Practical implications: The study will be of interest to practitioners working in the design, development, delivery and evaluation of learning interventions for small service firms. Given the importance of the small firm sector to the global economy, the research may also be of interest to government agencies, who strive to protect the survival and growth of small firms generally and who set aside resource amounts each year to fund training programmes for small firm OMs. Originality/value: The research contributes to the body of existing knowledge in the small firm setting concerning social learning theory and small firm learning strategies. It has identified a link between OM goal setting and external accountability mechanisms in pursuit of sustainable organisational learning in small firms and offers a framework for embedding strategic learning plans in small firms. The study answers calls for a more robust framework to advance understanding of how OMs learn and whether that learning is consequently embedded in the organisation. The proposed framework can be used as a guideline for support organisations in assisting small firms in reaching their learning potential. It can also be used by small firms in the attainment of strategy learning capability.
APA, Harvard, Vancouver, ISO, and other styles
32

Rivaldo, Rian, Handrizal Handrizal, and Herriyance Herriyance. "Pengamanan Pesan Menggunakan Metode MLSB PRNG dan Kompresi File dengan Algoritma RLE pada File Audio." JURNAL SISTEM INFORMASI BISNIS 11, no. 1 (December 30, 2020): 1–8. http://dx.doi.org/10.21456/vol11iss1pp1-8.

Full text
Abstract:
Advances in communication mean messages can be sent from one place to another regardless of distance and time. However, delivery is hampered by problems of confidentiality and message security, especially when the data contain important and confidential information that not just anyone is allowed to read. To overcome this problem, steganography techniques can be used with the Modified Least Significant Bit algorithm, where the embedding index is determined by random numbers generated by a Pseudo-Random Number Generator using the Multiply with Carry algorithm. In addition to security, data size is also an important factor in data transmission: the larger the size, the more time it takes to transmit the data. Therefore, the Run Length Encoding algorithm is used to compress the data, shortening transmission time. In the message extraction process, a stego key is needed to generate the random numbers. Testing the extraction process with an arbitrary (incorrect) key shows that the extracted message is not the original embedded message. The embedding and extraction results give an average PSNR of 63.61498 dB, which means the quality of the produced stego object is quite good. Measurements of file compression performance give an average Compression Ratio of 1.00113, Space Savings of 0.1133%, and Bitrate of 584025.33 bits/sample. These results indicate that RLE compression is not efficient at reducing the file sizes.
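The near-1.0 compression ratio reported above is characteristic of run-length encoding on data without long runs: each run is stored as a (count, value) pair, so runless data actually grows. A minimal byte-level sketch (not the paper's implementation):

```python
def rle_encode(data: bytes) -> bytes:
    """Run-length encode: each run becomes a (count, value) pair,
    with the count capped at 255 to fit in one byte."""
    out = bytearray()
    i = 0
    while i < len(data):
        run = 1
        while i + run < len(data) and data[i + run] == data[i] and run < 255:
            run += 1
        out += bytes([run, data[i]])
        i += run
    return bytes(out)

def compression_ratio(data: bytes) -> float:
    """Original size divided by encoded size; < 1 means RLE expanded the data."""
    return len(data) / len(rle_encode(data))

print(compression_ratio(b"a" * 100))  # -> 50.0, long runs compress well
print(compression_ratio(b"abcdef"))   # -> 0.5, runless data doubles in size
```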
APA, Harvard, Vancouver, ISO, and other styles
33

Halim, Lionel Reinhart, and Alethea Suryadibrata. "Cyberbullying Sentiment Analysis with Word2Vec and One-Against-All Support Vector Machine." IJNMT (International Journal of New Media Technology) 8, no. 1 (June 27, 2021): 57–64. http://dx.doi.org/10.31937/ijnmt.v8i1.2047.

Full text
Abstract:
Depression and social anxiety are the two main negative impacts of cyberbullying. Unfortunately, a survey conducted by UNICEF on 3rd September 2019 showed that 1 in 3 young people in 30 countries had been victims of cyberbullying. This sentiment analysis study was conducted to detect comments that contain cyberbullying. The cyberbullying dataset, named the Toxic Comment Classification Challenge, was obtained from the Kaggle website. Pre-processing consists of 4 stages, namely comment generalization (converting text to lowercase and removing punctuation), tokenization, stop-word removal, and lemmatization. Word embedding is used for the sentiment analysis by implementing Word2Vec. After that, the One-Against-All (OAA) method with the Support Vector Machine (SVM) model is used to make multi-label predictions. The SVM model goes through a hyperparameter tuning process using Randomized Search CV. Evaluation is then carried out using the Micro Averaged F1 Score to assess prediction accuracy and Hamming Loss to assess the number of sample-label pairs that are incorrectly classified. The implementation of Word2Vec and the OAA SVM model gives the best result for data that underwent pre-processing with comment generalization, tokenization, stop-word removal, and lemmatization, stored in 100 features in the Word2Vec model. The Micro Averaged F1 and Hamming Loss percentages produced by the tuned model are 83.40% and 15.13%, respectively. Index Terms— Sentiment Analysis; Word Embedding; Word2Vec; One-Against-All; Support Vector Machine; Toxic Comment Classification Challenge; Multi Labelling
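One-Against-All multi-labelling trains one binary classifier per label and assigns every label whose classifier fires. As an illustrative sketch only, the snippet below substitutes simple perceptrons for the paper's SVMs and a 2-feature toy for the 100-dimensional Word2Vec vectors; the class name and data are assumptions:

```python
import numpy as np

class OneVsAllPerceptron:
    """One binary perceptron per label; a label is predicted whenever
    its classifier's score is positive, giving multi-label output."""
    def __init__(self, n_labels, n_features, epochs=50, lr=0.1):
        self.W = np.zeros((n_labels, n_features + 1))  # +1 for bias column
        self.epochs, self.lr = epochs, lr

    def fit(self, X, Y):
        Xb = np.hstack([X, np.ones((len(X), 1))])  # append bias input
        for _ in range(self.epochs):
            for x, y in zip(Xb, Y):
                pred = (self.W @ x > 0).astype(float)
                self.W += self.lr * np.outer(y - pred, x)  # per-label update
        return self

    def predict(self, X):
        Xb = np.hstack([X, np.ones((len(X), 1))])
        return (Xb @ self.W.T > 0).astype(int)

# Toy multi-label task: label 0 <-> first feature positive, label 1 <-> second.
X = np.array([[1.0, 1.0], [1.0, -1.0], [-1.0, 1.0], [-1.0, -1.0]])
Y = np.array([[1, 1], [1, 0], [0, 1], [0, 0]])
model = OneVsAllPerceptron(n_labels=2, n_features=2).fit(X, Y)
print(model.predict(X))  # recovers Y on this separable toy
```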
APA, Harvard, Vancouver, ISO, and other styles
34

Calvert, Wesley, and Julia F. Knight. "Classification from a Computable Viewpoint." Bulletin of Symbolic Logic 12, no. 2 (June 2006): 191–218. http://dx.doi.org/10.2178/bsl/1146620059.

Full text
Abstract:
Classification is an important goal in many branches of mathematics. The idea is to describe the members of some class of mathematical objects, up to isomorphism or other important equivalence, in terms of relatively simple invariants. Where this is impossible, it is useful to have concrete results saying so. In model theory and descriptive set theory, there is a large body of work showing that certain classes of mathematical structures admit classification while others do not. In the present paper, we describe some recent work on classification in computable structure theory.Section 1 gives some background from model theory and descriptive set theory. From model theory, we give sample structure and non-structure theorems for classes that include structures of arbitrary cardinality. We also describe the notion of Scott rank, which is useful in the more restricted setting of countable structures. From descriptive set theory, we describe the basic Polish space of structures for a fixed countable language with fixed countable universe. We give sample structure and non-structure theorems based on the complexity of the isomorphism relation, and on Borel embeddings.Section 2 gives some background on computable structures. We describe three approaches to classification for these structures. The approaches are all equivalent. However, one approach, which involves calculating the complexity of the isomorphism relation, has turned out to be more productive than the others. Section 3 describes results on the isomorphism relation for a number of mathematically interesting classes—various kinds of groups and fields. In Section 4, we consider a setting similar to that in descriptive set theory. We describe an effective analogue of Borel embedding which allows us to make distinctions even among classes of finite structures. Section 5 gives results on computable structures of high Scott rank. Some of these results make use of computable embeddings. 
Finally, in Section 6, we mention some open problems and possible directions for future work.
APA, Harvard, Vancouver, ISO, and other styles
35

Kim, Tae-Hoon, Min-Chul Kang, Ga-Bin Jung, Dong Soo Kim, and Cheol-Woong Yang. "Novel Method for Preparing Transmission Electron Microscopy Samples of Micrometer-Sized Powder Particles by Using Focused Ion Beam." Microscopy and Microanalysis 23, no. 5 (September 13, 2017): 1055–60. http://dx.doi.org/10.1017/s1431927617012557.

Full text
Abstract:
AbstractThe preparation of transmission electron microscopy (TEM) samples from powders is quite difficult and challenging. For powders with particles in the 1–5 μm size range, it is especially difficult to select an adequate sample preparation technique. Epoxy is commonly used to bind powder, but drawbacks, such as differential milling originating from unequal milling rates between the epoxy and powder, remain. We propose a new, simple method for preparing TEM samples. This method is especially useful for powders with particles in the 1–5 μm size range that are vulnerable to oxidation. The method uses solder as an embedding agent together with focused ion beam (FIB) milling. The powder was embedded in low-temperature solder using a conventional hot-mounting instrument. Subsequently, FIB was used to fabricate thin TEM samples via the lift-out technique. The solder proved to be more effective than epoxy in producing thin TEM samples with large areas. The problem of differential milling was mitigated, and the solder binder was more stable than epoxy under an electron beam. This methodology can be applied for preparing TEM samples from various powders that are either vulnerable to oxidation or composed of high atomic number elements.
APA, Harvard, Vancouver, ISO, and other styles
36

Zhu, Wenbo, Hao Jin, WeiChang Yeh, Jianwen Chen, Lufeng Luo, Jinhai Wang, and Aiyuan Li. "Leveraging Multimodal Out-of-Domain Information to Improve Low-Resource Speech Translation." Security and Communication Networks 2021 (November 26, 2021): 1–14. http://dx.doi.org/10.1155/2021/9915130.

Full text
Abstract:
Speech translation (ST) is a bimodal conversion task from source speech to the target text. Generally, deep learning-based ST systems require sufficient training data to obtain a competitive result, even with a state-of-the-art model. However, the training data is usually unable to meet the completeness condition due to the small sample problems. Most low-resource ST tasks improve data integrity with a single model, but this optimization has a single dimension and limited effectiveness. In contrast, multimodality is introduced to leverage different dimensions of data features for multiperspective modeling. This approach mutually addresses the gaps in the different modalities to enhance the representation of the data and improve the utilization of the training samples. Therefore, it is a new challenge to leverage the enormous multimodal out-of-domain information to improve the low-resource tasks. This paper describes how to use multimodal out-of-domain information to improve low-resource models. First, we propose a low-resource ST framework to reconstruct large-scale label-free audio by combining self-supervised learning. At the same time, we introduce a machine translation (MT) pretraining model to complement text embedding and fine-tune decoding. In addition, we analyze the similarity at the decoder side. We reduce multimodal invalid pseudolabels by performing random depth pruning in the similarity layer to minimize error propagation and use additional CTC loss in the nonsimilarity layer to optimize the ensemble loss. Finally, we study the weighting ratio of the fusion technique in the multimodal decoder. Our experiment results show that the proposed method is promising for low-resource ST, with improvements of up to +3.6 BLEU points compared to baseline low-resource ST models.
APA, Harvard, Vancouver, ISO, and other styles
37

Datta, Radhika Prosad, and Ranajoy Bhattacharyya. "Predictability of Indian Exchange Rates." Journal of Prediction Markets 12, no. 3 (February 13, 2019): 1–22. http://dx.doi.org/10.5750/jpm.v12i3.1585.

Full text
Abstract:
In this paper we determine the extent of predictability of India's major spot exchange rates by using the Lyapunov exponent. We first determine whether the series is fractal (self-similar) in nature. If it is indeed so, we next determine whether the underlying dynamics of the system is deterministic or stochastic. If the dynamics is found to be deterministic, we calculate the Largest Lyapunov Exponent (LLE) to determine whether the series exhibits deterministic chaos. Finally, we use the inverse of the Lyapunov exponent to estimate the time period over which out-of-sample predictions for the series make sense. We find that India's major spot exchange rates are: a) fractal in nature, b) chaotic with a high embedding dimension, and c) predictable only over the time frame given by the inverse of the LLE. These results are interpreted in two ways. First, exploiting the efficient market interpretation of randomness, we conclude that since available information is fairly rapidly internalized, chaotic behaviour is mainly due to the unforeseen nature of the pool of new information affecting the systems at such short intervals of time. Second, anti-cyclical central bank interventions are conjectured to be the source of determinism in otherwise almost random movements.
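The inverse-LLE prediction horizon can be illustrated on a system whose exponent is known analytically. The sketch below uses the logistic map (LLE = ln 2) in place of an exchange-rate series and takes the reciprocal as a rough predictability window; this is an illustration, not the paper's estimation procedure, which must work from observed data:

```python
import math

def lyapunov_logistic(x0=0.1234, n=20000, r=4.0):
    """Largest Lyapunov exponent of the logistic map x -> r*x*(1-x),
    computed as the orbit average of log|f'(x)| = log|r*(1-2x)|."""
    x, total = x0, 0.0
    for _ in range(n):
        total += math.log(abs(r * (1.0 - 2.0 * x)))
        x = r * x * (1.0 - x)
    return total / n

lle = lyapunov_logistic()       # ~ ln 2 ~ 0.693 for r = 4
horizon = 1.0 / lle             # rough predictability window in time steps
print(lle, horizon)
```

Errors grow roughly like exp(lle * t), so beyond about 1/lle time steps a forecast is dominated by the initial uncertainty, which is the sense in which the paper bounds meaningful out-of-sample prediction.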
APA, Harvard, Vancouver, ISO, and other styles
38

Zhang, Xianku, Baigang Zhao, and Guoqing Zhang. "Improved parameter identification algorithm for ship model based on nonlinear innovation decorated by sigmoid function." Transportation Safety and Environment 3, no. 2 (June 2021): 114–22. http://dx.doi.org/10.1093/tse/tdab006.

Full text
Abstract:
This paper investigates the problem of parameter identification for the ship nonlinear Nomoto model with small test data. A nonlinear-innovation-based identification algorithm is presented by embedding a sigmoid function in the stochastic gradient algorithm. To demonstrate the validity of the algorithm, an identification test is carried out on the ship 'SWAN' with only 26 sets of test data. Furthermore, the identification effects of the least squares algorithm, the original stochastic gradient algorithm, and the improved stochastic gradient algorithm based on nonlinear innovation are compared. Generally, the stochastic gradient algorithm is not suitable for conditions with small test data. The simulation results indicate that the improved stochastic gradient algorithm with the sigmoid function greatly increases the accuracy of parameter identification, a 14.2% improvement over the least squares algorithm. The effectiveness of the algorithm is then verified by another identification test on the ship 'Galaxy'; the accuracy of parameter identification reaches more than 95%, sufficient for use in ship motion simulation and controller design. The proposed algorithm has the advantages of small test data requirements, fast speed, and high identification accuracy, and can be extended to other parameter identification systems with little sample data.
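A minimal sketch of the nonlinear-innovation idea, assuming a scalar gain model y = a*u rather than the full Nomoto model: the innovation is passed through a scaled sigmoid before the gradient step, which bounds large corrections while keeping small ones nearly linear. All names, data, and the specific decoration below are illustrative assumptions:

```python
import math

def nonlinear_sg_identify(us, ys, lr=0.5):
    """Stochastic-gradient identification of the gain a in y = a*u.
    The innovation e is mapped through 2*sigmoid(e) - 1, which lies in
    (-1, 1): roughly e/2 for small e, saturated for large e."""
    a = 0.0
    for u, y in zip(us, ys):
        e = y - a * u                          # innovation
        g = 2.0 / (1.0 + math.exp(-e)) - 1.0   # sigmoid-decorated innovation
        a += lr * g * u                        # bounded gradient step
    return a

# Noise-free toy data with true gain 0.8.
us = [math.sin(0.1 * k) for k in range(300)]
ys = [0.8 * u for u in us]
print(nonlinear_sg_identify(us, ys))  # converges near 0.8
```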
APA, Harvard, Vancouver, ISO, and other styles
39

Baothman, Fatmah Abdulrahman, and Budoor Salem Edhah. "Toward agent-based LSB image steganography system." Journal of Intelligent Systems 30, no. 1 (January 1, 2021): 903–19. http://dx.doi.org/10.1515/jisys-2021-0044.

Full text
Abstract:
In a digital communication environment, information security is mandatory. Three essential parameters in the design of a steganography algorithm are payload, security, and fidelity. Several methods are used in information hiding, such as Least Significant Bit (LSB), Discrete Wavelet Transform, Masking, and Discrete Cosine Transform. This paper investigates novel steganography techniques based on agent technology and proposes an agent-based steganography framework for secret communication using LSB. The most common image steganography databases are explored for training and testing. The methodology is based on the statistical properties of the agent software developed in Matlab. The experiment design uses six statistical feature measures: histogram, mean, standard deviation, entropy, variance, and energy. An ensemble classifier is used to test two scenarios: embedding a single-language message and inserting bilingual messages. ROC curves represent the evaluation metrics. The results show that the designed agent-based system, with a 50% training/testing sample split and a 0.2 payload, can pick out the best cover image for a given hidden message size to avoid visual artifacts.
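The LSB hiding step the framework builds on replaces the lowest bit of each cover pixel with one message bit, so no pixel value changes by more than 1, which is what keeps the stego image visually indistinguishable. A self-contained sketch (a real system would also encode the payload length in the stream):

```python
def lsb_embed(pixels, message: bytes):
    """Write message bits (MSB first) into the least significant bit
    of successive pixel values."""
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("cover too small for payload")
    out = list(pixels)
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | b  # clear LSB, then set it to the message bit
    return out

def lsb_extract(pixels, n_bytes: int) -> bytes:
    """Reassemble n_bytes from the LSBs, MSB first per byte."""
    out = bytearray()
    for i in range(n_bytes):
        byte = 0
        for p in pixels[8 * i: 8 * i + 8]:
            byte = (byte << 1) | (p & 1)
        out.append(byte)
    return bytes(out)

cover = list(range(100))            # toy grayscale "image"
stego = lsb_embed(cover, b"hi")
print(lsb_extract(stego, 2))        # -> b'hi'
```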
APA, Harvard, Vancouver, ISO, and other styles
40

Horgan, Frances, Vanda Cummins, Dawn A. Skelton, Frank Doyle, Maria O’Sullivan, Rose Galvin, Elissa Burton, et al. "Enhancing Existing Formal Home Care to Improve and Maintain Functional Status in Older Adults: Results of a Feasibility Study on the Implementation of Care to Move (CTM) in an Irish Healthcare Setting." International Journal of Environmental Research and Public Health 19, no. 18 (September 6, 2022): 11148. http://dx.doi.org/10.3390/ijerph191811148.

Full text
Abstract:
Background: Care to Move (CTM) provides a series of consistent 'movement prompts' to embed into existing movements of daily living. We explored the feasibility of incorporating CTM approaches in home care settings. Methods: A feasibility study of the CTM approach in older adults receiving home care. Recruitment, retention and attrition (at three time points), adherence, delivery costs, and data loss were analyzed and differentiated pre and post the COVID-19 pandemic. Secondary outcomes included functional status, physical activity, balance confidence, quality of life, and the cost to implement CTM. Results: Fifty-five home care clients (69.6% of the eligible sample) participated. Twenty were unable to start due to COVID-19 disruptions and health issues, leaving 35 clients recruited, mostly women (85.7%), mean age 82.8 years. COVID-19 disruption impacted the study: retention was 60% at the T2 assessments (8 weeks), and 13 of 35 (37.1%) completed the T3 assessments (6 months). There were improvements with small to medium effect sizes in quality of life, physical function, balance confidence and self-efficacy. Managers were supportive of the roll-out of CTM. The implementation cost was estimated at EUR 280 per carer and annual running costs at EUR 75 per carer. Conclusion: Embedding CTM within home support services is acceptable and feasible. The data gathered can power a definitive trial.
APA, Harvard, Vancouver, ISO, and other styles
41

Abdulhussein, Maryam Alaa, and Alaa Liaq Hashem. "An experimental study of the thermal behavior of bricks integrated with PCM-capsules in building walls." Al-Qadisiyah Journal for Engineering Sciences 14, no. 3 (February 11, 2022): 178–83. http://dx.doi.org/10.30772/qjes.v14i3.775.

Full text
Abstract:
Buildings are major energy users and, by 2035, will be the fourth largest source of greenhouse gas emissions. Phase change materials (PCMs) are applied to shift the peak load to off-peak periods, positively affecting the efficiency of the building. In this paper, an experimental embedding of a PCM (paraffin) in the bricks of conventional wall layers is carried out, and its effect on the thermal diffusion at the inner surface of the wall is studied. Capsules were manufactured to fit the size of the holes inside the bricks, filled with paraffin (147 kJ/kg latent heat, solidifying at 38°C and liquefying at 43°C), and closed in a way that prevents leakage. Each brick contained two rows of holes (5 holes per row), and capsules were placed in the holes at rates of 5 and 10 capsules per brick. Three wall samples were experimentally tested: a traditional wall, a wall with 5 capsules per brick, and a wall with 10 capsules per brick. The indoor test fixed the light intensity at 900 W/m2 during a 4-hour heating period, with the remaining period for cooling. Temperatures at the internal and external surfaces and the middle of the wall were measured and recorded using K-type thermocouples and a datalogger. The results indicated that the brick wall with 10 PCM capsules per brick reduced the heat flux by 34.17% compared with the traditional wall sample and stored 50% more energy than the wall with 5 PCM capsules per brick. The internal wall surface of the 10-capsules-per-brick sample recorded the lowest temperature, more than 3°C below that of the reference wall.
APA, Harvard, Vancouver, ISO, and other styles
42

Chang, Lili, Rui Zhang, and Chunsheng Wang. "Evaluation and Prediction of Landslide Susceptibility in Yichang Section of Yangtze River Basin Based on Integrated Deep Learning Algorithm." Remote Sensing 14, no. 11 (June 6, 2022): 2717. http://dx.doi.org/10.3390/rs14112717.

Full text
Abstract:
Landslide susceptibility evaluation (LSE) estimates the probability of landslide occurrence in a region under a specific geological environment and trigger conditions, which is crucial to preventing and controlling landslide risk. The mainstream of the Yangtze River in Yichang City belongs to the largest basin in the Three Gorges Reservoir area and is prone to landslides. Affected by global climate change, seismic activity, and accelerated urbanization, geological disasters such as landslides, collapses, and debris flows in the study area have increased significantly. Therefore, it is urgent to carry out LSE in the Yichang section of the Yangtze River Basin. The main results are as follows: (1) Based on the historical landslide catalog, geological data, geographic data, hydrological data, remote sensing data, and other multi-source spatial-temporal big data, we construct the LSE index system; (2) The unsupervised Deep Embedding Clustering (DEC) algorithm and a deep integrated network (a Capsule Neural Network based on SENet: SE-CapNet) are used for the first time for non-landslide sample selection and LSE in the study area; the accuracy of the algorithm is 96.29%; (3) Based on the constructed susceptibility model and rainfall forecast data, the main driving mechanisms of landslides in the Yangtze River Basin were revealed, and the study area’s mid- to long-term LSE prediction and trend analysis are carried out; (4) The complete results show that the method has good performance and high precision, providing a reference for subsequent LSE, landslide susceptibility prediction (LSP), and change-rule research, and a scientific basis for landslide disaster prevention.
APA, Harvard, Vancouver, ISO, and other styles
43

Xiong, Jinle, Xueyu Liang, Lina Zhao, Benny Lo, Jianqing Li, and Chengyu Liu. "Improving Accuracy of Heart Failure Detection Using Data Refinement." Entropy 22, no. 5 (May 2, 2020): 520. http://dx.doi.org/10.3390/e22050520.

Full text
Abstract:
Due to the wide inter- and intra-individual variability, short-term heart rate variability (HRV) analysis (usually 5 min) might lead to inaccuracy in detecting heart failure. Therefore, RR interval segmentation, which can reflect the individual heart condition, has been a key research challenge for accurate detection of heart failure. Previous studies mainly focused on analyzing the entire 24-h ECG recordings from all individuals in the database, which often led to poor detection rates. In this study, we propose a set of data refinement procedures, which can automatically extract heart failure segments and yield better detection of heart failure. The procedures roughly comprise three steps: (1) select fast heart rate sequences, (2) apply the dynamic time warping (DTW) measure to filter out dissimilar segments, and (3) pick out individuals with large numbers of segments preserved. A physical threshold-based Sample Entropy (SampEn) was applied to distinguish congestive heart failure (CHF) subjects from normal sinus rhythm (NSR) ones, and results using the traditional threshold were also discussed. Experiments on the PhysioNet/MIT RR Interval Databases showed that in SampEn analysis (embedding dimension m = 1, tolerance threshold r = 12 ms and time series length N = 300), the accuracy after data refinement increased from 75.07% to 90.46%. Meanwhile, for the proposed procedures, the area under the receiver operating characteristic curve (AUC) reached 95.73%, outperforming the original method (i.e., without the proposed data refinement procedures), which had an AUC of 76.83%. These results show that our proposed data refinement procedures can significantly improve the accuracy of heart failure detection.
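Sample Entropy with a physical (absolute) tolerance, as used above with m = 1 and r = 12 ms, can be sketched as follows. This is a generic SampEn implementation run on synthetic RR data, not the authors' code:

```python
import numpy as np

def sample_entropy(x, m=1, r=12.0):
    """Sample Entropy with a physical (absolute) tolerance r, in the
    same units as the series (ms for RR intervals). Counts template
    matches of lengths m and m+1 under the Chebyshev distance,
    excluding self-matches."""
    x = np.asarray(x, dtype=float)
    n = len(x)

    def count_matches(length):
        # all overlapping templates of the given length
        templates = np.array([x[i:i + length] for i in range(n - length)])
        total = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            total += int(np.sum(d <= r))
        return total

    b = count_matches(m)       # matches at length m
    a = count_matches(m + 1)   # matches at length m + 1
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

# Synthetic RR series (ms): a regular rhythm yields lower SampEn
# than an erratic one under the same physical threshold.
rng = np.random.default_rng(0)
regular = 800 + 5 * rng.standard_normal(300)
irregular = 800 + 80 * rng.standard_normal(300)
print(sample_entropy(regular), sample_entropy(irregular))
```

A fixed physical threshold (12 ms) rather than the traditional r = 0.2 × SD makes values comparable across subjects with different overall variability, which is the point of the "physical threshold" variant the study applies.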
APA, Harvard, Vancouver, ISO, and other styles
44

Wang, Mengcheng, Shenglin Ma, Yufeng Jin, Wei Wang, Jing Chen, Liulin Hu, and Shuwei He. "A RF Redundant TSV Interconnection for High Resistance Si Interposer." Micromachines 12, no. 2 (February 8, 2021): 169. http://dx.doi.org/10.3390/mi12020169.

Full text
Abstract:
Through-Silicon Via (TSV) technology is capable of meeting effective, compact, high-density, high-integration, and high-performance requirements. In high-frequency applications, with the rapid development of 5G and millimeter-wave radar, the TSV interposer will become a competitive choice for radio frequency system-in-package (RF SIP) substrates. This paper presents a redundant TSV interconnect design on a high-resistivity Si interposer for millimeter-wave applications. To verify its feasibility, a set of test structures capable of working at millimeter waves is designed, composed of three pieces of CPW (coplanar waveguide) line connected by single-TSV, dual-redundant-TSV, and quad-redundant-TSV interconnects. First, HFSS software is used for modeling and simulation; then, a modified equivalent circuit model is established to analyze the effect of the redundant TSVs on the high-frequency transmission performance and to corroborate the HFSS-based simulation. A failure simulation was also carried out, and the results prove that redundant TSVs can still work normally at 44 GHz when a failure occurs. Using the developed TSV process, the sample is then fabricated and tested, and the L-2L de-embedding method is used to extract the S-parameters of the TSV interconnection. The insertion losses of the dual and quad redundant TSVs are 0.19 dB and 0.46 dB at 40 GHz, respectively.
APA, Harvard, Vancouver, ISO, and other styles
45

Jansen, Aren, Gregory Sell, and Vince Lyzinski. "Scalable out-of-sample extension of graph embeddings using deep neural networks." Pattern Recognition Letters 94 (July 2017): 1–6. http://dx.doi.org/10.1016/j.patrec.2017.04.016.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Wang, Zhengge, Sara Larivière, Qiang Xu, Reinder Vos de Wael, Seok-Jun Hong, Zhongyuan Wang, Yun Xu, et al. "Community-informed connectomics of the thalamocortical system in generalized epilepsy." Neurology 93, no. 11 (August 12, 2019): e1112-e1122. http://dx.doi.org/10.1212/wnl.0000000000008096.

Full text
Abstract:
Objective: To study the intrinsic organization of the thalamocortical circuitry in patients with generalized epilepsy with tonic-clonic seizures (GTCS) via resting-state fMRI (rs-fMRI) connectome analysis and to evaluate its relation to drug response. Methods: In a prospectively followed-up sample of 41 patients and 27 healthy controls, we obtained rs-fMRI and structural MRI. After 1 year of follow-up, 27 patients were classified as seizure-free and 14 as drug-resistant. We examined connectivity within and between resting-state communities in cortical and thalamic subregions. In addition to comparing patients to controls, we examined associations with seizure control. We assessed reproducibility in an independent cohort of 21 patients. Results: Compared to controls, patients showed a more constrained network embedding of the thalamus, while frontocentral neocortical regions expressed increased functional diversity. Findings remained significant after regressing out thalamic volume and cortical thickness, suggesting independence from structural alterations. We observed more marked network imbalances in drug-resistant compared to seizure-free patients. Findings were similar in the reproducibility dataset. Conclusions: Our findings suggest a pathoconnectomic mechanism of generalized epilepsy centered on diverging changes in cortical and thalamic connectivity. More restricted thalamic connectivity could reflect the tendency to engage in recursive thalamocortical loops, which may contribute to hyperexcitability. Conversely, increased connectional diversity of frontocentral networks may relay abnormal activity to an extended bilateral territory. Network imbalances were observed shortly after diagnosis and related to future drug response, suggesting clinical utility.
APA, Harvard, Vancouver, ISO, and other styles
47

Mars, Ayoub, and Wael Adi. "New Family of Stream Ciphers as Physically Clone-Resistant VLSI-Structures." Cryptography 3, no. 2 (April 6, 2019): 11. http://dx.doi.org/10.3390/cryptography3020011.

Full text
Abstract:
A concept for creating a large class of lightweight stream ciphers as key stream generators (KSGs) is presented. The resulting class size exceeds 2^323 possible different KSGs. If one unknown cipher from the KSG class is randomly picked and stored irreversibly within a VLSI device, the device becomes physically hard to clone. The selected cipher is only usable by the device itself; therefore, cloning it requires an invasive attack on that particular device. Being an unknown selection out of 2^323 possible KSGs, the resulting cipher is seen as a Secret Unknown Cipher (SUC). The SUC concept was presented a decade ago as a digital alternative to the inconsistent traditional analog Physically Unclonable Functions (PUFs). This work presents one possible practical self-creation technique for such PUFs as hard-to-clone unknown KSGs usable to re-identify VLSI devices. The proposed sample cipher structure is based on non-linear merging of 16 randomly selected Nonlinear Feedback Shift Registers (NLFSRs). The created KSGs exhibit linear complexities exceeding 2^81 and periods exceeding 2^161. The worst-case device cloning time complexity approaches 2^162. A simple lightweight identification protocol for physically identifying such SUC structures in FPGA devices is presented. The self-reconfiguring FPGAs required for embedding such SUCs are not yet available, but they are expected to emerge in the near future. The security analysis and hardware complexities of the resulting clone-resistant structures are evaluated and shown to offer scalable security levels that cope even with post-quantum cryptography.
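The keystream-generator idea can be illustrated with a toy nonlinear feedback shift register. The register size and feedback rule below are arbitrary illustrations, far smaller and simpler than the 16-NLFSR merged construction in the paper:

```python
def nlfsr_keystream(state, nbits):
    """Generate nbits of keystream from a toy 5-bit nonlinear
    feedback shift register. The feedback rule s0 XOR (s1 AND s3)
    XOR s4 is purely illustrative, not the paper's construction."""
    s = list(state)
    out = []
    for _ in range(nbits):
        out.append(s[-1])                 # output the last stage
        fb = s[0] ^ (s[1] & s[3]) ^ s[4]  # nonlinear feedback bit
        s = [fb] + s[:-1]                 # shift the register
    return out

# Stream encryption is a bitwise XOR with the keystream;
# decryption applies the same XOR with the same keystream.
ks = nlfsr_keystream([1, 0, 1, 1, 0], 16)
plaintext = [1, 0, 1, 1, 0, 0, 1, 0]
cipher = [p ^ k for p, k in zip(plaintext, ks)]
assert [c ^ k for c, k in zip(cipher, ks)] == plaintext
```

Because only the device holds the (unknown) feedback logic, an attacker who sees ciphertext cannot reproduce the keystream without physically extracting the register structure, which is the clone-resistance argument of the SUC concept.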
APA, Harvard, Vancouver, ISO, and other styles
48

Hetenyi, Gabor, Attila Dr. Lengyel, and Magdolna Dr. Szilasi. "Quantitative analysis of qualitative data: Using voyant tools to investigate the sales-marketing interface." Journal of Industrial Engineering and Management 12, no. 3 (November 18, 2019): 393. http://dx.doi.org/10.3926/jiem.2929.

Full text
Abstract:
Purpose: The present study aims to give a short introduction to the possibilities offered by Voyant Tools for quantitatively exploring qualitative data on the Sales-Marketing Interface (SMI). Design/methodology/approach: The study is exploratory in nature. The sample consists of sales and marketing employees of six manufacturing companies. Answers to three open-ended questions were analysed quantitatively and visualised in various ways using the online toolset of Voyant Tools. We experimented with four different tools out of the twenty-four offered by Voyant Tools: the Cirrus tool, Correlation tool, Topics tool and Scatter plot tool. All four tools tested on the data have scalable parameters, and various settings were tested to demonstrate how input conditions influence modelling of the textual data. Findings: It was demonstrated that the four selected text analysis tools can yield valuable information depicted in attractive visualisation formats. It is also highlighted how rushed conclusions can be arrived at by falsely interpreting the visualised data, and how setting different input parameters can affect results. Of the four examined tools, the Scatter plot tool, offering an analysis and modelling method based on t-SNE (t-Distributed Stochastic Neighbour Embedding), proved to yield the most complex information about the text. Research limitations/implications: As the study aimed to be exploratory, a convenience sample was used to collect qualitative data. Although quantitative methods can be invaluable tools for preliminary analysis and hypothesis adjustment in the processing of qualitative data, their results should always be checked against traditional content analysis techniques, which are more sensitive to the complex structure of semantic units. These quantitative techniques are meant to help early exploration of textual data. Practical implications: In a fast-changing global business environment, managers and corporate decision makers might find the attractive visualisation outputs of Voyant Tools easy to use when analysing and interpreting various aspects of business. As Voyant Tools is open-source, free online software that does not even require registration, and at the same time offers an impressive array of sophisticated statistical tools, it might be a cost-effective way of analysing qualitative data. Originality/value: As there is virtually no earlier literature on how quantitative data visualisation techniques can be used in marketing research, especially in the analysis of the SMI, the utilisation possibilities of Voyant Tools and other quantitative data analysis and visualisation software for handling qualitative data are definitely a worthwhile area for further research.
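The Scatter plot tool's t-SNE step can be approximated outside Voyant Tools with scikit-learn. The corpus below is hypothetical, standing in for the open-ended survey answers, and the sketch assumes scikit-learn is available:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.manifold import TSNE

# Toy corpus standing in for open-ended answers about the
# sales-marketing interface (hypothetical texts).
docs = [
    "sales team needs better leads from marketing",
    "marketing campaigns ignore feedback from sales",
    "sales and marketing should share customer data",
    "communication between sales and marketing is weak",
    "marketing materials rarely reach the sales force",
    "joint planning would align sales and marketing goals",
]

# Vectorise the texts, then project them to 2-D with t-SNE.
X = TfidfVectorizer(stop_words="english").fit_transform(docs).toarray()

# Perplexity is the key tunable parameter: it must be smaller than
# the number of samples and strongly shapes the resulting 2-D map,
# which mirrors the study's point about input settings.
emb = TSNE(n_components=2, perplexity=2, init="random",
           random_state=0).fit_transform(X)
print(emb.shape)
```

Re-running with a different perplexity or random seed rearranges the map, which is exactly the interpretive caution the study raises: the visual clusters depend on input parameters, not only on the text.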
APA, Harvard, Vancouver, ISO, and other styles
49

Helen Wilford, Sara, and Kutoma Jacqueline Wakunuma. "Perceptions of ethics in IS: how age can affect awareness." Journal of Information, Communication and Ethics in Society 12, no. 4 (November 4, 2014): 270–83. http://dx.doi.org/10.1108/jices-02-2014-0013.

Full text
Abstract:
Purpose – The aim of this paper was to highlight awareness of ethical issues across a group of information systems (IS) professionals from a range of geographical regions. Design/methodology/approach – An initial survey was conducted that informed in-depth interviews with 26 IS professionals from across the globe. The study identified that around 70 per cent of the sample were over 50 years old. This provided an opportunity to consider age-related differences in perception regarding ethical awareness of both current and emerging technologies. Findings – The project revealed that the more mature IS professionals had a significantly higher level of awareness and perceived understanding regarding the importance of ethical issues than the younger IS professionals. Research limitations/implications – The research was limited to IS professionals, so the findings do not generalise further. Future research would be beneficial to find out whether the higher level of ethical awareness is also evident across older people in general or whether it is specific to technology professionals. Practical implications – IS professionals need to be exposed to high standards and expectations of ethical behaviour from senior colleagues, as well as having this embedded within technical education. Social implications – Caution with regard to youth culture and the youthitisation of the workforce needs to be exercised to avoid rash decision-making and short-termism, which could undermine progress and development. A change in the view of employers towards older workers will also require a change in attitudes across Western society, particularly as demographics continue to skew towards an aging population. Originality/value – This paper provides new insight into the ethical awareness of older employees and goes some way to dispel the myths surrounding stereotypes of older workers as being fearful of technology and resistant to change.
APA, Harvard, Vancouver, ISO, and other styles
50

Paltzer, Jason. "Training a Christian Public Health Workforce: A Qualitative Study of Christian Public Health Training Programs." Christian Journal for Global Health 5, no. 3 (November 8, 2018): 12–22. http://dx.doi.org/10.15566/cjgh.v5i3.228.

Full text
Abstract:
Objective: The objective of this qualitative pilot study was to identify opportunities and challenges Christian public health training programs experience when it comes to equipping public health students to work within Christian health mission organizations. Methods: A sample of seven out of seventeen (41 percent response rate) Christian public health institutions from North America, Asia, and Africa completed an online survey. Thematic analysis was conducted to identify major themes in the following areas: values specific to a Christian worldview, competencies focused on integrating a Christian worldview, challenges to integrating a Christian worldview, and training available to students interested in Christian health missions. Results: Values focused on Christ-like humility in serving God and others, discipleship, respecting human dignity in the image of God, and collaborative community partnership. More than half of respondents identified the interrelationship between culture, religion, spirituality, and health as the primary competency integrating a Christian worldview. Global health was identified as a second competency followed by understanding the history and philosophy behind global health and missions. Identified challenges include faith of students and faculty, limited availability of Christian public health textbooks, and secularization of concepts such as poverty and development. Conclusion: The holistic nature of public health is conducive to integrating a Christian worldview into program content. The results show that Christian public health institutions have biblical values and integrate a Christian worldview in understanding the interrelationship between culture, religion, spirituality and health primarily through a global health lens. Programs experience significant challenges to embedding a Christian perspective into other content areas. 
Opportunities for integrating competencies with a Christian worldview include offering a certificate in global health/development ministry, teaching methods for engaging individuals and groups in holistic health discussions, and incorporating spiritual metrics and instruments into program evaluation courses to measure the influence of faith, hope, and discipleship alongside physical and social health metrics.
APA, Harvard, Vancouver, ISO, and other styles