Journal articles on the topic 'Interpolator testing'

Consult the top 50 journal articles for your research on the topic 'Interpolator testing.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles in a wide variety of disciplines and organise your bibliography correctly.

1

Petráček, Petr, Petr Fojtů, Tomáš Kozlok, and Matěj Sulitka. "Effect of CNC Interpolator Parameter Settings on Toolpath Precision and Quality in Corner Neighborhoods." Applied Sciences 12, no. 19 (September 22, 2022): 9496. http://dx.doi.org/10.3390/app12199496.

Abstract:
Surface quality, machining time, and precision of the final workpiece are key criteria of optimization in CNC machining. These criteria are influenced by multiple factors, such as path interpolation, feed drive system settings, machine dynamics, and the manufacturing process. The properties of the output of the interpolator indirectly influence all subsequent phases of the machining process, thus influencing the quality of the end product. This paper focuses on the effects of interpolator settings on toolpath quality and precision in corner neighborhoods for the commercial Heidenhain iTNC interpolator. A novel method of toolpath quality evaluation suitable for interpolator output toolpaths is proposed, and the effect of multiple CNC parameters on toolpath quality and precision in corner neighborhoods is quantified based on results obtained on a testing toolpath and verified on a toolpath composed of linear segments only. Both toolpath quality and precision were found to depend primarily on the parameters of limit frequency, contour tolerance, and corner jerk settings with precision additionally depending on angle size. The results show that both toolpath quality and precision in corner neighborhoods can be successfully controlled by the corner jerk limit parameter settings. The presented methodology provides a practical guide for CNC parameter settings in Heidenhain interpolators aimed at predicting toolpath quality and precision in corner neighborhoods.
2

Dutra e Silva Júnior, Élvio Carlos, Leandro Soares Indrusiak, Weiler Alves Finamore, and Manfred Glesner. "A Programmable Look-Up Table-Based Interpolator with Nonuniform Sampling Scheme." International Journal of Reconfigurable Computing 2012 (2012): 1–14. http://dx.doi.org/10.1155/2012/647805.

Abstract:
Interpolation is a useful technique for storing complex functions in limited memory space: a few sample values are stored in a memory bank, and the function values in between are calculated by interpolation. This paper presents a programmable Look-Up Table-based interpolator, which uses a reconfigurable nonuniform sampling scheme: the sampled points are not uniformly spaced, and their distribution can be reconfigured to minimize the approximation error on specific portions of the interpolated function’s domain. Switching on the fly from one set of precomputed configuration parameters to another, and using different sampling schemes, allows a plethora of functions to be interpolated with minimal memory use and approximation error. As a case study, the proposed interpolator was used as the core of a programmable noise generator—output signals drawn from different probability density functions were produced for testing FPGA implementations of chaotic encryption algorithms. With the proposed method, interpolating a specific transformation function in a Gaussian noise generator reduced memory usage to 2.71% of that required by the traditional uniform sampling scheme, while keeping the approximation error below a threshold of 0.000030518.
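The core idea of a nonuniformly sampled lookup table can be sketched in a few lines of Python (a hedged illustration only: the names `build_lut` and `lut_interp` are ours, and the paper's reconfigurable FPGA datapath is not modeled):

```python
import bisect

def build_lut(f, xs):
    """Sample f at the (possibly nonuniformly spaced) points xs."""
    return list(xs), [f(x) for x in xs]

def lut_interp(lut, x):
    """Piecewise-linear interpolation between the stored samples."""
    xs, ys = lut
    if x <= xs[0]:
        return ys[0]
    if x >= xs[-1]:
        return ys[-1]
    i = bisect.bisect_right(xs, x) - 1
    t = (x - xs[i]) / (xs[i + 1] - xs[i])
    return ys[i] + t * (ys[i + 1] - ys[i])

# Denser sampling near 0, where sqrt curves sharply; sparser where it
# is nearly linear -- the memory-saving idea behind nonuniform LUTs.
xs = [0.0, 0.05, 0.1, 0.2, 0.4, 0.7, 1.0]
lut = build_lut(lambda v: v ** 0.5, xs)
```

Placing more samples where the target function curves sharply is what lets a nonuniform table match a much larger uniform one at equal error.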
3

Liu, Yu, and Jie Liu. "Chord Error Closed-Loop Controlled NURBS Interpolator and Trajectory Planning." Applied Mechanics and Materials 16-19 (October 2009): 925–29. http://dx.doi.org/10.4028/www.scientific.net/amm.16-19.925.

Abstract:
Focusing on the problem of NURBS curve interpolation in high-speed manufacturing, a new trajectory planning algorithm suitable for a chord-error closed-loop controlled interpolator is proposed. By judging the braking distance, the planner decides whether to accelerate, decelerate, or maintain the current velocity in the next interpolation period. Its real-time capability is validated by measuring computation times on different CPU cores. Simulations show that the chord-error closed-loop interpolator can automatically adjust the velocity, based on the computed curvature, to satisfy the precision demand. In addition, it ensures that the maximum velocity and acceleration stay at the reference parameters, so the machine runs fully within the dynamic characteristics set by the operator.
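The chord-error/feedrate relationship underlying such interpolators can be sketched as follows (a minimal illustration; the function name and parameters are ours, and the paper's braking-distance look-ahead is not modeled). For an arc of curvature radius r traversed at feedrate v over one interpolation period T, the chord error is δ = r − sqrt(r² − (vT/2)²), which inverts to a feedrate cap:

```python
import math

def chord_limited_feedrate(radius, tolerance, period, v_cmd):
    """Cap the commanded feedrate so the chord error of one
    interpolation period stays within tolerance on an arc whose
    curvature radius is `radius` (units: mm, mm, s, mm/s)."""
    if tolerance >= radius:
        return v_cmd  # the chord error can never exceed the radius
    v_chord = 2.0 * math.sqrt(2.0 * radius * tolerance - tolerance ** 2) / period
    return min(v_cmd, v_chord)
```

Evaluating this cap from the local curvature each period is what lets a closed-loop interpolator slow down automatically in tight curves.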
4

Rodriguez-Perez, Daniel, and Noela Sanchez-Carnero. "Multigrid/Multiresolution Interpolation: Reducing Oversmoothing and Other Sampling Effects." Geomatics 2, no. 3 (June 22, 2022): 236–53. http://dx.doi.org/10.3390/geomatics2030014.

Abstract:
Traditional interpolation methods, such as IDW, kriging, radial basis functions, and regularized splines, are commonly used to generate digital elevation models (DEMs). All of these methods have strong statistical and analytical foundations (such as the assumption of randomly distributed data points from a Gaussian correlated stochastic surface); however, when data are acquired non-homogeneously (e.g., along transects), all of them show over/under-smoothing of the interpolated surface depending on local point density. As a result, actual information is lost in high point density areas (caused by over-smoothing) or artifacts appear around uneven density areas (“pimple” or “transect” effects). In this paper, we introduce a simple but robust multigrid/multiresolution interpolation (MMI) method which adapts to the spatial resolution available, being an exact interpolator where data exist and a smoothing generalizer where data are missing, but always fulfilling the statistical requirement that the surface height mathematical expectation at the proper working resolution equals the mean height of the data at that same scale. The MMI is efficient enough to use K-fold cross-validation to estimate local errors. We also introduce a fractal extrapolation that simulates the elevation in data-depleted areas (rendering a visually realistic surface and also realistic error estimations). In this work, MMI is applied to reconstruct a real DEM, thus testing its accuracy and local error estimation capabilities under different sampling strategies (random points and transects). It is also applied to compute the bathymetry of the Gulf of San Jorge (Argentina) from multisource data of different origins and sampling qualities. The results show visually realistic surfaces with estimated local validation errors that are within the bounds of direct DEM comparison, in the case of the simulation, and within 10% of the bathymetric surface's typical deviation in the real calculation.
5

Evangelista, Ivan Roy S., Lenmar T. Catajay, Maria Gemel B. Palconit, Mary Grace Ann C. Bautista, Ronnie S. Concepcion II, Edwin Sybingco, Argel A. Bandala, and Elmer P. Dadios. "Detection of Japanese Quails (Coturnix japonica) in Poultry Farms Using YOLOv5 and Detectron2 Faster R-CNN." Journal of Advanced Computational Intelligence and Intelligent Informatics 26, no. 6 (November 20, 2022): 930–36. http://dx.doi.org/10.20965/jaciii.2022.p0930.

Abstract:
Poultry, like quails, is sensitive to stressful environments. Too much stress can adversely affect birds’ health, causing meat quality, egg production, and reproduction to degrade. Posture and behavioral activities can be indicators of poultry wellness and health condition. Animal welfare is one of the aims of precision livestock farming. Computer vision, with its real-time, non-invasive, and accurate monitoring capability, and its ability to obtain a myriad of information, is well suited to livestock monitoring. This paper introduces a quail detection mechanism based on computer vision and deep learning using YOLOv5 and Detectron2 (Faster R-CNN) models. An RGB camera installed 3 ft above the quail cages was used for video recording. Annotation was done in the MATLAB Video Labeler using the temporal interpolator algorithm, and 898 ground-truth images were extracted from the annotated videos. Image augmentation by changing orientation, adding noise, and manipulating hue, saturation, and brightness was performed in Roboflow. Training, validation, and testing of the models were done in Google Colab. YOLOv5 and Detectron2 reached an average precision (AP) of 85.07 and 67.15, respectively. Both models performed satisfactorily in detecting quails in different backgrounds and lighting conditions.
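Temporal annotation interpolation of the kind mentioned here can be sketched as a linear blend between two labeled keyframes (an illustration of the general idea only; MATLAB's actual interpolator is not reproduced, and the (x, y, w, h) box layout is an assumption):

```python
def interpolate_box(frame, f0, box0, f1, box1):
    """Linearly interpolate a bounding box (x, y, w, h) at `frame`,
    given boxes labeled at keyframes f0 and f1."""
    t = (frame - f0) / (f1 - f0)
    return tuple(a + t * (b - a) for a, b in zip(box0, box1))
```

Labeling only keyframes and interpolating the frames in between is what makes annotating hundreds of video frames tractable.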
6

Rodrigues, Daniel, Carl Duchesne, and Julien Lauzon-Gauthier. "Interpolation of Pathway Based Non-Destructive Testing (NDT) Data for Defect Detection and Localization in Pre-Baked Carbon Anodes." Metals 12, no. 9 (August 26, 2022): 1411. http://dx.doi.org/10.3390/met12091411.

Abstract:
Producing consistent quality pre-baked carbon anodes for the Hall–Héroult aluminum reduction process is challenging due to the decreasing quality and increasing variability of anode raw materials. Non-destructive testing techniques (NDT) have been developed and recently implemented in manufacturing plants to establish better suited and more efficient quality control schemes than core sampling and characterization. These technologies collect measurements representing effective properties of the materials located along a pathway between two transducers (emitter and receiver), and not spatially-resolved distribution of properties within the anode volume. A method to interpolate pathway-based measurements and provide spatially-resolved distribution of properties is proposed in this work to help NDT technologies achieve their full potential. The interpolation method is tested by simulating acousto-ultrasonic data collected from a large number of 2D and 3D toy examples representing simplified anode internal structures involving randomly generated defects. Experimental validation was performed by characterizing core samples extracted from a set of industrial anodes and correlating their properties with interpolated speed of sound by the algorithm. The method is shown to be successful in determining the defect positions, and the interpolated results are shown to correlate significantly with mechanical properties.
7

Kim, Seungjun, Junghoon Jin, and Jongsun Kim. "A Cost-Effective and Compact All-Digital Dual-Loop Jitter Attenuator for Built-Off-Test Applications." Electronics 11, no. 21 (November 7, 2022): 3630. http://dx.doi.org/10.3390/electronics11213630.

Abstract:
A compact and low-power all-digital CMOS dual-loop jitter attenuator (DJA) for low-cost built-off-test (BOT) applications such as parallel multi-DUT testing is presented. The proposed DJA adopts a new digital phase interpolator (PI)-based clock recovery (CR) loop with an adaptive decimation filter (ADF) function to remove the jitter and phase noise of the input clock, and generate a phase-aligned clean output clock. In addition, by adopting an all-digital multi-phase multiplying delay-locked loop (MDLL), eight low-jitter evenly spaced reference clocks that are required for the PI are generated. In the proposed DJA, both the MDLL and PI-based CR are first-order systems, and so this DJA has the advantage of high system stability. In addition, the proposed DJA has the benefit of a wide operating frequency range, unlike general PLL-based jitter attenuators that have a narrow frequency range and a jitter peaking problem. Implemented in a 40 nm 0.9 V CMOS process, the proposed DJA generates cleaned programmable output clock frequencies from 2.4 to 4.7 GHz. Furthermore, it achieves a peak-to-peak and RMS jitter attenuation of –25.6 dB and –32.6 dB, respectively, at 2.4 GHz. In addition, it occupies an active area of only 0.0257 mm2 and consumes a power of 7.41 mW at 2.4 GHz.
8

DeGaetano, Arthur T., and Brian N. Belcher. "Spatial Interpolation of Daily Maximum and Minimum Air Temperature Based on Meteorological Model Analyses and Independent Observations." Journal of Applied Meteorology and Climatology 46, no. 11 (November 1, 2007): 1981–92. http://dx.doi.org/10.1175/2007jamc1536.1.

Abstract:
Abstract Hourly meteorological forecast model initializations are used to guide the spatial interpolation of daily cooperative network station data in the northeastern United States. The hourly model data are transformed to daily maximum and minimum temperature values and interpolated to the station points after standardization to station elevation based on the model temperature lapse rate. The resulting bias (interpolation − observation) is computed and then interpolated back to the model grids, allowing daily adjustment of the temperature fields based on independent observations. These adjusted data can then be interpolated to the resolution of interest. For testing, the data are interpolated to stations that were withheld during the construction of the bias field. The use of the model initializations as a basis for interpolation improves upon the conventional interpolation of elevation-adjusted station data alone. When inverse-distance-weighted interpolation is used in conjunction with data from a 40-km-model grid, mean annual absolute errors averaged 5% smaller than those from interpolation of station data alone for maximum and minimum temperature, which is a significant decrease. Using data from a 20-km-model grid reduces mean absolute error during June by 10% for maximum temperature and 16% for minimum temperature. Adjustment for elevation based on the model temperature lapse rate improved the interpolation of maximum temperature, but had little effect on minimum temperature. Winter minimum temperature errors were related to snow depth, a feature that likely contributed to the relatively high autocorrelation exhibited by the daily errors.
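The elevation-standardization step described above can be illustrated as follows (a sketch under assumptions: the study derives the lapse rate from the model fields themselves, whereas here a constant rate is passed in explicitly):

```python
def standardize_to_station(t_model, z_model, z_station, lapse_rate=-0.0065):
    """Shift a model temperature (deg C) from model grid elevation
    z_model to station elevation z_station (metres) using a linear
    lapse rate in deg C per metre (-6.5 C/km is only a common default,
    not the value the study uses)."""
    return t_model + lapse_rate * (z_station - z_model)
```

After this adjustment, the model-minus-observation bias can be interpolated back to the grid without elevation differences dominating the bias field.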
9

Wang, Fen-Jiao, Chang-Lin Mei, Zhi Zhang, and Qiu-Xia Xu. "Testing for Local Spatial Association Based on Geographically Weighted Interpolation of Geostatistical Data with Application to PM2.5 Concentration Analysis." Sustainability 14, no. 21 (November 7, 2022): 14646. http://dx.doi.org/10.3390/su142114646.

Abstract:
Using local spatial statistics to explore local spatial association of geo-referenced data has attracted much attention. As is known, a local statistic is formulated at a particular sampling unit based on a prespecific proximity relationship and the observations in the neighborhood of this sampling unit. However, geostatistical data such as meteorological data and air pollution data are generally collected from meteorological or monitoring stations which are usually sparsely located or highly clustered over space. For such data, a local spatial statistic formulated at an isolate sampling point may be ineffective because of its distant neighbors, or the statistic is undefinable in the sub-regions where no observations are available, which limits the comprehensive exploration of local spatial association over the whole studied region. In order to overcome the predicament, a local-linear geographically weighted interpolation method is proposed in this paper to obtain the predictors of the underlying spatial process on a lattice spatial tessellation, on which a local spatial statistic can be well formulated at each interpolation point. Furthermore, the bootstrap test is suggested to identify the locations where local spatial association is significant using the interpolated-value-based local spatial statistics. Simulation with comparison to some existing interpolation and test methods is conducted to assess the performance of the proposed interpolation and the suggested test methods and a case study based on PM2.5 concentration data in Guangdong province, China, is used to demonstrate their applicability. The results show that the proposed interpolation method performs accurately in retrieving an underlying spatial process and the bootstrap test with the interpolated-value-based local statistics is powerful in identifying local patterns of spatial association.
10

Rimon, Y., E. R. Graber, and A. Furman. "Interpolation of extensive routine water pollution monitoring datasets: methodology and discussion of implications for aquifer management." Hydrology and Earth System Sciences Discussions 10, no. 7 (July 17, 2013): 9363–87. http://dx.doi.org/10.5194/hessd-10-9363-2013.

Abstract:
Abstract. A large fraction of the fresh water available for human use is stored in groundwater aquifers. Since human activities such as mining, agriculture, industry and urbanization often result in the incursion of various pollutants into groundwater, routine monitoring of water quality is an indispensable component of judicious aquifer management. Unfortunately, groundwater pollution monitoring is expensive and usually cannot cover an aquifer with the spatial resolution necessary for making adequate management decisions. Interpolation of monitoring data between points is thus an important tool for supplementing measured data. However, interpolating routine groundwater pollution data poses a special problem due to the nature of the observations. The data from a producing aquifer usually include many zero pollution concentration values from the clean parts of the aquifer but may span a wide range (up to a few orders of magnitude) of values in the polluted areas. This manuscript presents a methodology that can cope with such datasets, using them to produce maps that present the pollution plumes while also delineating the clean areas that are fit for production. A method for assessing the quality of the mapping in a way suitable to the data's dynamic range of values is also presented. A local variant of inverse distance weighting is employed to interpolate the data. Inclusion zones around the interpolation points ensure that only relevant observations contribute to each interpolated concentration. Using inclusion zones improves the accuracy of the mapping but results in interpolation grid points which are not assigned a value. That inherent trade-off between interpolation accuracy and coverage is demonstrated using both circular and elliptical inclusion zones. Leave-one-out cross-testing is used to assess and compare the performance of the interpolations.
The methodology is demonstrated using groundwater pollution monitoring data from the Coastal aquifer along the Israeli shoreline.
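The local IDW-with-inclusion-zone scheme can be sketched as follows (a simplified illustration with a circular zone only; the function and parameter names are ours, not the paper's):

```python
import math

def idw_local(points, x, y, radius, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) using only the
    observations inside a circular inclusion zone of the given radius.
    Returns None where no observation qualifies, reflecting the
    accuracy-versus-coverage trade-off described above."""
    num = den = 0.0
    for px, py, value in points:
        d = math.hypot(px - x, py - y)
        if d > radius:
            continue
        if d < 1e-12:
            return value  # exact at an observation point
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den if den > 0.0 else None
```

Shrinking the radius sharpens plume boundaries but leaves more grid points unassigned; enlarging it does the reverse.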
11

Yin, Carol, Philip J. Kellman, and Thomas F. Shipley. "Surface Completion Complements Boundary Interpolation in the Visual Integration of Partly Occluded Objects." Perception 26, no. 11 (November 1997): 1459–79. http://dx.doi.org/10.1068/p261459.

Abstract:
Previous research on perceptual completion has emphasized how the spatial relationships of edges influence the visual integration of the image fragments that result from partial occlusion. We report studies testing the hypothesis that the similarity of surface features also influences visual integration, complementing edge interpolation processes. Using displays that separated edge interpolation processes from surface-feature interpolation processes, we tested the hypotheses that a surface completion process integrates image fragments with similar surface features, and that surface completion is constrained by amodally interpolated and amodally extended boundaries. Both edge relatability and surface-feature similarity were manipulated in a series of paired-comparison and classification tasks. The results of these studies supported the hypotheses and were extended to surface features of colors, textures, and color gradients. Results also suggest that, under certain conditions, surface completion may interact with and influence edge interpolation.
12

Tian, Yukuo, Guocheng Xu, Lingbo Wei, Guanghao Zhou, Yuting Lin, Qiuyue Fan, and Xiaopeng Gu. "Phased array ultrasonic S-scan testing of near-detection-surface defects based on a background subtraction algorithm." Materials Research Express 9, no. 3 (March 1, 2022): 036507. http://dx.doi.org/10.1088/2053-1591/ac58f2.

Abstract:
Abstract In phased array ultrasonic sector scanning testing, echoes from defects near the detection surface often overlap with interface echoes and cannot be characterized. Based on linear acoustic theory, a mathematical model combining background subtraction and the sum-of-squared-differences algorithm is established to extract the characteristics of near-detection-surface defect echoes. The upper surface of an artificial notch defect is used as a near-detection-surface defect for detection. In particular, a linear interpolation algorithm is used to suppress the effect of residual interface echoes on defect echo feature extraction, and the effect of different interpolation factors is analyzed. The defect location and size were calculated based on the extracted defect characteristics. The simulation and experimental results indicate that the model can effectively extract the near-detection-surface defect echo characteristics. Furthermore, when the sampling rate of the interpolated signal reaches at least 500 MHz, residual interface echoes can be effectively suppressed. More importantly, according to the extracted defect echo characteristics, the localization and quantitative accuracies of near-surface defects can reach 0.2 mm and 0.3 mm, respectively. These findings provide an effective strategy for phased array ultrasonic sector scanning testing of near-detection-surface defects.
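Linear-interpolation upsampling of a sampled signal by an integer factor can be sketched as (an illustration of the operation only; the paper's choice of factor is driven by the 500 MHz effective-rate threshold it reports):

```python
def upsample_linear(samples, factor):
    """Increase the effective sampling rate of a signal by an integer
    factor, inserting linearly interpolated values between samples."""
    out = []
    for a, b in zip(samples, samples[1:]):
        for k in range(factor):
            out.append(a + (b - a) * k / factor)
    out.append(samples[-1])
    return out
```

A signal of n samples becomes (n − 1) · factor + 1 samples, with the original samples preserved exactly.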
13

GUSENITSA, Ya N., O. A. SHIRYAMOV, and D. S. BURYJ. "ALGORITHM FOR MODELING THE SPECTRAL TRANSPARENCY COEFFICIENT OF THE ATMOSPHERE IN THE INFRARED WAVELENGTH RANGE IN AERO OBSERVATION OF GROUND OBJECTS BASED ON INTERPOLATION OF EMPIRICAL DATA." Fundamental and Applied Problems of Engineering and Technology, no. 5 (2021): 141–50. http://dx.doi.org/10.33979/2073-7408-2021-349-5-141-150.

Abstract:
The article presents an algorithm for modeling the spectral transparency of the atmosphere in the infrared wavelength range for aerial observation of ground objects, based on the interpolation of empirical data. The proposed algorithm differs from existing ones in that it combines analytical expressions with empirical data that can be used to interpolate the absorption coefficient of infrared radiation by water vapor and carbon dioxide. The paper presents the theoretical calculations of atmospheric absorption and scattering of infrared radiation on which the proposed algorithm is based. Results of testing the algorithm during experimental flight work are also presented.
14

Lohmar, Johannes, and Markus Bambach. "Influence of Different Interpolation Techniques on the Determination of the Critical Conditions for the Onset of Dynamic Recrystallisation." Materials Science Forum 762 (July 2013): 331–36. http://dx.doi.org/10.4028/www.scientific.net/msf.762.331.

Abstract:
Accurate modeling of dynamic recrystallization (DRX) is highly important for forming processes like hot rolling and forging. To correctly predict the overall level of dynamic recrystallization reached, it is vital to determine and model the critical conditions that mark the start of DRX. For the determination of the critical conditions, a criterion has been proposed by Poliak and Jonas. It states that the onset of DRX can be detected from an inflection point in the work hardening rate as a function of flow stress. The work hardening rate is the derivative of the flow stress with respect to strain. Flow curves are in general measured at a certain sampling rate, yielding tabular stress-strain data, which are per se not continuously differentiable. In addition, inevitable jitter occurs in measured flow curves. Hence, flow curves need to be interpolated and smoothed before the work hardening rate and further derivatives necessary for evaluating the criterion by Poliak and Jonas can be computed. In this paper, the polynomial interpolation originally proposed by Poliak and Jonas is compared to a new approach based on radial basis functions using a thin plate spline kernel, which combines surface interpolation of various flow curves and smoothing in a single step. It is shown for different steel grades that the interpolation method used has a crucial influence on the resulting critical conditions for DRX, and that a simultaneous evaluation by surface interpolation might yield consistent critical conditions over a range of testing temperatures.
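The inflection-point detection at the heart of the Poliak–Jonas criterion can be sketched with finite differences on an already-smoothed θ–σ curve (a toy illustration on a synthetic cubic with a known inflection at σ = 100; real flow-curve data must first be interpolated and smoothed, which is exactly the step whose method choice the paper examines):

```python
def inflection_stress(sigmas, thetas):
    """Return the stress at which the second derivative of the work
    hardening rate theta(sigma) is closest to zero -- the inflection
    marking the onset of DRX under the Poliak-Jonas criterion."""
    d2 = []
    for i in range(1, len(sigmas) - 1):
        left = (thetas[i] - thetas[i - 1]) / (sigmas[i] - sigmas[i - 1])
        right = (thetas[i + 1] - thetas[i]) / (sigmas[i + 1] - sigmas[i])
        d2.append(right - left)  # discrete second derivative
    i_min = min(range(len(d2)), key=lambda k: abs(d2[k]))
    return sigmas[i_min + 1]

# Synthetic theta(sigma) curve with an inflection at sigma = 100.
sigmas = [80.0 + 5.0 * i for i in range(9)]
thetas = [(s - 100.0) ** 3 / 1000.0 - 5.0 * (s - 100.0) + 50.0 for s in sigmas]
```

On noisy measured curves, the finite differences amplify jitter, which is why the interpolation/smoothing step compared in this paper matters so much.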
15

Hutchinson, Michael F., Dan W. McKenney, Kevin Lawrence, John H. Pedlar, Ron F. Hopkinson, Ewa Milewska, and Pia Papadopol. "Development and Testing of Canada-Wide Interpolated Spatial Models of Daily Minimum–Maximum Temperature and Precipitation for 1961–2003." Journal of Applied Meteorology and Climatology 48, no. 4 (April 1, 2009): 725–41. http://dx.doi.org/10.1175/2008jamc1979.1.

Abstract:
Abstract The application of trivariate thin-plate smoothing splines to the interpolation of daily weather data is investigated. The method was used to develop spatial models of daily minimum and maximum temperature and daily precipitation for all of Canada, at a spatial resolution of 300 arc s of latitude and longitude, for the period 1961–2003. Each daily model was optimized automatically by minimizing the generalized cross validation. The fitted trivariate splines incorporated a spatially varying dependence on ground elevation and were able to adapt automatically to the large variation in station density over Canada. Extensive quality control measures were performed on the source data. Error estimates for the fitted surfaces based on withheld data across southern Canada were comparable to, or smaller than, errors obtained by daily interpolation studies elsewhere with denser data networks. Mean absolute errors in daily maximum and minimum temperature averaged over all years were 1.1° and 1.6°C, respectively. Daily temperature extremes were also well matched. Daily precipitation is challenging because of short correlation length scales, the preponderance of zeros, and significant error associated with measurement of snow. A two-stage approach was adopted in which precipitation occurrence was estimated and then used in conjunction with a surface of positive precipitation values. Daily precipitation occurrence was correctly predicted 83% of the time. Withheld errors in daily precipitation were small, with mean absolute errors of 2.9 mm, although these were relatively large in percentage terms. However, mean percent absolute errors in seasonal and annual precipitation totals were 14% and 9%, respectively, and seasonal precipitation upper 95th percentiles were attenuated on average by 8%. Precipitation and daily maximum temperatures were most accurately interpolated in the autumn, consistent with the large well-organized synoptic systems that prevail in this season. 
Daily minimum temperatures were most accurately interpolated in summer. The withheld data tests indicate that the models can be used with confidence across southern Canada in applications that depend on daily temperature and accumulated seasonal and annual precipitation. They should be used with care in applications that depend critically on daily precipitation extremes.
16

Abma, Ray, and Nurul Kabir. "3D interpolation of irregular data with a POCS algorithm." GEOPHYSICS 71, no. 6 (November 2006): E91—E97. http://dx.doi.org/10.1190/1.2356088.

Abstract:
Seismic surveys generally have irregular areas where data cannot be acquired. These data should often be interpolated. A projection onto convex sets (POCS) algorithm using Fourier transforms allows interpolation of irregularly populated grids of seismic data with a simple iterative method that produces high-quality results. The original 2D image restoration method, the Gerchberg-Saxton algorithm, is extended easily to higher dimensions, and the 3D version of the process used here produces much better interpolations than typical 2D methods. The only parameter that makes a substantial difference in the results is the number of iterations used, and this number can be overestimated without degrading the quality of the results. This simplicity is a significant advantage because it relieves the user of extensive parameter testing. Although the cost of the algorithm is several times the cost of typical 2D methods, the method is easily parallelized and still completely practical.
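A one-dimensional toy version of the Fourier-domain POCS iteration can be sketched as follows (an illustration under assumptions: a signal that really is sparse in the Fourier domain, a fixed number of retained coefficients, and none of the paper's 3D machinery):

```python
import numpy as np

def pocs_interpolate(data, known, n_iter=200, keep=4):
    """Alternate two projections: (a) keep only the `keep`
    largest-magnitude Fourier coefficients (sparse-spectrum set) and
    (b) reinsert the known samples (data-consistency set)."""
    x = np.where(known, data, 0.0)
    for _ in range(n_iter):
        spec = np.fft.fft(x)
        small = np.argsort(np.abs(spec))[:-keep]
        spec[small] = 0.0              # projection (a)
        x = np.fft.ifft(spec).real
        x[known] = data[known]         # projection (b)
    return x

# Two sinusoids on 64 samples (4 nonzero Fourier bins), with a
# 4-sample acquisition gap to fill in.
t = np.arange(64)
sig = np.sin(2 * np.pi * 3 * t / 64) + 0.5 * np.cos(2 * np.pi * 5 * t / 64)
known = np.ones(64, dtype=bool)
known[24:28] = False
recon = pocs_interpolate(sig, known)
```

As the abstract notes, the iteration count is essentially the only tuning knob, and overestimating it does not degrade the result.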
17

Christiansen, Peter, and Niels Bay. "Revisiting Veerman’s interpolation method." Proceedings of the Institution of Mechanical Engineers, Part L: Journal of Materials: Design and Applications 233, no. 2 (September 3, 2016): 189–96. http://dx.doi.org/10.1177/1464420716667758.

Abstract:
This article describes an investigation of Veerman’s interpolation method and its applicability for determining sheet metal formability. The theoretical foundation is established and its mathematical assumptions are clarified. An exact Lagrangian interpolation scheme is also established for comparison. Bulge testing and tensile testing of aluminium sheets containing electro-chemically etched circle grids are performed to experimentally determine the forming limit of the sheet material. The forming limit is determined using (a) Veerman’s interpolation method, (b) exact Lagrangian interpolation and (c) FE-simulations. A comparison of the determined forming limits yields insignificant differences in the limit strain obtained with Veerman’s method or exact Lagrangian interpolation for the two sheet metal forming processes investigated. The agreement with the FE-simulations is reasonable.
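The exact Lagrange interpolation used here as a comparison baseline can be sketched directly from its definition (a generic textbook implementation, not the paper's specific strain-analysis scheme):

```python
def lagrange(points, x):
    """Evaluate at x the unique polynomial of degree len(points) - 1
    passing exactly through every (x_i, y_i) pair."""
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        term = yi
        for j, (xj, _) in enumerate(points):
            if j != i:
                term *= (x - xj) / (xi - xj)  # Lagrange basis factor
        total += term
    return total
```

Being exact at every data point, it provides a natural reference against which approximate schemes such as Veerman's can be compared.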
18

ROHIT, NAGORI, and CHAUDHARI K. N. "Development of high spatial resolution weather data using daily meteorological observations over Indian region." MAUSAM 71, no. 4 (August 4, 2021): 605–16. http://dx.doi.org/10.54302/mausam.v71i4.43.

Abstract:
Finer spatial resolution interpolated weather data are essential for utilizing satellite-based images in studies of crop growth dynamics and related applications. Satellite data are available daily at 1 × 1 km or at most within a 5 × 5 km grid. To make the weather data available in a timely manner at the same spatial scale, a procedure has been developed to generate a spatially interpolated weather data product over India. Daily weather data (minimum and maximum temperature and rainfall) available at point scale on the India Meteorological Department website have been used in this study. A semi-automated, user-interactive Graphical User Interface (GUI) has been developed which quality-checks the temperature datasets by filling in missing data and providing a platform to correct erroneous data identified using statistical methods that take both spatial and temporal incompatibility into account. A daily spatially interpolated product is generated in image form using the thin plate spline interpolation technique, which uses the quality-checked weather data as well as elevation information from CARTODEM data in order to account for the effect of elevation on temperature. Validation was performed using the jack-knife testing method for three different seasons, i.e., monsoon, summer, and winter. The mean absolute errors for decadal averaged products were found to vary within 1.2-1.5 °C for maximum temperature, 1.1-1.7 °C for minimum temperature, and 1.0-7.0 mm for rainfall across all seasons, with higher errors observed in the monsoon for maximum temperature and rainfall and in winter for minimum temperature. Errors were close to 1 °C for stations with elevation less than 550 m, whereas in the central portion of India, mean absolute errors were found to be less than 1 °C.
APA, Harvard, Vancouver, ISO, and other styles
19

Arseni, Voiculescu, Georgescu, Iticescu, and Rosu. "Testing Different Interpolation Methods Based on Single Beam Echosounder River Surveying. Case Study: Siret River." ISPRS International Journal of Geo-Information 8, no. 11 (November 10, 2019): 507. http://dx.doi.org/10.3390/ijgi8110507.

Full text
Abstract:
Bathymetric measurements play an important role in assessing the sedimentation rate, deposition of pollutants, erosion rate, or monitoring of morphological changes in a river, lake, or accumulation basin. In order to create a coherent and continuous digital elevation model (DEM) of a river bed, various data interpolation methods are used, especially when single-beam bathymetric measurements do not cover the entire area and some areas remain unmeasured. Interpolation methods are based on numerical models applied to natural landscapes (e.g., a meandering river) by taking into account various morphometries and morphologies across a wide range of scales. Obviously, each interpolation method, used in standard or customised form, yields different results. This study aims at testing four interpolation methods in order to determine the most appropriate one for an accurate description of the riverbed based on single-beam bathymetric measurements. The four interpolation methods selected in the present research are: inverse distance weighting (IDW); radial basis function (RBF) with completely regularized spline (CRS), which uses deterministic interpolation; simple kriging (KRG), which is a geo-statistical method; and Topo to Raster (TopoR), a particular method specifically designed for creating continuous surfaces from various elevation points, contours, or polygon data, suitable for creating surfaces for hydrologic analysis. The digital elevation models (DEMs) were statistically analyzed and their precision and errors were evaluated. The single-beam bathymetric measurements were made on the Siret River, between 0 and 35 km. To check and validate the methods, the experiment was repeated for five randomly selected cross-sections in a 1500 m section of the river. The results were then compared with the data extracted from each elevation model generated with each of the four interpolation methods. Our results show that: (1) TopoR is the most accurate technique, and (2) the two deterministic methods give large errors in bank areas, for the entire river channel and for the particular cross-sections.
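Of the four methods compared, inverse distance weighting is the simplest to state; a minimal sketch (with made-up soundings, not the Siret data) is:

```python
import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0):
    """Inverse distance weighting: weights fall off as 1/d**power."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    d = np.where(d == 0, 1e-12, d)          # avoid division by zero at data points
    w = 1.0 / d**power
    return (w @ z_known) / w.sum(axis=1)

# Hypothetical single-beam soundings: (x, y) positions and depths in metres.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
depth = np.array([2.0, 3.0, 4.0, 5.0])
print(idw(pts, depth, np.array([[0.5, 0.5]])))  # all points equidistant -> [3.5]
```

The characteristic weakness the authors observe at river banks follows from the form of the weights: IDW can never extrapolate beyond the range of the sampled depths, so sparse coverage near the banks pulls the surface toward mid-channel values.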
APA, Harvard, Vancouver, ISO, and other styles
20

Lu, Tianhuan, Jun Zhang, Fuyu Dong, Yingke Li, Dezi Liu, Liping Fu, Guoliang Li, and Zuhui Fan. "Testing PSF Interpolation in Weak Lensing with Real Data." Astronomical Journal 153, no. 4 (April 4, 2017): 197. http://dx.doi.org/10.3847/1538-3881/aa661e.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Mishra, D. K. "ADC testing using interpolated fast Fourier transform (IFFT) technique." International Journal of Electronics 90, no. 7 (July 2003): 459–69. http://dx.doi.org/10.1080/00207210310001621536.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Garrigan, P., and P. J. Kellman. "Three-dimensional contour interpolation: Testing the 90-degree constraint." Journal of Vision 2, no. 7 (March 14, 2010): 356. http://dx.doi.org/10.1167/2.7.356.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Fulvio, J. M., M. Singh, and L. T. Maloney. "Breakdown of contour interpolation: Testing a multiple-contours hypothesis." Journal of Vision 7, no. 9 (March 18, 2010): 111. http://dx.doi.org/10.1167/7.9.111.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Gagliardi, Valerio, Luca Bianchini Ciampoli, Sebastiano Trevisani, Fabrizio D’Amico, Amir M. Alani, Andrea Benedetto, and Fabio Tosti. "Testing Sentinel-1 SAR Interferometry Data for Airport Runway Monitoring: A Geostatistical Analysis." Sensors 21, no. 17 (August 27, 2021): 5769. http://dx.doi.org/10.3390/s21175769.

Full text
Abstract:
Multi-Temporal Interferometric Synthetic Aperture Radar (MT-InSAR) techniques are gaining momentum in the assessment and health monitoring of infrastructure assets. Amongst others, the Persistent Scatterers Interferometry (PSI) technique has proven to be viable for the long-term evaluation of ground scatterers. However, its effectiveness as a routine tool for certain critical application areas, such as the assessment of millimetre-scale differential displacements in airport runways, is still debated. This research aims to demonstrate the viability of using medium-resolution Copernicus ESA Sentinel-1A (C-Band) SAR products and their contribution to improve current maintenance strategies in case of localised foundation settlements in airport runways. To this purpose, “Runway n.3” of the “Leonardo Da Vinci International Airport” in Fiumicino, Rome, Italy was investigated as an explanatory case study, in view of historical geotechnical settlements affecting the runway area. In this context, a geostatistical study is developed for the exploratory spatial data analysis and the interpolation of the Sentinel-1A SAR data. The geostatistical analysis provided ample information on the spatial continuity of the Sentinel 1 data in comparison with the high-resolution COSMO-SkyMed data and the ground-based topographic levelling data. Furthermore, a comparison between the PSI outcomes from the Sentinel-1A SAR data—interpolated through Ordinary Kriging—and the ground-truth topographic levelling data demonstrated the high accuracy of the Sentinel 1 data. This is proven by the high values of the correlation coefficient (r = 0.94), the multiple R-squared coefficient (R2 = 0.88) and the Slope value (0.96). 
The results of this study clearly support the effectiveness of using Sentinel-1A SAR data as a continuous and long-term routine monitoring tool for millimetre-scale displacements in airport runways, paving the way for the development of more efficient and sustainable maintenance strategies for inclusion in next generation Airport Pavement Management Systems (APMSs).
APA, Harvard, Vancouver, ISO, and other styles
25

Tam, Vivian W. Y., and Khoa N. Le. "Optimization of Aggregate Testing Using Polynomial Regression." Journal of Green Building 2, no. 3 (August 1, 2007): 144–52. http://dx.doi.org/10.3992/jgb.2.3.144.

Full text
Abstract:
Recycled aggregate (RA) has only been used in sub-grade applications, roadwork and unbound materials because it is believed that RA has lower strength and poorer quality than normal aggregate. This paper assesses RA quality by proposing a new testing procedure. The new procedure is developed using correlation information among the twenty-three standard tests commonly employed in the construction industry. Samples of RA were obtained from ten demolition sites with service lives ranging from ten to forty years. One additional set of samples was collected from the Tuen Mun Area 38 Recycling Plant, Hong Kong for testing. The characteristics of these eleven sets of samples were then compared with normal aggregate samples. A Vandermonde matrix for interpolation polynomial coefficient estimation is employed to yield detailed mathematical relationships between the results of two particular tests, from which redundant tests can be identified. Different orders of interpolation polynomials are used for comparison, hence the best-fit equations with the lowest fitting errors from different orders of polynomials can be found. This paper shows that only four out of the twenty-three tests are required in the new testing procedure.
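The Vandermonde-matrix fitting step can be sketched as follows; the sample values are invented, and the polynomial order is varied by changing `N`:

```python
import numpy as np

# Hypothetical paired results of two aggregate tests on 11 samples.
x = np.linspace(0.0, 1.0, 11)
y = 2.0 + 3.0 * x - 1.5 * x**2          # assumed underlying relationship

# Vandermonde matrix for a 2nd-order interpolation polynomial.
V = np.vander(x, N=3, increasing=True)   # columns: 1, x, x^2
coef, *_ = np.linalg.lstsq(V, y, rcond=None)
print(coef)                              # -> approximately [2.0, 3.0, -1.5]
```

A strong fit between two tests (low residual) is what flags one of them as redundant; comparing residuals across polynomial orders selects the best-fit equation.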
APA, Harvard, Vancouver, ISO, and other styles
26

Bai, Ying, Nailong Guo, and Gerald Agbegha. "Fuzzy Interpolation and Other Interpolation Methods Used in Robot Calibrations." Journal of Robotics 2012 (2012): 1–9. http://dx.doi.org/10.1155/2012/376293.

Full text
Abstract:
A novel interpolation algorithm, fuzzy interpolation, is presented and compared with other popular interpolation methods widely implemented in industrial robot calibrations and manufacturing applications. Different interpolation algorithms have been developed, reported, and implemented in many industrial robot calibrations and manufacturing processes in recent years. Most of them look for optimal interpolation trajectories based on known values at given points around a workspace. However, it is rare to build optimal interpolation results from randomly distributed noise, even though this is one of the most common problems in industrial testing and measurement applications. The fuzzy interpolation algorithm (FIA) reported in this paper provides a convenient and simple way to solve this problem and offers more accurate interpolation results based on given position or orientation errors that are randomly distributed in real time. This method can be implemented in many industrial applications, such as manipulator measurement and calibration, industrial automation, and semiconductor manufacturing processes.
APA, Harvard, Vancouver, ISO, and other styles
27

Hermawan, Muhammad, Herry Sujaini, and Novi Safriadi. "Influence Analysis of Smoothing Algorithms in Language Modelling for Indonesian Statistical Machine Translation." International Journal of Engineering and Applied Science Research 1, no. 1 (July 30, 2020): 11. http://dx.doi.org/10.26418/ijeasr.v1i1.41088.

Full text
Abstract:
The diversity of languages creates a need for translation so that communication between individuals of different languages can be properly established. A statistical machine translation (SMT) engine is a translation engine based on a statistical approach to parallel corpus analysis. One crucial part of SMT is language modeling (LM), the calculation of word probabilities from a corpus based on n-grams. Smoothing algorithms are used in LM to assign non-zero probability to words whose observed counts are zero. This study compares the best smoothing algorithm from each of the three LM toolkits supported by standard Moses, namely KenLM, SRILM, and IRSTLM: for SRILM, interpolation with Witten-Bell and interpolation with Ristad's natural discounting; for KenLM, interpolation with the modified Kneser-Ney smoothing algorithm; and for IRSTLM, the modified Kneser-Ney and Witten-Bell algorithms, as referenced in previous research. This study uses a corpus of 10,000 sentences. Tests were carried out with BLEU and by Melayu Sambas linguists. Based on the results of the BLEU testing and the linguist testing, the best smoothing algorithm was modified Kneser-Ney in the KenLM LM, where the average results of automated testing for Indonesian-Melayu Sambas and vice versa were 41.6925% and 46.66%, and for the linguist testing, the accuracy for Indonesian-Melayu Sambas and vice versa was 77.3165% and 77.9095%.
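The Witten-Bell flavour of interpolation smoothing mentioned above can be sketched in a few lines; this is a toy bigram model for illustration, not the SRILM or KenLM implementation:

```python
from collections import Counter

def witten_bell_bigram(tokens):
    """Witten-Bell interpolated bigram probabilities (minimal sketch)."""
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    total = len(tokens)
    followers = {}  # distinct words observed after each history word
    for (h, w) in bigrams:
        followers.setdefault(h, set()).add(w)

    def prob(w, h):
        c_h = unigrams[h]
        t_h = len(followers.get(h, ()))
        # Witten-Bell weight: more distinct continuations -> trust the
        # bigram estimate less, back off more to the unigram.
        lam = c_h / (c_h + t_h) if c_h + t_h else 0.0
        p_ml = bigrams[(h, w)] / c_h if c_h else 0.0
        p_uni = unigrams[w] / total
        return lam * p_ml + (1 - lam) * p_uni

    return prob

p = witten_bell_bigram("a b a b a c".split())
```

Unseen bigrams such as ("a", "a") now receive a non-zero probability via the unigram term, which is exactly what smoothing is for.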
APA, Harvard, Vancouver, ISO, and other styles
28

Pansky, Ainat, and Einat Tenenboim. "Inoculating against eyewitness suggestibility via interpolated verbatim vs. gist testing." Memory & Cognition 39, no. 1 (November 6, 2010): 155–70. http://dx.doi.org/10.3758/s13421-010-0005-8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Tam, Vivian W. Y., and Khoa N. Le. "Aggregate testing using 2nd-, 7th- and 10th-order interpolation polynomials." Resources, Conservation and Recycling 52, no. 1 (November 2007): 39–57. http://dx.doi.org/10.1016/j.resconrec.2007.02.001.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Martins, Carlos Gil, and Jaimie Cross. "Technical note: TEOS-10 Excel – implementation of the Thermodynamic Equation Of Seawater – 2010 in Excel." Ocean Science 18, no. 3 (May 6, 2022): 627–38. http://dx.doi.org/10.5194/os-18-627-2022.

Full text
Abstract:
Abstract. This paper and associated software implement the Thermodynamic Equation Of Seawater – 2010 (TEOS-10) in Excel for an efficient estimation of Absolute Salinity (SA), Conservative Temperature (Θ), and derived thermodynamic properties of seawater – potential density (σΘ), in situ density (ρSA,Θ,p), and sound speed (c). Vertical profile template plots for these parameters are included as is an SA–Θ diagram template, which includes plotting of the density field (computation of user-selected σΘ lines is included). Absolute Salinity can be directly measured with the aid of a densimeter (IOC, SCOR and IAPSO, 2010: p. 82), but in TEOS-10 its estimation relies on the interpolation of data from casts of seawater from the world ocean (IOC, SCOR and IAPSO, 2010), and the Excel workbook introduced here (TEOS-10 Excel, available at https://doi.org/10.5281/zenodo.4748829) includes a subset of the TEOS-10 look-up tables necessary for this estimation, namely the Absolute Salinity Anomaly [deltaSA_ref] and the Absolute Salinity Anomaly Ratio [SAAR_ref] look-up tables. As the user simply needs to paste new data into the spreadsheet to automatically compute the oceanographic parameters referred above, this tool may prove to be useful for all who are not comfortable using the full-featured TEOS-10 programming language environments (e.g. MATLAB, FORTRAN, C) but rather need a simpler way of computing fundamental properties of seawater (e.g. density, sound speed) while adhering to current standards. Returned values are the same (up to 15 decimal places; i.e. difference = 0.000000000000000) as the ones obtained with the MATLAB version of the GSW (Gibbs Sea Water) toolbox (McDougall and Barker, 2011) available at the TEOS-10 website (https://www.teos-10.org, last access: 13 April 2022). This paper describes the Excel workbook, its use, and the included VBA (Visual Basic for Applications) functions. 
Quality control against the GSW toolbox is also addressed, namely issues detected with the interpolated values returned by the toolbox when there are missing values in the reference look-up table. In these situations, the GSW toolbox replaces missing values with a level pressure horizontal interpolation of neighbour points, while it is clear from the testing results that vertical interpolation, which was then implemented in TEOS-10 Excel, returns a more robust solution.
APA, Harvard, Vancouver, ISO, and other styles
31

Zhao, Yan Wei, X. B. Tian, X. W. Xu, and Xu Ming Chu. "Tangent Following Control Method of Arc with CNC Cutting Machine Tool." Materials Science Forum 626-627 (August 2009): 477–82. http://dx.doi.org/10.4028/www.scientific.net/msf.626-627.477.

Full text
Abstract:
In order to solve the problem of controlling cutter rotation during cutting on a CNC cutting machine tool, the cutting principle of the CNC cutting machine tool is analyzed and the coordinate system of the machine tool's motion is built. Tangent following interpolation of arcs based on a data sampling algorithm is researched; the interpolation formula is deduced and its algorithm flow chart is given. Two kinds of inaccuracy are analyzed and their comparison graph is studied. A PC-and-controller motion control system with closed-loop speed control is used in this paper. Dynamic simulation of this interpolation is realized by testing a program written in VC++6.0 on the Googol motion control platform. The results show that tangent following interpolation can satisfy the needs of the cutting control process well.
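The data-sampling idea, one interpolation point per sampling period with the cutter kept tangent to the path, can be illustrated as follows. The formulas here are a generic reconstruction of data-sampling arc interpolation, not the paper's exact derivation:

```python
import math

def arc_samples(cx, cy, r, a0, a1, feed, T):
    """Data-sampling interpolation of an arc: one point per sampling
    period T at feed rate `feed`, plus the tangent heading the cutter
    must follow at each point (for tangent following control)."""
    arc_len = abs(a1 - a0) * r
    n = max(1, round(arc_len / (feed * T)))   # number of sampling steps
    pts = []
    for i in range(n + 1):
        a = a0 + (a1 - a0) * i / n
        x, y = cx + r * math.cos(a), cy + r * math.sin(a)
        tangent = a + math.copysign(math.pi / 2, a1 - a0)
        pts.append((x, y, tangent))
    return pts

# Quarter arc of radius 10 at 50 units/s with a 10 ms sampling period.
path = arc_samples(0.0, 0.0, 10.0, 0.0, math.pi / 2, feed=50.0, T=0.01)
```

Each sampled point carries the rotation-axis command alongside the XY position, which is the coupling the paper's interpolator has to maintain.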
APA, Harvard, Vancouver, ISO, and other styles
32

Ivanyos, Gábor, Marek Karpinski, Miklos Santha, Nitin Saxena, and Igor E. Shparlinski. "Polynomial Interpolation and Identity Testing from High Powers Over Finite Fields." Algorithmica 80, no. 2 (January 5, 2017): 560–75. http://dx.doi.org/10.1007/s00453-016-0273-1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Robinson, T. P., and G. Metternicht. "Testing the performance of spatial interpolation techniques for mapping soil properties." Computers and Electronics in Agriculture 50, no. 2 (February 2006): 97–108. http://dx.doi.org/10.1016/j.compag.2005.07.003.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Li, Yongjian, Yihang Xu, Mingzhi Shao, Hong Gao, Chongbang Tang, and Qian Lu. "Fast Detection Using PID Pre-Interpolation Algorithm for Magnetic Particle Testing." IEEE Transactions on Magnetics 56, no. 12 (December 2020): 1–7. http://dx.doi.org/10.1109/tmag.2020.3025050.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Gosling, P. D., and B. N. Bridgens. "Material Testing & Computational Mechanics — A New Philosophy for Architectural Fabrics." International Journal of Space Structures 23, no. 4 (November 2008): 215–32. http://dx.doi.org/10.1260/026635108786959870.

Full text
Abstract:
This paper proposes a new perspective on the analysis of fabric structures – the concept that material testing and computational mechanics are mutually dependent and, by implication, not to be considered as independent. Current representations of fabric stress-strain behaviour are based on plane-stress assumptions, and tend to simplify the available data (e.g. use of secant elastic moduli). Young's moduli and Poisson's ratios were determined for each test so as to provide the best fit plane to the scattered data points. These planar representations provided limited correlation with test data. The elastic constants do not comply with plane stress theory since coated woven fabrics are not homogeneous materials: They are composites with the interaction of orthogonal yarns making them act as a constrained mechanism. A new approach to incorporating fabric test data in structural analysis is proposed here: Use of direct correlation between pairs of stresses and strains. This avoids the inherent approximation in defining elastic constants or other parameters to quantify the fabric behaviour. A simple triangular interpolation scheme is recommended which is robust and avoids the risk of unreliable interpolation and extrapolation when using a functional representation of the data.
APA, Harvard, Vancouver, ISO, and other styles
36

Bancheri, Marialaura, Francesco Serafin, Michele Bottazzi, Wuletawu Abera, Giuseppe Formetta, and Riccardo Rigon. "The design, deployment, and testing of kriging models in GEOframe with SIK-0.9.8." Geoscientific Model Development 11, no. 6 (June 13, 2018): 2189–207. http://dx.doi.org/10.5194/gmd-11-2189-2018.

Full text
Abstract:
Abstract. This work presents a software package for the interpolation of climatological variables, such as temperature and precipitation, using kriging techniques. The purposes of the paper are (1) to present a geostatistical software that is easy to use and easy to plug in to a hydrological model; (2) to provide a practical example of an accurately designed software from the perspective of reproducible research; and (3) to demonstrate the goodness of the results of the software and so have a reliable alternative to other, more traditional tools. A total of 11 types of theoretical semivariograms and four types of kriging were implemented and gathered into Object Modeling System-compliant components. The package provides real-time optimization for semivariogram and kriging parameters. The software was tested using a year's worth of hourly temperature readings and a rain storm event (11 h) recorded in 2008 and retrieved from 97 meteorological stations in the Isarco River basin, Italy. For both the variables, good interpolation results were obtained and then compared to the results from the R package gstat.
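The empirical semivariogram that underlies every fitted model in such a package can be computed in a few lines of NumPy; the station values below are synthetic, and the 11 theoretical models and kriging variants sit on top of this step:

```python
import numpy as np

def empirical_semivariogram(xy, z, bins):
    """Classical (Matheron) semivariance estimator per distance bin."""
    n = len(z)
    i, j = np.triu_indices(n, k=1)              # all station pairs
    d = np.linalg.norm(xy[i] - xy[j], axis=1)   # pair distances
    sq = 0.5 * (z[i] - z[j]) ** 2               # semivariances
    which = np.digitize(d, bins)
    return np.array([sq[which == b].mean() if np.any(which == b) else np.nan
                     for b in range(1, len(bins))])

# Synthetic "stations" with a spatially correlated signal plus noise.
rng = np.random.default_rng(1)
xy = rng.uniform(0.0, 10.0, size=(50, 2))
z = np.sin(xy[:, 0]) + rng.normal(0.0, 0.1, 50)
gamma = empirical_semivariogram(xy, z, bins=np.linspace(0.0, 5.0, 6))
```

Fitting a theoretical semivariogram (spherical, exponential, etc.) to `gamma` is the optimization step the package performs before kriging.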
APA, Harvard, Vancouver, ISO, and other styles
37

Luo, Yuan, Chao Ji, Yi Zhang, and Zhang Fang Hu. "A Fractal Based Subpixel Image Edge Detection Algorithm." Applied Mechanics and Materials 239-240 (December 2012): 1546–51. http://dx.doi.org/10.4028/www.scientific.net/amm.239-240.1546.

Full text
Abstract:
When machine vision methods are applied to measure MEMS dynamic parameters, the test images have a certain degree of blur. This paper presents a sub-pixel algorithm based on fractals and the wavelet transform: first, the self-similar characteristics of fractal interpolation are used to overcome the problem that image edges cannot be accurately interpolated and reconstructed; then, owing to its high resolution and noise immunity, the wavelet transform modulus maxima method is used for image edge detection. The experimental results show that the algorithm can reach 0.02 pixel accuracy.
APA, Harvard, Vancouver, ISO, and other styles
38

Naceur, Selmi, and Bel Hadj Salah Hedi. "Finite Element and Experimental Investigation of the Multipoint Flexible Hydroforming." Key Engineering Materials 554-557 (June 2013): 1290–97. http://dx.doi.org/10.4028/www.scientific.net/kem.554-557.1290.

Full text
Abstract:
Multi-point flexible forming (MPF) is a relatively recent flexible technique [1]: instead of conventional fixed-shape die sets, the basic idea is to form the sheet metal between a pair of opposed matrices of punch elements by adjusting the height of the punch elements [2]. Production of many parts with different geometries becomes possible with one device, avoiding the design and manufacture of various dies and leading to great savings in time and manufacturing cost, especially for small-batch or single-part production. The hydroforming process is attractive compared with conventional solid-die forming: one of the two forming tools (punch or die) is replaced by hydraulic pressure, so only one tool is needed to define the final shape of the formed sheet. The multipoint flexible hydroforming proposed in this paper is an original process which combines hydroforming and multipoint flexible forming [3] to obtain a synergy of the advantages of both processes. The new process keeps the whole flexibility of the basic multipoint flexible forming (with two dies) by using, on one side only, a single multipoint die to define the final part shape; the fluid pressure applied on the other side of the sheet metal part advantageously substitutes for the second die. 
Firstly, numerical simulations were carried out to quantify the effect of the most influential parameters on process performance and to highlight the ability of this new process to produce complex forms, as well as its contribution to quality relative to existing flexible processes. Secondly, to prove feasibility and carry out a valuable experimental investigation of multipoint flexible hydroforming, an experimental prototype was designed and built, and doubly curved shell parts were successfully obtained with the new process test set-up. The part profiles and thickness distributions agreed with those obtained by numerical investigation; furthermore, numerical investigations of efficient methods to suppress the dimpling phenomenon and edge buckling were confirmed experimentally. The investigations show that parameters attached to the discrete character of the multipoint tool have an important effect on the quality of the final sheet metal product, such as the punch element density, the punch element tip curvature radius, and the blank and elastomeric interpolator thicknesses. The simulation results show essentially that an adequate parameter setting can improve the thickness distribution, reduce residual stress, and attenuate dimples. References: [1] Zhong-Yi Cai, Shao-Hui Wang, Ming-Zhe Li (2008), Numerical investigation of multi-point forming process for sheet metal: wrinkling, dimpling and spring back, Int J Adv Manuf Technol 37:927–936. [2] Zhong-Yi Cai, Shao-Hui Wang, Xu-Dong Xu, Ming-Zhe Li (2009), Numerical simulation for the multi-point stretch forming process of sheet metal, Journal of Materials Processing Technology 209:396–407. [3] N. Selmi, H. Bel Hadj Salah, Simulation numérique de l'hydroformage à matrice flexible, 7èmes Journées Scientifiques en Mécanique et Matériaux JSTMM2010, Hammamet, 26–27 November 2010.
APA, Harvard, Vancouver, ISO, and other styles
39

Quarles, Charles. "Matthew 27:52-53 as a Scribal Interpolation: Testing a Recent Proposal." Bulletin for Biblical Research 27, no. 2 (January 1, 2017): 207–26. http://dx.doi.org/10.5325/bullbiblrese.27.2.0207.

Full text
Abstract:
In his recent commentary on Matthew, C. A. Evans suggested that Matt 27:52-53 is an early scribal interpolation. Although no extant manuscripts omit these verses, Evans argued that the absence of references to these verses in Christian literature prior to the Council of Nicaea supports this conjectural emendation. Although several German and Dutch scholars of the 18th and 19th centuries also proposed this emendation, most appealed primarily to historical and theological evidence. A thorough text-critical analysis has not previously been published. This essay explores the internal and external evidence and concludes that these verses belong to the earliest text of Matthew that can be established from currently available evidence.
APA, Harvard, Vancouver, ISO, and other styles
40

Bustomi, M. A. "Testing of Image Resolution Enhancement Techniques Using Bi-cubic Spatial Domain Interpolation." Journal of Physics: Conference Series 1417 (December 2019): 012028. http://dx.doi.org/10.1088/1742-6596/1417/1/012028.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

van Loon, M. "Testing interpolation and filtering techniques in connection with a semi-Lagrangian method." Atmospheric Environment. Part A. General Topics 27, no. 15 (October 1993): 2351–64. http://dx.doi.org/10.1016/0960-1686(93)90403-l.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Alshehri, Hashim M., Hasib Khan, and Zareen A. Khan. "Existence and Numerical Analysis of Imperfect Testing Infectious Disease Model in the Sense of Fractional-Order Operator." Journal of Function Spaces 2021 (July 28, 2021): 1–11. http://dx.doi.org/10.1155/2021/3297562.

Full text
Abstract:
In the present paper, we study a mathematical model of an imperfect testing infectious disease model in the sense of the Mittag-Leffler kernel. The Banach contraction principle has been used for the existence and uniqueness of solutions of the suggested model. Furthermore, a numerical method equipped with Lagrange polynomial interpolation has been utilized for the numerical outcomes. Diagramming and discussion are used to clarify the effects of the related parameters in the fractional-order imperfect testing infectious disease model.
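The Lagrange interpolation building block used by such numerical schemes can be sketched directly from its definition; this is a generic illustration, not the paper's fractional-order solver:

```python
def lagrange_eval(xs, ys, x):
    """Evaluate the Lagrange interpolation polynomial through the
    points (xs[i], ys[i]) at a query point x."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        li = 1.0
        for j, xj in enumerate(xs):
            if j != i:
                li *= (x - xj) / (xi - xj)   # Lagrange basis polynomial
        total += yi * li
    return total

# Through (0,0), (1,1), (2,4) the polynomial is x^2, so at 1.5 we get 2.25.
print(lagrange_eval([0.0, 1.0, 2.0], [0.0, 1.0, 4.0], 1.5))  # -> 2.25
```

In the fractional-order setting, the same polynomial replaces the integrand between grid points so each memory term can be integrated in closed form.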
APA, Harvard, Vancouver, ISO, and other styles
43

Shen, Bin, Dang Jin Qi, Liu Qun Fan, and Zhi Hao Zhu. "Direct Interpolation of Tool Orientation in 5-Axis Milling." Advanced Materials Research 228-229 (April 2011): 402–7. http://dx.doi.org/10.4028/www.scientific.net/amr.228-229.402.

Full text
Abstract:
Circumferential milling with a 5-axis machine or machining center considerably improves machining efficiency and the surface quality of the products. However, when the inclined inside surface of a cavity is circumferentially milled with the default linear interpolation of rotation angles, a serious non-linear error results, which appears as overcutting or undercutting; e.g., an inclined plane will be over-milled by a machine with a dual swivel-head machining center. In order to resolve this problem, a direct interpolation algorithm for tool orientation is developed after analyzing the linear interpolation of rotation angles and its non-linear error. The algorithm is reported and tested on a machine with an AB-axis swivel head. The testing results prove that it can resolve the problem of non-linear error.
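The abstract does not reproduce the algorithm's formulas, but a common realization of direct tool-orientation interpolation, assumed here for illustration, is spherical linear interpolation (slerp) of the tool axis vector, which keeps the axis on the great-circle path that linear interpolation of the A/B rotation angles departs from:

```python
import numpy as np

def slerp(v0, v1, t):
    """Direct interpolation of tool orientation: rotate the axis v0
    toward v1 at a constant angular rate (spherical linear interpolation)."""
    v0, v1 = v0 / np.linalg.norm(v0), v1 / np.linalg.norm(v1)
    omega = np.arccos(np.clip(v0 @ v1, -1.0, 1.0))  # angle between axes
    if np.isclose(omega, 0.0):
        return v0
    return (np.sin((1 - t) * omega) * v0 + np.sin(t * omega) * v1) / np.sin(omega)

a = np.array([0.0, 0.0, 1.0])   # tool axis at block start
b = np.array([1.0, 0.0, 0.0])   # tool axis at block end
mid = slerp(a, b, 0.5)          # stays on the unit sphere at mid-block
```

The machine's A/B rotation angles are then recovered from the interpolated axis by inverse kinematics at every interpolation period.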
APA, Harvard, Vancouver, ISO, and other styles
44

Beck, Markus, and Gerhard Fischerauer. "Modeling Warp in Corrugated Cardboard Based on Homogenization Techniques for In-Process Measurement Applications." Applied Sciences 12, no. 3 (February 6, 2022): 1684. http://dx.doi.org/10.3390/app12031684.

Full text
Abstract:
A model for describing warp—characterized as a systematic, large-scale deviation from the intended flat shape—in corrugated board based on Kirchhoff plate theory is proposed. It is based on established homogenization techniques and only a minimum of model assumptions. This yields general results applicable to any kind of corrugated cardboard. Since the model is intended to be used with industrial data, basic material properties which are usually not measured in practice are summarized to a few parameters. Those parameters can easily be fitted to the measurement data, allowing the user to systematically identify ways to reduce warp in a given situation in practice. In particular, the model can be used both as a filter to separate the warp from other surface effects such as washboarding, and to interpolate between discrete sample points scattered across the surface of a corrugated board sheet. Applying the model only requires height measurements of the corrugated board at several known (not necessarily exactly predetermined) locations across the corrugated board and acts as an interpolation or regression method between those points. These data can be acquired during production in a cost-efficient way and do not require any destructive testing of the board. The principle of an algorithm for fitting measured data to the model is presented and illustrated with examples taken from ongoing measurements. Additionally, the case of warp-free board is analyzed in more detail to deduce additional theoretical conditions necessary to reach this state.
APA, Harvard, Vancouver, ISO, and other styles
45

Kvasnikov, V. P., S. V. Yehorov, and T. Yu Shkvarnytska. "Technology for restoring functional dependencies to determine reliability parameters." BULLETIN OF THE KARAGANDA UNIVERSITY-MATHEMATICS 101, no. 1 (March 30, 2021): 78–86. http://dx.doi.org/10.31489/2021m1/78-86.

Full text
Abstract:
The problem of determining the properties of an object by analyzing the numerical and qualitative characteristics of a discrete sample is considered. A method has been developed to determine the probability of trouble-free operation of electronic systems for the case in which different interpolation polynomials are used between interpolation nodes, and another for the case in which the interpolation polynomial is the same over the entire interpolation domain. It is shown that local interpolation methods give more accurate results than global interpolation methods. It is shown that in the case of global interpolation it is possible to determine the value of the function outside the given values by extrapolation methods, which makes it possible to predict the probability of failure. It is shown that the use of approximation methods to determine the probability of trouble-free operation reduces the error of the second kind. A method for analyzing the qualitative characteristics of functional dependences has been developed, which allows choosing the optimal interpolation polynomial. With sufficient statistics, using goodness-of-fit criteria, it is possible to build mathematical models for the analysis of failure statistics of electronic equipment. If the volume of statistics is not large, such statistics may not be sufficient and applying goodness-of-fit criteria will lead to unsatisfactory results. Another approach is to use an approximation method applied to statistical material collected during testing or controlled operation. In this regard, it is extremely important to develop a method for determining the reliability of electronic systems when the collected statistics of electronic equipment failures are insufficient.
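The claim that local interpolation beats global interpolation has a classic illustration, Runge's phenomenon; the example below is ours, not the paper's:

```python
import numpy as np

# Runge's function: a single global high-order polynomial oscillates
# badly near the interval ends, while local piecewise-linear
# interpolation through the same nodes stays well behaved.
f = lambda x: 1.0 / (1.0 + 25.0 * x**2)
nodes = np.linspace(-1.0, 1.0, 11)
xs = np.linspace(-1.0, 1.0, 201)

global_fit = np.polyval(np.polyfit(nodes, f(nodes), 10), xs)
local_fit = np.interp(xs, nodes, f(nodes))

global_err = np.max(np.abs(global_fit - f(xs)))  # large near +/-1
local_err = np.max(np.abs(local_fit - f(xs)))
```

As the abstract notes, the global form still has a use: unlike the piecewise-linear fit, the polynomial can be evaluated outside the node range, i.e. extrapolated, which is what enables failure prediction.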
APA, Harvard, Vancouver, ISO, and other styles
46

Stasik, Paweł M., and Julian Balcerek. "Extensible Implementation of Reliable Pixel Art Interpolation." Foundations of Computing and Decision Sciences 44, no. 2 (June 1, 2019): 213–39. http://dx.doi.org/10.2478/fcds-2019-0011.

Full text
Abstract:
Pixel art is an aesthetic that emulates the graphical style of old computer systems. Graphics created in this style need to be scaled up for presentation on modern displays. The authors propose two new modifications of image scaling for this purpose: a proximity-based coefficient correction and a transition area restriction. Moreover, a new interpolation kernel is introduced. The presented approaches are aimed at reliable and flexible bitmap scaling while overcoming limitations of existing methods. The new techniques were implemented in an extensible .NET application that serves as both an executable program and a library. The project is designed for prototyping and testing interpolation operations and can easily be expanded with new functionality, either by adding it to the code or through the provided interface.
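For readers outside this niche, the baseline that pixel-art interpolators refine is plain nearest-neighbour upscaling, which preserves the hard edges that smoother kernels (bilinear, bicubic) blur away. A minimal sketch, with an illustrative grid and function name that are not taken from the paper's .NET implementation:

```python
def nearest_neighbour_scale(pixels, factor):
    """Upscale a 2-D grid of pixel values by an integer factor.

    Each source pixel becomes a factor x factor block in the output,
    so hard edges between colours are preserved exactly.
    """
    return [
        [pixels[y // factor][x // factor]
         for x in range(len(pixels[0]) * factor)]
        for y in range(len(pixels) * factor)
    ]

# A 2x2 checkerboard "sprite" scaled 3x into a 6x6 grid of solid blocks.
sprite = [[0, 1],
          [1, 0]]
for row in nearest_neighbour_scale(sprite, 3):
    print(row)
```

Methods like those in the paper keep this edge-preserving behaviour while smoothing the staircase artefacts that nearest-neighbour produces on diagonals and curves.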
APA, Harvard, Vancouver, ISO, and other styles
47

Johnston, D. N., and J. E. Drew. "Measurement of Positive Displacement Pump Flow Ripple and Impedance." Proceedings of the Institution of Mechanical Engineers, Part I: Journal of Systems and Control Engineering 210, no. 1 (February 1996): 65–74. http://dx.doi.org/10.1243/pime_proc_1996_210_437_02.

Full text
Abstract:
The secondary source method forms the British Standard for pump fluid-borne noise testing. It is a powerful technique but requires care to produce accurate results. This paper describes practical aspects of implementing the method, detailing the requirements for the test rig, data acquisition system and analysis. The British Standard specifies that either mathematical modelling or linear interpolation be used on the source impedance measurements. A method for smoothing the impedance results is described, which is shown to give more repeatable results than linear interpolation. Some physically realistic mathematical models of pump impedance are described, and their use in determining the internal flow ripple is discussed.
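The abstract does not specify the smoothing method used, but the general idea of smoothing noisy impedance measurements to improve repeatability can be sketched with a simple centred moving average. The data and window size below are hypothetical:

```python
def moving_average(samples, window=3):
    """Centred moving-average smoother for a list of noisy measurements.

    The window is clipped at both ends of the list so that every point
    receives a smoothed value.
    """
    half = window // 2
    out = []
    for i in range(len(samples)):
        lo = max(0, i - half)
        hi = min(len(samples), i + half + 1)
        out.append(sum(samples[lo:hi]) / (hi - lo))
    return out

# Hypothetical impedance magnitudes (arbitrary units) at successive harmonics.
raw = [10.0, 12.5, 9.0, 11.5, 10.5, 13.0, 9.5]
print(moving_average(raw))
```

Smoothing reduces the point-to-point scatter, which is why it can give more repeatable results than interpolating through the raw, noisy measurements.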
APA, Harvard, Vancouver, ISO, and other styles
48

Clausen, Michael, Andreas Dress, Johannes Grabmeier, and Marek Karpinski. "On zero-testing and interpolation of k-sparse multivariate polynomials over finite fields." Theoretical Computer Science 84, no. 2 (July 1991): 151–64. http://dx.doi.org/10.1016/0304-3975(91)90157-w.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Zheng, Huai Guo. "Research on Agricultural Products Quality Control Based on Computer Vision Information Technology." Applied Mechanics and Materials 380-384 (August 2013): 1085–88. http://dx.doi.org/10.4028/www.scientific.net/amm.380-384.1085.

Full text
Abstract:
In 2012, data released by the Ministry of Agriculture showed that the import-export trade deficit of China's agricultural products had reached 49.19 billion US dollars. China is a large agricultural country whose crop output ranks among the highest in the world, yet exports account for a low percentage of total trade, mainly because its quality-testing and detection technology is relatively backward. Against this background, this paper puts forward an advanced quality-control technology for agricultural products: computer-vision information retrieval and control. On the basis of second-order upwind and linear interpolation theory, image segmentation is used to establish a mathematical model for computer-vision inspection of agricultural products, together with the corresponding interpolation functions. Finally, taking peanut detection as an example, 50 peanut samples are inspected using non-destructive computer-vision testing, MATLAB is used to compile quality statistics, and the test results are compared with manual inspection results to verify the reliability of computer-vision quality detection.
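The paper's detection pipeline is described only at a high level. The thresholding step at the core of such an image-segmentation quality model can be sketched as follows; the patch values, threshold, and function names are illustrative assumptions, not the paper's implementation:

```python
def threshold_segment(gray, threshold):
    """Binary segmentation: mark a pixel 1 where intensity exceeds threshold."""
    return [[1 if v > threshold else 0 for v in row] for row in gray]

def defect_ratio(gray, threshold):
    """Fraction of pixels flagged as defects -- a simple quality score."""
    mask = threshold_segment(gray, threshold)
    flat = [v for row in mask for v in row]
    return sum(flat) / len(flat)

# Hypothetical 4x4 greyscale patch of a peanut surface: intensities above
# the threshold of 100 are treated as blemishes.
patch = [
    [40,  35, 120, 38],
    [42, 130, 125, 36],
    [39,  41,  37, 40],
    [38,  36,  42, 35],
]
print(defect_ratio(patch, 100))  # 3 of 16 pixels flagged
```

A real system would segment on interpolated, denoised images and apply statistical acceptance criteria rather than a single fixed threshold.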
APA, Harvard, Vancouver, ISO, and other styles
50

Kamaruddin, Sharir Aizat, Aimie Rifhan Hashim, Munawwarah Arsyad, Nur Izzati Mohd Rizal, Behzad Khairiah Lee, Noordalila Ramli, Nurrul Zalaikha Zainal, et al. "Assessment of the Spline Interpolation Method's Performance in Predicting and Mapping the Phosphate over the Coastal Waters of Pulau Tuba, Langkawi, Kedah." Journal of Tourism, Hospitality and Environment Management 7, no. 28 (June 22, 2022): 89–101. http://dx.doi.org/10.35631/jthem.728007.

Full text
Abstract:
To date, there has been limited knowledge of the applicability of the spline interpolation method for estimating coastal water quality parameters. This study assesses the performance of the regularized and tension spline interpolation techniques for predicting and mapping the phosphate concentration in the surface water of Pulau Tuba, Langkawi, Kedah. Sampling points were set up randomly along the coastal water of Pulau Tuba in November 2018. Samples were obtained with a Niskin water sampler at 1 meter below the surface and immediately transported to the marine technology laboratory. Phosphate levels were determined using an ultraviolet-visible spectrophotometer and the ascorbic acid technique, and the geolocation of each sampling point was recorded with the Global Positioning System (GPS). The training set (50%) and the testing set (50%) were chosen randomly from 20 sampling points, and the two spline interpolation methods were evaluated using the Root Mean Square Error (RMSE) statistic. The study found that the tension spline interpolation method better predicted phosphate concentration: the RMSE was 0.106 for the regularized and 0.094 for the tension spline method. Concerned parties can use the study's findings as guidelines for monitoring, managing, and developing strategies for the long-term development of the coastal waters of Pulau Tuba, Langkawi, Kedah.
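The RMSE comparison used to rank the two spline variants takes only a few lines to reproduce. The phosphate values below are hypothetical, chosen only so that the tension method comes out ahead, as it did in the study:

```python
import math

def rmse(predicted, observed):
    """Root Mean Square Error between predicted and observed values."""
    assert len(predicted) == len(observed)
    return math.sqrt(
        sum((p - o) ** 2 for p, o in zip(predicted, observed)) / len(observed)
    )

# Hypothetical phosphate predictions (mg/L) on a held-out 50% test set.
observed    = [0.40, 0.55, 0.32, 0.48]
regularized = [0.50, 0.45, 0.40, 0.38]
tension     = [0.45, 0.50, 0.36, 0.44]
print(rmse(regularized, observed), rmse(tension, observed))
```

The method with the lower RMSE on the test set is preferred, which is how the study arrives at the tension spline (0.094) over the regularized spline (0.106).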
APA, Harvard, Vancouver, ISO, and other styles
