Academic literature on the topic 'Segment mapping technique'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Segment mapping technique.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever one is available in the metadata.

Journal articles on the topic "Segment mapping technique"

1

Pandey, Shweta, and Deepak Chawla. "Evolving segments of online clothing buyers: an emerging market study." Journal of Advances in Management Research 15, no. 4 (October 1, 2018): 536–57. http://dx.doi.org/10.1108/jamr-12-2017-0121.

Full text
Abstract:
Purpose: While marketers want to drive higher repurchases for better business sustainability, repeat shopping experiences may change customer perceptions of the online channel, resulting in the emergence of new segment typologies. Therefore, the purpose of this paper is to explore the segmentation of online clothing shoppers using a repeat online clothing shopper base. Further, it analyses segment positions in a perceptual space to derive relevant positioning insights for the various segments.
Design/methodology/approach: Segmentation is done using dual bases of e-lifestyle and website quality factors, for which the scales are derived from literature and then adapted and validated using a two-phase process across two samples of 271 and 644 experienced shoppers, respectively, in India. Positions of the segments are explored using the discriminant analysis-based perceptual mapping technique.
Findings: Three segments are found, namely disengaged averse online shoppers, interactive convenience seekers, and adept online shopping optimists, with the latter two having a higher propensity to purchase clothes online. Perceptual mapping of the segment positions reveals dimensions which can be used for appropriate positioning.
Research limitations/implications: The research methodology may be replicated for other products and country contexts, and additional factors may be explored for further insights.
Practical implications: The study reveals insights on the evolving nature of segments as shoppers gain experience of online shopping for clothes and highlights the varied reasons for the growing acceptability of the online channel. The findings reveal key targeting and positioning strategies for e-marketers.
Originality/value: This is one of the first studies of its kind in India, which explores the segmentation of repeat online clothing shoppers in India using dual bases. Another distinctive feature of the study is its use of the perceptual mapping technique to draw inferences about factors that differentiate multi-segment buying behavior.
APA, Harvard, Vancouver, ISO, and other styles
2

Pepe, Alessia, Laura Pistoia, Nicola Martini, Daniele De Marchi, Andrea Barison, Aurelio Maggio, Piera Giovangrossi, et al. "Detection of Myocardial Iron Overload with Magnetic Resonance By Native T1 and T2* Mapping Using a Segmental Approach." Blood 132, Supplement 1 (November 29, 2018): 2346. http://dx.doi.org/10.1182/blood-2018-99-112559.

Full text
Abstract:
Introduction: T2* measurement of myocardial iron overload (MIO) is presently the gold standard for monitoring and tailoring chelation in thalassemia patients. Native T1 mapping has also been proposed for MIO quantification because iron is known to reduce native T1 values. No data are available in the literature comparing T1 and T2* mapping using a segmental approach including the whole left ventricle. The goal of our study was to assess the relationship between T1 and T2* values using a segmental approach.
Methods: 29 patients with hemoglobinopathies (18 females, 45.39±13.49 years) enrolled in the Extension Myocardial Iron Overload in Thalassemia (eMIOT) Network were considered. Native T1 and T2* images were acquired, respectively, with the Modified Look-Locker Inversion recovery (MOLLI) and multi-echo gradient-echo techniques. Three parallel short-axis views (basal, medium, and apical) of the left ventricle (LV) were acquired with ECG-gating. The myocardial T1 and T2* distribution was mapped into a 16-segment LV model, according to the AHA/ACC model. The lower limit of normal for each segment was established as mean±2 standard deviations on data acquired on 14 healthy volunteers. In 25 patients, post-contrast images were also acquired.
Results: T1 images showed more pronounced motion artifacts and a lower contrast-to-noise ratio, leading to the exclusion of 18/464 segments. No segments were excluded from T2* mapping. Globally, therefore, 446 segmental T1 and T2* values were considered. The means of all segmental T2* and T1 values were, respectively, 37.83±11.30 ms and 982.72±118.24 ms. Normal T2* and T1 values were found in 374 segments (83.9%), while 29 segments (6.5%) had pathologic T2* and T1 values. For 33 segments (7.4%) (13 patients), a pathologic T1 value was detected in the presence of a normal T2* value. For 10 segments (2.2%), a pathologic T2* value was detected in the presence of a normal T1 value. Of the 9 patients with pathologic T2* values in the presence of normal T1, post-contrast images were acquired in 7; in all segments with a pathologic T2* value, macroscopic fibrosis by the late gadolinium enhancement technique and/or microscopic fibrosis by T1 mapping was found. The relation between segmental T1 and T2* values is shown in the figure. For patients with pathologic segmental T2* values there was a linear relationship between T1 and T2* values (R=0.735, P<0.0001), while the whole dataset was fitted with a quadratic curve.
Conclusion: T2* and T1 mapping showed a good correlation in identifying iron by a segmental approach. However, we found scatter between the results. In 9 patients, T1 mapping was not able to detect iron, probably due to the presence of macroscopic and/or microscopic fibrosis, which is known to increase native T1. Conversely, in 13 patients T1 mapping seems to be more sensitive than T2* (sensitive to different iron chemistry, or measurement error?). Further studies on larger populations and correlation with clinical outcome are needed.
Disclosures: Pepe: Chiesi Farmaceutici S.p.A., ApoPharma Inc., and Bayer: Other: No profit support.
3

Shepherd, James, Pete Bunting, and John Dymond. "Operational Large-Scale Segmentation of Imagery Based on Iterative Elimination." Remote Sensing 11, no. 6 (March 18, 2019): 658. http://dx.doi.org/10.3390/rs11060658.

Full text
Abstract:
Image classification and interpretation are greatly aided through the use of image segmentation. Within the field of environmental remote sensing, image segmentation aims to identify regions of unique or dominant ground cover from their attributes such as spectral signature, texture and context. However, many approaches are not scalable for national mapping programmes due to limits in the size of images that can be processed. Therefore, we present a scalable segmentation algorithm, which is seeded using k-means and provides support for a minimum mapping unit through an innovative iterative elimination process. The algorithm has also been demonstrated for the segmentation of time series datasets capturing both the intra-image variation and change regions. The quality of the segmentation results was assessed by comparison with reference segments along with statistics on the inter- and intra-segment spectral variation. The technique is computationally scalable and is being actively used within the national land cover mapping programme for New Zealand. Additionally, 30-m continental mosaics of Landsat and ALOS-PALSAR have been segmented for Australia in support of national forest height and cover mapping. The algorithm has also been made freely available within the open source Remote Sensing and GIS software Library (RSGISLib).
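The seed-and-eliminate idea described in this abstract can be illustrated compactly. The sketch below is not the RSGISLib implementation: the function names, the naive k-means seeding, and the mean-spectrum merge criterion are assumptions chosen for demonstration of the iterative-elimination principle (merge every segment below the minimum mapping unit into its spectrally closest spatial neighbour, smallest first).

```python
import numpy as np
from scipy import ndimage

def kmeans_seed(image, k, iters=10, seed=0):
    """Naive k-means on pixel spectra to seed the segmentation."""
    rng = np.random.default_rng(seed)
    pixels = image.reshape(-1, image.shape[-1]).astype(float)
    centres = pixels[rng.choice(len(pixels), k, replace=False)]
    for _ in range(iters):
        labels = np.linalg.norm(pixels[:, None, :] - centres[None], axis=2).argmin(axis=1)
        for c in range(k):
            if np.any(labels == c):
                centres[c] = pixels[labels == c].mean(axis=0)
    return labels.reshape(image.shape[:2])

def segment_iterative_elimination(image, k=2, min_size=5):
    """Segment an (H, W, bands) image, then iteratively merge each segment
    smaller than the minimum mapping unit into its spectrally closest
    spatial neighbour."""
    clusters = kmeans_seed(image, k)
    segments = np.zeros(clusters.shape, dtype=int)
    next_id = 1
    for c in np.unique(clusters):            # split clusters into connected segments
        comp, n = ndimage.label(clusters == c)
        segments[comp > 0] = comp[comp > 0] + next_id - 1
        next_id += n
    flat = image.reshape(-1, image.shape[-1]).astype(float)
    while True:
        ids, sizes = np.unique(segments, return_counts=True)
        if sizes.min() >= min_size or len(ids) == 1:
            return segments
        s = ids[sizes.argmin()]              # eliminate the smallest segment first
        mask = segments == s
        ring = ndimage.binary_dilation(mask) & ~mask
        neighbours = np.unique(segments[ring])
        mean_s = flat[mask.ravel()].mean(axis=0)
        best = min(neighbours, key=lambda n_: np.linalg.norm(
            flat[(segments == n_).ravel()].mean(axis=0) - mean_s))
        segments[mask] = best                # merge into closest-spectrum neighbour
```

The published algorithm is considerably more scalable (tiled processing, attribute tables); this version only conveys the elimination loop.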
4

Tatarkanov, Aslan, Islam Alexandrov, Alexander Muranov, and Abas Lampezhev. "Development of a Technique for the Spectral Description of Curves of Complex Shape for Problems of Object Classification." Emerging Science Journal 6, no. 6 (December 1, 2022): 1455–75. http://dx.doi.org/10.28991/esj-2022-06-06-015.

Full text
Abstract:
Vascular pathology symptoms can be determined by retinal image segmentation and classification. However, the retinal images from non-invasive diagnostics have a complex structure containing tree-like vascular beds, multiple segment boundaries, false segments, and various distortions. It should be noted that segmentation of such complexly structured images does not always provide a single solution. Thus, the goal is to increase the efficiency of vascular diagnostics. This study aims to develop a technique for describing the geometric properties of complexly structured image segments used for classifying vascular pathologies based on retinal images. The advantages and disadvantages of the existing segmentation methods and algorithms were considered, and the most effective use areas of these methods and algorithms are revealed. Through the detection of retinal thrombosis, the efficiency of an algorithm for constructing a mathematical model of an arbitrarily shaped segment based on the morphological processing of binary and halftone images was justified. A modified variant of this algorithm, based on a spectral analysis procedure for arbitrarily shaped boundary curves, was used for the spectral description of complex shape curves for classifying vascular pathologies based on retinal images. Two approaches have been developed. The first allows obtaining a closing segment of the curve from a symmetric mapping of the initial parametric curves. The second involves intelligent data processing and obtaining contours of minimum thickness, forming convex sets. The results of experiments confirm the possibility of practical use of the developed technique to solve problems of vascular pathology classification based on retinal images, showing a correct forecast probability of 0.93 with all associated risk factors.
5

Madaras, Martin, and Roman Ďurikovič. "Skeleton-based 3D Surface Parameterization Applied on Texture Mapping." Journal of Applied Mathematics, Statistics and Informatics 8, no. 2 (December 1, 2012): 5–19. http://dx.doi.org/10.2478/v10294-012-0010-6.

Full text
Abstract:
Assuming a 2D manifold surface topologically equivalent to a sphere with handles, we propose a novel 3D surface parameterization along the surface skeleton. First, we use a global mapping of the surface vertices onto a computed skeleton. Second, we use a local mapping of the surrounding area of each skeleton segment into a small rectangle whose size is derived from the surface properties around the segment. Each rectangle can be textured by assigning local u,v texture coordinates. Furthermore, these rectangles are packed into a large square texture called the skeleton texture map (STM) by approximately solving a palette loading problem. Our technique enables the mapping of a texture onto the surface without the necessity of storing texture coordinates together with the model data. In other words, it is enough to store the geometry data with the STM, and the coordinates are calculated on the fly.
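The approximate packing step can be pictured with a greedy shelf packer. This is only an illustrative heuristic (the paper's palette-loading solver is not reproduced here; the function name and tallest-first ordering are assumptions): each segment's rectangle is placed left to right in rows of a fixed-width atlas, opening a new "shelf" when a row fills.

```python
def shelf_pack(rects, atlas_width):
    """Greedy shelf packing of (w, h) rectangles into an atlas of fixed width:
    tallest rectangles first, left to right, new shelf when the row is full."""
    placements, x, y, shelf_h = [], 0, 0, 0
    for w, h in sorted(rects, key=lambda r: -r[1]):
        if x + w > atlas_width:          # row full: open a new shelf below
            x, y, shelf_h = 0, y + shelf_h, 0
        placements.append({"w": w, "h": h, "x": x, "y": y})
        x += w
        shelf_h = max(shelf_h, h)        # shelf height = tallest rectangle in row
    return placements, y + shelf_h       # placements and total atlas height used
```

Sorting by height keeps shelves compact, a common approximation for this NP-hard packing problem.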
6

Han, Seong-Soo. "NAND Flash Main Memory Database Index Management Technique Using the T* Tree Segment Mapping Log." International Journal of Urban Design for Ubiquitous Computing 6, no. 2 (September 30, 2018): 7–12. http://dx.doi.org/10.21742/ijuduc.2018.6.2.02.

Full text
7

Lemenkova, Polina. "Exploring structured scripting cartographic technique of GMT for ocean seafloor modeling: A case of the East Indian Ocean." Maritime Technology and Research 3, no. 2 (February 1, 2021): 162–84. http://dx.doi.org/10.33175/mtr.2021.248158.

Full text
Abstract:
This paper examines spatial variations in the geomorphology of the Ninety East Ridge (NER), located in the Indian Ocean. The NER is an extraordinarily long linear bathymetric feature with topography reflecting a complex geophysical setting and geologic evolution. The research is based on a compilation of high-resolution bathymetric, geological, and gravity datasets clipped to the study area extent (65° - 107°E, 35°S - 21°N): General Bathymetric Chart of the Oceans (GEBCO), Earth Gravitational Model (EGM2008, EGM96). The submarine geomorphology of the NER was modeled by digitized cross-sectional profiles using Generic Mapping Tools (GMT). The accessibility of the method is explained by 1) the free datasets; 2) the open source GMT toolset; 3) the available tutorials of the GMT and the codes explained in this work. Three segments of the NER were selected, digitized, and modeled: 1) northern, 89°E, 7°S to 90°E, 7°N; 2) central, 88.4°E, 14.7°S to 88.8°E, 8.2°S; 3) southern, 87.9°E, 17°S to 87.5°E, 27°S. Measured depths were visualized in graphs, compared, and statistically analyzed by histograms. The northern segment has a steepness of 21.3° at the western slopes and 14.5° at the eastern slope. The slopes on the eastern flank have a dominant SE orientation. The central segment has a bell-shaped form, with the highest steepness compared to the northern and southern segments. The eastern flank has a steepness of 49.5°. A local depression at a distance of 50 km off the axis (90°E) continues parallel to the NER, with the shape of a narrow minor trench. The western slope has a steepness of 57.6°, decreasing to 15.6°. The southern segment has a dome-like shape. Compared to the northern and central segments, it has a less pronounced ridge crest, with a steepness of 24.9° on the west. The eastern flank has a steepness of 36.8° until 70 km, gradually becoming steeper at 44.23°. A local minor trench structure can be seen on its eastern flank (100 km off the axis). This corresponds to the very narrow, long topographic depressions stretching parallel to this segment of the NER at 90.5°E. The study contributes to regional geographic studies of Indian Ocean geomorphology and to the cartographic presentation of GMT functionality for marine research and oceanographic studies.
8

Villareal, M. K., and A. F. Tongco. "Multi-sensor Fusion Workflow for Accurate Classification and Mapping of Sugarcane Crops." Engineering, Technology & Applied Science Research 9, no. 3 (June 8, 2019): 4085–91. http://dx.doi.org/10.48084/etasr.2682.

Full text
Abstract:
This study aims to assess the classification accuracy of a novel mapping workflow for sugarcane crop identification that combines light detection and ranging (LiDAR) point clouds and remotely sensed orthoimages. The combined input data of plant-height LiDAR point clouds and multispectral orthoimages were processed using a technique called object-based image analysis (OBIA). The use of multi-source inputs makes the mapping workflow unique and is expected to yield higher accuracy compared to existing techniques. The multi-source inputs are passed through five phases: data collection, data fusion, image segmentation, accuracy validation, and mapping. Data regarding sugarcane crops were randomly collected at ten sampling sites in the study area. Five of the ten sampling sites were designated as training sites and the remaining five as validation sites. A normalized digital surface model (nDSM) was created using the LiDAR data. The nDSM was paired with the orthophoto and segmented for feature extraction in OBIA by developing a rule-set in eCognition software. A rule-set was created to classify and segment sugarcane using the nDSM and orthophoto from the training and validation sites. A machine learning algorithm called support vector machine (SVM) was used to classify entities in the image. The SVM was constructed using the nDSM. With the height parameter nDSM applied, the overall accuracy assessment was 98.74% with a Kappa index of agreement (KIA) of 97.47%, while the overall accuracy assessments of sugarcane in the five validation sites were 94.23%, 80.28%, 94.50%, 93.59%, and 93.22%. The results suggest that the mapping workflow for sugarcane crops employing OBIA, LiDAR data, and orthoimages is attainable. The techniques and process used in this study are potentially useful for the classification and mapping of sugarcane crops.
9

Pratama, I. Putu Agi, Ratna Komala Dewi, and Ni Putu Artini. "MAPPING THE CONSUMERS COFFEE POWDER MANGSI COFFEE BASED ON SEGMENTING, TARGETING, AND POSITIONING IN DENPASAR CITY." Agrisocionomics: Jurnal Sosial Ekonomi Pertanian 5, no. 1 (June 17, 2021): 102–9. http://dx.doi.org/10.14710/agrisocionomics.v5i1.8378.

Full text
Abstract:
Not all customers can be served by the company. Each company needs to identify market segments that can be served effectively by differentiating the main market segments, aiming at one or two segments, and developing products so that there are always new breakthroughs. A company, in order to excel in competition, must be able to recognize its market segment, target, and product position against its competitors. This research aims to determine the segmenting, targeting, and positioning of Mangsi Coffee powder. The sampling technique used is the nonprobability sampling method, namely accidental sampling. Segmenting and targeting are done using crosstab analysis, while positioning uses multidimensional scaling analysis and correspondence analysis. The Mangsi Coffee powder market segment, based on the characteristics of consumers, is adult men who graduated from tertiary education (last education) and work as employees with monthly expenditure above the Denpasar City UMK in 2019 (Rp. 2,553,000.00). Mangsi Coffee consumers, based on psychographic segmentation, tend to choose quality products and make coffee consumption habits a trend and lifestyle, while Mangsi Coffee consumers based on segmentation of consumer behavior tend to choose products according to their benefits. Positioning using multidimensional scaling (MDS) analysis shows that Mangsi Coffee powder is not in one quadrant with the three competing products. The map shows that all four products are in different quadrants. Correspondence analysis (CA) shows that the superiority of Mangsi Coffee powder products, when compared with competitors' products, is the packaging attribute. It is important for companies to pay attention to segmenting, targeting, and positioning to be able to focus on achieving company goals and survive in fierce market competition. The strategy to increase sales is carried out by adjusting the results of the studies in this research to the conditions of the Mangsi Coffee company so that it can adopt policies that are in accordance with company goals.
10

Uematsu, Sumio. "Thermographic imaging of cutaneous sensory segment in patients with peripheral nerve injury." Journal of Neurosurgery 62, no. 5 (May 1985): 716–20. http://dx.doi.org/10.3171/jns.1985.62.5.0716.

Full text
Abstract:
Sensory examination based on the patient's subjective assessment of symptoms may raise difficult questions about whether the individual's expressed complaint is based on organic nerve damage, psychogenic factors, or even malingering. A prototype computerized telethermograph has allowed clinical quantification of peripheral nerve injury. The system makes possible mapping and imaging of the damaged area, as well as skin temperature measurements. In normal persons, the skin temperature difference between sides of the body was only 0.24° ± 0.073°C. In contrast, in patients with peripheral nerve injury, the temperature of the skin innervated by the damaged nerve deviated an average of 1.55°C (p < 0.001). The new technique requires further refinement, but it appears that use of this method may be cost-effective in helping to resolve medicolegal conflicts concerning peripheral nerve injury.

Dissertations / Theses on the topic "Segment mapping technique"

1

Daley, Joseph W. "Mixed model methods for quantitative trait loci estimation in crosses between outbred lines." Thesis, 2003. https://figshare.com/articles/thesis/Mixed_model_methods_for_quantitative_trait_loci_estimation_in_crosses_between_outbred_lines/21376767.

Full text
Abstract:

Methodology is developed for Quantitative Trait Loci (QTL) analysis in F2 and backcross designed experiments between outbred lines using a mixed model framework through the modification of segment mapping techniques. Alleles are modelled in the F1 and parental generations allowing the estimation of individual additive allele effects while accounting for QTL segregation within lines as well as differences in mean QTL effects between lines.

Initially, the theory, called F1 origin mapping, is developed for a single-trait scenario involving possibly multiple QTL and polygenic variation. Additive genetic variances are estimated via Restricted Maximum Likelihood (REML) and allele effects are modelled using Best Linear Unbiased Prediction (BLUP). Simulation studies are carried out comparing F1 origin mapping with existing segment mapping methods in a number of genetic scenarios. While there was no significant difference in the estimation of effects between the two methods, the average CPU time over one hundred replicates was 0.26 seconds for F1 origin mapping and 3.77 seconds for the segment mapping method. This improvement in computational efficiency is due to the restructuring of the IBD matrices, which results in inversion and REML iteration over much smaller matrices.

Further theory is developed which extends F1 origin mapping from single to multiple trait scenarios for F2 crosses between outbred lines. A bivariate trait is simulated using a single QTL with and without a polygenic component. A single trait and bivariate trait analysis are performed to compare the two approaches. There was no significant difference in the estimation of QTL effects between the two approaches. However, there was a slight improvement in the accuracy of QTL position estimates in the multiple trait approach. The advantage of F1 origin mapping with regard to computational efficiency becomes even more important with multiple trait analysis and allows the investigation of interesting biological models of gene expression.

F1 origin mapping is developed further to model the correlation structure inherent in repeated measures data collected on F2 crosses between outbred lines. A study was conducted to show that repeated measures F1 origin mapping and multiple trait F1 origin mapping give similar results in certain circumstances. Another simulation study was also conducted, in which five regular repeated measures were simulated with allele breed difference effects and allele variances increasing linearly over time. Various polynomial orders of fit were investigated, with the linear order of fit most parsimoniously modelling the data. The linear order of fit correctly identified the increasing trend in both the additive allele difference and the allele variance. Repeated measures F1 origin mapping possesses the benefits of using the correlated nature of repeated measures while increasing the efficiency of QTL parameter estimation. Hence, it would be useful for QTL studies on measurements such as milk yield or live weights collected at irregular intervals.

Theory is developed to combine the data from QTL studies involving F2 and backcross designed experiments. Genetic covariance matrices are developed for random QTL effects by modelling allele variation in the parental generation instead of the offspring generation for an F2 and backcross between outbred lines. The result is a general QTL estimation method called parental origin mapping. Phenotypes and genotypes from such a study involving Romney and Merino sheep are analysed providing evidence for a QTL affecting adult and hogget fibre diameter.

By coupling these new methods with computer software programs such as ASREML, F1 origin mapping and parental origin mapping provide powerful and flexible tools for QTL studies with the ability to efficiently handle single traits, multiple traits and repeated measures.


Book chapters on the topic "Segment mapping technique"

1

Needham, R., R. Naemi, and N. Chockalingam. "Advancements in data analysis and visualisation techniques to support multiple single-subject analyses: an assessment of movement coordination and coordination variability." In Studies in Health Technology and Informatics. IOS Press, 2021. http://dx.doi.org/10.3233/shti210454.

Full text
Abstract:
Vector coding is a data analysis technique that quantifies inter-segmental coordination and coordination variability of human movement. The usual reporting of vector coding time-series data can be difficult to interpret when multiple trials are superimposed on the same figure. This study describes and presents novel data visualisations for displaying data from vector coding that support multiple single-subject analyses. The dataset used in this study describes lumbar-pelvis coordination in the transverse plane during a gait cycle. The data visualisation techniques presented in this study consist of the use of colour and data bars to map and profile coordination pattern and coordination variability data. The use of colour mapping provides the option to classify commonalities and differences in patterns of coordination between segment couplings and between individuals across a big dataset. Data bars display segmental dominancy data that can provide an intuitive summary of coupling angle distribution over time. The data visualisation in this study may provide further insight into how people with adolescent idiopathic scoliosis perform goal-orientated movements following an intervention, which would support clinical management strategies.
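The vector-coding quantities behind such visualisations reduce to a compact calculation. The sketch below uses the conventional coupling-angle formula and a nearest-centre assignment to four coordination modes; the 45°-centred binning and circular-SD variability measure are common conventions in the vector-coding literature, not necessarily the exact scheme of this chapter, and the function names are illustrative.

```python
import numpy as np

def coupling_angles(theta_prox, theta_dist):
    """Vector-coding coupling angle (deg, 0-360) from consecutive samples of
    proximal- and distal-segment angle time series."""
    gamma = np.degrees(np.arctan2(np.diff(theta_dist), np.diff(theta_prox)))
    return np.mod(gamma, 360.0)

def coordination_pattern(gamma):
    """Assign a coupling angle to the nearest of four conventional modes."""
    centres = {"in-phase": (45, 225), "anti-phase": (135, 315),
               "proximal-dominant": (0, 180), "distal-dominant": (90, 270)}
    def dist(c):
        d = abs(gamma - c) % 360
        return min(d, 360 - d)          # shortest circular distance
    return min(centres, key=lambda name: min(dist(c) for c in centres[name]))

def coordination_variability(gammas):
    """Circular standard deviation (deg) of coupling angles across trials."""
    rad = np.radians(np.asarray(gammas, dtype=float))
    R = np.hypot(np.sin(rad).mean(), np.cos(rad).mean())   # mean resultant length
    return np.degrees(np.sqrt(max(-2.0 * np.log(R), 0.0)))
```

Equal proximal and distal rotation gives 45° (in-phase); proximal-only rotation gives 0° (proximal-dominant).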
2

Brinkmann, Benjamin H., Brian Nils Lundstrom, Gregory A. Worrell, and Terrence D. Lagerlund. "Quantitative EEG Analysis, EEG Mapping, and Magnetoencephalography." In Clinical Neurophysiology, edited by Devon I. Rubin, 247–60. 5th ed. Oxford University PressNew York, 2021. http://dx.doi.org/10.1093/med/9780190067854.003.0015.

Full text
Abstract:
Abstract Quantitative EEG Analysis, EEG Mapping, and Magnetoencephalography reviews quantitative methods that facilitate interpretation of multichannel electroencephalographic (EEG) recordings and extract information not readily obtainable with visual analysis alone. Fourier analysis assesses dominant background frequencies, looks for trends over long periods, and can be applied to epileptic seizures. Pattern recognition can automatically detect epileptiform discharges, seizures, and high-frequency oscillations. Montage reformatting allows the same EEG segment to be viewed using different montages, including the Laplacian montage. Cross-correlation and cross-spectral analysis quantifies time relationships between channels. Interpolation techniques and topographical displays help visualize spatial distributions of cortical activity. Multivariate statistical methods find independent components of a multichannel EEG recording and can be used for artifact removal. Machine learning approaches can facilitate identification of seizures and interictal features. Cortical projection and source localization techniques attempt to resolve underlying generators of EEG activity. Magnetic source imaging finds underlying generators of the magnetoencephalogram.
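As a small illustration of the Fourier-analysis step mentioned in this abstract, the sketch below estimates the dominant background frequency of an EEG segment from its FFT power spectrum. The Hann windowing, DC-bin exclusion, and function name are assumptions for demonstration, not details taken from the chapter.

```python
import numpy as np

def dominant_frequency(eeg, fs):
    """Dominant background frequency (Hz) of an EEG segment:
    demean, apply a Hann window, take the FFT power spectrum,
    and return the peak frequency excluding the DC bin."""
    x = np.asarray(eeg, dtype=float)
    x = x - x.mean()
    power = np.abs(np.fft.rfft(x * np.hanning(len(x)))) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return freqs[1:][power[1:].argmax()]
```

For a 2-second segment sampled at 256 Hz, the frequency resolution is 0.5 Hz, so a 10 Hz alpha rhythm lands exactly on a bin.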
3

Men, Hao, and Kishore Pochiraju. "Algorithms for 3D Map Segment Registration." In Geographic Information Systems, 502–28. IGI Global, 2013. http://dx.doi.org/10.4018/978-1-4666-2038-4.ch031.

Full text
Abstract:
Many applications require dimensionally accurate and detailed maps of the environment. Mobile mapping devices with laser ranging devices can generate highly detailed and dimensionally accurate coordinate data in the form of point clouds. Point clouds represent scenes with numerous discrete coordinate samples obtained about a relative reference frame defined by the location and orientation of the sensor. Color information from the environment obtained from cameras can be mapped to the coordinates to generate color point clouds. Point clouds obtained from a single static vantage point are generally incomplete because neither coordinate nor color information exists in occluded areas. Changing the vantage point implies movement of the coordinate frame and the need for sensor position and orientation information. Merging multiple point cloud segments generated from different vantage points using features of the scene enables construction of 3D maps of large areas and filling in gaps left from occlusions. Map registration algorithms identify areas with common features in overlapping point clouds and determine optimal coordinate transformations that can register or merge one point cloud into another point cloud’s coordinate system. Algorithms can also match attributes other than coordinates, such as optical reflection intensity and color properties, for more efficient common point identification. The extra attributes help resolve ambiguities, reduce the time, and increase precision for point cloud registration. This chapter describes a comprehensive parametric study on the performance of a specialized Iterative Closest Point (ICP) algorithm that uses color information. This Hue-assisted ICP algorithm, a variant developed by the authors, registers point clouds in a 4D (x, y, z, hue) space. A mobile robot with an integrated 3D sensor generated the color point clouds used for verification and performance measurement of the various map registration techniques. The chapter also identifies the various algorithms required to accomplish complete map generation using mobile robots.
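The hue-assisted ICP idea can be sketched as nearest-neighbour matching in a 4D (x, y, z, weighted hue) space followed by a standard Kabsch/SVD rigid-transform solve on the xyz coordinates. This is a simplified reconstruction under stated assumptions (a fixed hue weight, no outlier rejection, no hue circularity handling), not the authors' implementation; the function names are illustrative.

```python
import numpy as np
from scipy.spatial import cKDTree

def hue_icp_step(src_xyz, src_hue, dst_xyz, dst_hue, hue_weight=1.0):
    """One ICP iteration: match in (x, y, z, hue) space, then solve the
    rigid transform on xyz via Kabsch/SVD."""
    src4 = np.column_stack([src_xyz, hue_weight * np.asarray(src_hue)])
    dst4 = np.column_stack([dst_xyz, hue_weight * np.asarray(dst_hue)])
    _, idx = cKDTree(dst4).query(src4)           # 4D nearest neighbours
    matched = dst_xyz[idx]
    ps, pd = src_xyz.mean(axis=0), matched.mean(axis=0)
    H = (src_xyz - ps).T @ (matched - pd)        # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = pd - R @ ps
    return R, t

def hue_icp(src_xyz, src_hue, dst_xyz, dst_hue, iters=20, hue_weight=1.0):
    """Iterate matching and solving; returns the accumulated R, t such that
    src @ R.T + t approximates dst."""
    R_tot, t_tot = np.eye(3), np.zeros(3)
    cur = np.asarray(src_xyz, dtype=float).copy()
    for _ in range(iters):
        R, t = hue_icp_step(cur, src_hue, dst_xyz, dst_hue, hue_weight)
        cur = cur @ R.T + t
        R_tot, t_tot = R @ R_tot, R @ t_tot + t
    return R_tot, t_tot
```

With distinctive hues and a large hue weight, correct correspondences survive a geometric offset, which is the disambiguation benefit the chapter measures.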
4

Werneck, Nicolau Leal, and Anna Helena Reali Costa. "Mapping with Monocular Vision in Two Dimensions." In Nature-Inspired Computing Design, Development, and Applications, 364–74. IGI Global, 2012. http://dx.doi.org/10.4018/978-1-4666-1574-8.ch022.

Full text
Abstract:
This article presents the problem of building bi-dimensional maps of environments when the sensor available is a camera used to detect edges crossing a single line of pixels and motion is restricted to a straight line along the optical axis. The position over time must be provided or assumed. Mapping algorithms for these conditions can be built with the landmark parameters estimated from sets of matched detections from multiple images. This article shows how maps that are correct up to scale can be built without knowledge of the camera intrinsic parameters or speed during uniform motion, and how performing an inverse parameterization of the image coordinates turns the mapping problem into the fitting of line segments to a group of points. The resulting technique is a simplified form of visual SLAM that can be better suited for applications such as obstacle detection in mobile robots.
5

Kaskes, Pim, Thomas Déhais, Sietze J. de Graaff, Steven Goderis, and Philippe Claeys. "Micro–X-ray fluorescence (µXRF) analysis of proximal impactites: High-resolution element mapping, digital image analysis, and quantifications." In Large Meteorite Impacts and Planetary Evolution VI. Geological Society of America, 2021. http://dx.doi.org/10.1130/2021.2550(07).

Full text
Abstract:
ABSTRACT Quantitative insights into the geochemistry and petrology of proximal impactites are fundamental to understand the complex processes that affected target lithologies during and after hypervelocity impact events. Traditional analytical techniques used to obtain major- and trace-element data sets focus predominantly on either destructive whole-rock analysis or laboratory-intensive phase-specific micro-analysis. Here, we present micro–X-ray fluorescence (µXRF) as a state-of-the-art, time-efficient, and nondestructive alternative for major- and trace-element analysis for both small and large samples (up to 20 cm wide) of proximal impactites. We applied µXRF element mapping on 44 samples from the Chicxulub, Popigai, and Ries impact structures, including impact breccias, impact melt rocks, and shocked target lithologies. The µXRF mapping required limited to no sample preparation and rapidly generated high-resolution major- and trace-element maps (~1 h for 8 cm2, with a spatial resolution of 25 µm). These chemical distribution maps can be used as qualitative multi-element maps, as semiquantitative single-element heat maps, and as a basis for a novel image analysis workflow quantifying the modal abundance, size, shape, and degree of sorting of segmented components. The standardless fundamental parameters method was used to quantify the µXRF maps, and the results were compared with bulk powder techniques. Concentrations of most major elements (Na2O–CaO) were found to be accurate within 10% for thick sections. Overall, we demonstrate that µXRF is more than only a screening tool for heterogeneous impactites, because it rapidly produces bulk and phase-specific geochemical data sets that are suitable for various applications within the earth sciences.
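The quantification step described above (modal abundance, size, and shape of segmented components) rests on standard connected-component analysis. A minimal sketch of that counting step, assuming a binary phase mask and the 25 µm pixels quoted in the abstract (not the authors' workflow):

```python
import numpy as np
from scipy import ndimage

def component_stats(mask, pixel_size_um=25.0):
    """Label connected components in a binary phase mask and report the
    modal abundance (area fraction) and per-component areas."""
    labels, n = ndimage.label(mask)
    sizes_px = ndimage.sum(mask, labels, index=range(1, n + 1))
    return {
        "n_components": n,
        "modal_abundance": mask.sum() / mask.size,
        "component_areas_um2": sizes_px * pixel_size_um ** 2,
    }
```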
APA, Harvard, Vancouver, ISO, and other styles
6

Huo, Jing, Matthew S. Brown, and Kazunori Okada. "CADrx for GBM Brain Tumors." In Machine Learning in Computer-Aided Diagnosis, 297–314. IGI Global, 2012. http://dx.doi.org/10.4018/978-1-4666-0059-1.ch014.

Full text
Abstract:
The goal of this chapter is to describe a Computer-Aided Therapeutic Response Assessment (CADrx) system for early prediction of drug treatment response for Glioblastoma Multiforme (GBM) brain tumors with Diffusion Weighted (DW) MR images. In conventional Macdonald assessment, tumor response is assessed nine weeks or more post-treatment. However, this chapter will investigate the ability of DW-MRI to assess response earlier, at five weeks post-treatment. The Apparent Diffusion Coefficient (ADC) map, calculated from DW images, has been shown to reveal changes in the tumor’s microenvironment preceding morphologic tumor changes. ADC values in treated brain tumors could theoretically both increase due to cell kill (and the resulting reduction in cell density) and decrease due to inhibition of edema. In this chapter, the authors investigate the effectiveness of features that quantify changes between pre- and post-treatment tumor ADC histograms to detect treatment response. There are three parts to this technique: First, tumor regions were segmented on T1w contrast-enhanced images by Otsu’s thresholding method and mapped from T1w images onto ADC images by a 3D Region of Interest (ROI) mapping tool. Second, ADC histograms of the tumor region were extracted from both pre-treatment and five-weeks-post-treatment scans and fitted by a two-component Gaussian Mixture Model (GMM). The GMM features as well as standard histogram-based features were extracted. Finally, supervised machine learning techniques were applied for classification of responders and non-responders. The approach was evaluated with a dataset of 85 patients with GBM under chemotherapy, in which 39 responded and 46 did not, based on tumor volume reduction. The authors compared AdaBoost, random forest, and support vector machine classification algorithms, using ten-fold cross validation, resulting in a best accuracy of 69.41% and a corresponding area under the curve (Az) of 0.70.
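The histogram-modeling step can be sketched with a plain EM fit of a two-component 1D Gaussian mixture; the resulting weights, means, and standard deviations are the kind of GMM features the chapter extracts. This is a generic illustration, not the authors' code, and the spread-out initialization is an assumption of the sketch.

```python
import numpy as np

def fit_gmm2(x, iters=100):
    """Fit a two-component 1D Gaussian mixture by EM; the returned
    (weights, means, stds) serve as histogram-shape features."""
    x = np.asarray(x, float)
    mu = np.array([x.min(), x.max()])            # spread-out init
    sd = np.array([x.std(), x.std()]) + 1e-9
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each sample
        pdf = w * np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2) \
              / (sd * np.sqrt(2 * np.pi))
        r = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: re-estimate mixture parameters
        n = r.sum(axis=0)
        w, mu = n / len(x), (r * x[:, None]).sum(axis=0) / n
        sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / n) + 1e-9
    return w, mu, sd
```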
APA, Harvard, Vancouver, ISO, and other styles
7

Wang, Jason T. L., and Thomas G. Marr. "Pattern Discovery and Classification in Biosequences." In Pattern Discovery in Biomolecular Data. Oxford University Press, 1999. http://dx.doi.org/10.1093/oso/9780195119404.003.0009.

Full text
Abstract:
With the significant growth of the amount of biosequence data, it becomes increasingly important to develop new techniques for finding “knowledge” from the data. Pattern discovery is a fundamental operation in such applications. It attempts to find patterns in biosequences that can help scientists to analyze the property of a sequence or predict the function of a new entity. The discovered patterns may also help to classify an unknown sequence, that is, assign the sequence to an existing family. In this chapter, we show how to discover active patterns in a set of protein sequences and classify an unlabeled DNA sequence. We use protein sequences as an example to illustrate our discovery algorithm, though the algorithm applies to sequences of any sort, including both protein and DNA. The patterns we wish to discover within a set of sequences are regular expressions of the form *X1 * X2 * ... . The X1,X2,... are segments of a sequence, that is, subsequences made up of consecutive letters, and * represents a variable length don’t care (VLDC). In matching the expression *X1 * X2 * ... with a sequence S, the VLDCs may substitute for zero or more letters in S at zero cost. The dissimilarity measure used in comparing two sequences is the edit distance, that is, the minimum cost of edit operations used to transform one subsequence to the other after an optimal and zero-cost substitution for the VLDCs, where the edit operations include insertion, deletion, and change of one letter to another (Wagner and Fischer, 1974; K. Zhang et al., 1994). 
That is, we find a one-to-one mapping from each VLDC to a subsequence of the data sequence and from each pattern subsequence to a subsequence of the data sequence such that the following two conditions are satisfied: (i) the mapping preserves the left-to-right ordering (if a VLDC at position i in the pattern maps to a subsequence starting at position i1 and ending at position i2 in the data sequence, and a VLDC at position j in the pattern maps to a subsequence starting at position j1 and ending at position j2 in the data sequence, and i < j, then i2 < j2).
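In the special case of an exact match (edit distance zero), matching the pattern *X1 * X2 * ... reduces to finding the segments as substrings in left-to-right order, with the VLDCs absorbing the gaps. A minimal sketch of that case follows; the chapter's algorithm additionally allows a nonzero edit-distance budget, which this does not.

```python
def vldc_match(segments, s):
    """Match *X1*X2*...* against sequence s: each segment must occur as
    a contiguous substring, in left-to-right order; the VLDCs absorb
    the gaps at zero cost (exact-match case, edit distance 0)."""
    pos = 0
    for seg in segments:
        i = s.find(seg, pos)
        if i < 0:
            return False
        pos = i + len(seg)
    return True
```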
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Segment mapping technique"

1

Antani, Kavit R., Bryan Pearce, Mary E. Kurz, Laine Mears, Kilian Funk, and Maria E. Mayorga. "Manual Precedence Mapping and Application of a Novel Precedence Relationship Learning Technique to Real-World Automotive Assembly Line Balancing." In ASME 2013 International Manufacturing Science and Engineering Conference collocated with the 41st North American Manufacturing Research Conference. American Society of Mechanical Engineers, 2013. http://dx.doi.org/10.1115/msec2013-1235.

Full text
Abstract:
An assembly line is a flow-oriented production system where the productive units performing the operations, referred to as stations, are aligned in a serial manner. The work pieces visit stations successively as they are moved along the line usually by some kind of transportation system, e.g., a conveyor belt. An important decision problem, called Assembly Line Balancing Problem (ALBP), arises and has to be solved when (re-) configuring an assembly line. It consists of distributing the total workload for manufacturing any unit of the product to be assembled among the work stations along the line. The assignment of tasks to stations is constrained by task sequence restrictions which can be expressed in a precedence graph. However, most manufacturers usually do not have precedence graphs or if they do, the information on their precedence graphs is inadequate. As a consequence, the elaborate solution procedures for different versions of ALBP developed by more than 50 years of intensive research are often not applicable in practice. Unfortunately, the known approaches for precedence graph generation are not suitable for the conditions in the automotive industry. Therefore, we describe a detailed application of a new graph generation approach first introduced by Klindworth et al. [1] that is based on learning from past feasible production sequences. This technique forms a sufficient precedence graph that guarantees feasible line balances. Experiments indicate that the proposed procedure is able to approximate the real precedence graph sufficiently well to detect nearly optimal solutions even for a real-world automotive assembly line segment with up to 317 tasks. In particular, it seems to be promising to use interviews with experts in a selective manner by analyzing maximum and minimum graphs to identify still assumed relations that are crucial for the graph’s structure. 
Thus, the new approach seems to be a major step to close the gap between theoretical line balancing research and practice of assembly line planning.
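The learning idea above, inferring precedence relations from past feasible production sequences, can be caricatured in a few lines: a candidate relation between tasks a and b survives only if a preceded b in every observed sequence. This is a deliberately crude sketch in the spirit of the approach, not the procedure of Klindworth et al.

```python
def learn_precedence(sequences):
    """Keep a candidate relation (a, b) only if task a appeared before
    task b in every observed feasible sequence; what survives is a
    conservative estimate of the precedence constraints."""
    tasks = list(sequences[0])
    cand = {(a, b) for a in tasks for b in tasks if a != b}
    for seq in sequences:
        rank = {t: i for i, t in enumerate(seq)}
        cand = {(a, b) for (a, b) in cand if rank[a] < rank[b]}
    return cand
```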
APA, Harvard, Vancouver, ISO, and other styles
2

Yang, Yahan, Ali Samii, Zhenlong Zhao, and Guotong Ren. "Semi-Elimination Methodology for Simulating High Flow Features in a Reservoir." In SPE Reservoir Simulation Conference. SPE, 2021. http://dx.doi.org/10.2118/203925-ms.

Full text
Abstract:
Abstract Despite the rapid rise of computing power and advances in computational techniques in past decades, it is still challenging in reservoir simulation to model complex and detailed features that are represented by small cells with large permeability values, for example, fractures, multi-segment wells, etc. While those features may carry a large amount of flow and thus have a significant impact on the performance prediction, the combination of small volume and large permeability unfortunately leads to well-known time stepping and convergence difficulties during Newton iteration. We address this issue of high flow through small cells by developing a new semi-elimination computational technique. At the beginning of simulation, we construct a set of pressure basis which is a mapping from pressures at surrounding cells in the bulk of reservoir to pressures at those small cells. Next, we start the time-stepping scheme. For each time step or iteration within a time step, small cells are first employed to provide an accurate computation of flow rates and derivatives using upstream weighting and a flow partitioning scheme. Afterwards, small cells are eliminated and a linear system of equations is assembled and solved involving only bulk cells. This semi-elimination technique allows us to fundamentally avoid the drawbacks caused by including small cells in the global system of equations, while capturing their effect on the flow of hydrocarbon in the reservoir. One of the advantages of the proposed techniques over other existing methods is that it is fully implicit and preserves upstream weighting and compositions of the flow field even after small cells are eliminated, which enhances numerical stability and accuracy of simulation results. Application of this technique to several synthetic and field models demonstrates significant performance and accuracy improvement over standard approaches. 
This method thus offers a practical way to model complex and dynamic flow behaviors in important features without incurring penalties in speed and robustness of the simulation.
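A linear-algebra analogue of eliminating the small cells is static condensation via the Schur complement: the small-cell unknowns are removed from the global system and recovered by back-substitution. The sketch below is only that analogue; the paper's technique is fully implicit and preserves upstream weighting through a flow-partitioning scheme, which a plain linear elimination does not capture.

```python
import numpy as np

def schur_eliminate(A_bb, A_bs, A_sb, A_ss, f_b, f_s):
    """Solve the block system
        [A_bb A_bs][x_b]   [f_b]
        [A_sb A_ss][x_s] = [f_s]
    by eliminating the small-cell unknowns x_s first (Schur complement),
    so the assembled solve involves only bulk cells."""
    Ainv_sb = np.linalg.solve(A_ss, A_sb)
    Ainv_fs = np.linalg.solve(A_ss, f_s)
    S = A_bb - A_bs @ Ainv_sb                  # Schur complement
    x_b = np.linalg.solve(S, f_b - A_bs @ Ainv_fs)
    x_s = Ainv_fs - Ainv_sb @ x_b              # recover small cells
    return x_b, x_s
```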
APA, Harvard, Vancouver, ISO, and other styles
3

BERKOWITZ, KATHERINE, RISHABH D. GUHA, OLUWATIMILEHIN OLUWAJIRE, and LANDON GRACE. "A MACHINE LEARNING APPROACH FOR IMPACT DAMAGE QUANTIFICATION IN POLYMER MATRIX COMPOSITES." In Proceedings for the American Society for Composites-Thirty Seventh Technical Conference. Destech Publications, Inc., 2022. http://dx.doi.org/10.12783/asc37/36412.

Full text
Abstract:
Largely due to superior properties compared to traditional materials, the use of polymer matrix composites (PMC) has been expanding in several industries such as aerospace, transportation, defense, and marine. However, the anisotropy and nonhomogeneity of these structures contribute to the difficulty in evaluating structural integrity; damage sites can occur at multiple locations and length scales and are hard to track over time. This can lead to unpredictable and expensive failure of a safety-critical structure, thus creating a need for non-destructive evaluation (NDE) techniques which can detect and quantify small-scale damage sites and track their progression. Our research group has improved upon classical microwave techniques to address these needs; utilizing a custom device to move a sample within a resonant cavity and create a spatial map of relative permittivity. We capitalize on the inevitable presence of moisture within the polymer network to detect damage. The differing migration inclinations of absorbed water molecules in a pristine versus a damaged composite alters the respective concentrations of the two chemical states of moisture. The greater concentration of free water molecules residing in the damage sites exhibit highly different relative permittivity when compared to the higher ratio of polymer-bound water molecules in the undamaged areas. Currently, the technique has shown the ability to detect impact damage across a range of damage levels and gravimetric moisture contents but is not able to specifically quantify damage extent with regards to impact energy level. The applicability of machine learning (ML) to composite materials is substantial, with uses in areas like manufacturing and design, prediction of structural properties, and damage detection. Using traditional NDE techniques in conjunction with supervised or unsupervised ML has been shown to improve the accuracy, reliability, or efficiency of the existing methods. 
In this work, we explore the use of a combined unsupervised/supervised ML approach to determine a damage boundary and quantify damage in single-impact specimens. Dry composite specimens were damaged via drop tower to induce one central impact site of 0, 2, or 3 Joules. After moisture exposure, each specimen underwent dielectric mapping, and spatial permittivity maps were created at a variety of gravimetric moisture contents. An unsupervised K-means clustering algorithm was applied to the dielectric data to segment the levels of damage and define a damage boundary. Subsequently, supervised learning was used to quantify damage using features including but not limited to thickness, moisture content, permittivity values of each cluster, and average distance between points in each cluster. A regression model was trained on several samples with impact energy as the predicted variable. Evaluation was then performed based on prediction accuracy for samples whose impact energies were not known to the model.
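The unsupervised segmentation step can be sketched with a plain k-means clustering of permittivity values; the cluster boundaries then serve as candidate damage boundaries. A self-contained 1D illustration (the quantile initialization is an assumption of this sketch, not taken from the paper):

```python
import numpy as np

def kmeans_segment(values, k=3, iters=50):
    """Plain k-means on scalar permittivity values; the cluster with the
    highest center would correspond to the most damage-affected level."""
    v = np.asarray(values, float).ravel()
    centers = np.quantile(v, np.linspace(0, 1, k))   # spread-out init
    for _ in range(iters):
        labels = np.argmin(np.abs(v[:, None] - centers), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = v[labels == j].mean()
    return labels, centers
```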
APA, Harvard, Vancouver, ISO, and other styles
4

Barone, Massimiliano. "Image Wafer Inspection based on Template Matching." In 10th International Conference on Advances in Computing and Information Technology (ACITY 2020). AIRCC Publishing Corporation, 2020. http://dx.doi.org/10.5121/csit.2020.101505.

Full text
Abstract:
This paper presents a template matching technique for detecting defects in VLSI wafer images. The method is based on traditional techniques of image analysis and image registration, but it combines the prior art of wafer image inspection in a new way, using prior knowledge such as the design layout of the VLSI wafer manufacturing process. The technique requires a golden template of the patterned wafer image under inspection, which is obtained from the wafer image itself combined with the layout design schemes. First, a mapping between physical space and pixel space is established. Then template matching is applied for a more accurate alignment between the wafer device and the template. Finally, a segmented comparison is used to find possible defects. Results of the proposed method are presented in terms of the visual quality of defect detection, any misalignment at the topology level, and the number of correctly detected defective devices.
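The alignment step can be illustrated with a brute-force normalized cross-correlation search, the textbook form of template matching; the paper's golden-template construction and segmented comparison are not reproduced here.

```python
import numpy as np

def best_match(image, template):
    """Brute-force normalized cross-correlation: return the (row, col)
    offset where the template best matches the image, and the score."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    tn = np.sqrt((t ** 2).sum()) + 1e-12
    best, best_pos = -np.inf, (0, 0)
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            w = image[y:y + th, x:x + tw]
            wz = w - w.mean()
            score = (wz * t).sum() / (np.sqrt((wz ** 2).sum()) * tn + 1e-12)
            if score > best:
                best, best_pos = score, (y, x)
    return best_pos, best
```

A production inspection pipeline would use an FFT-based correlation instead of this quadratic scan, but the score being maximized is the same.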
APA, Harvard, Vancouver, ISO, and other styles
5

Bruzzi, M., A. Baldi, A. Bartoli, I. Cupparo, S. Pallotta, A. Pasquini, M. Scaringella, and C. Talamonti. "Large-area segmented polycrystalline CVD diamond for dose mapping in advanced radiotherapy techniques." In 2016 IEEE Nuclear Science Symposium, Medical Imaging Conference and Room-Temperature Semiconductor Detector Workshop (NSS/MIC/RTSD). IEEE, 2016. http://dx.doi.org/10.1109/nssmic.2016.8069394.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Yahia, Yasser I. O. "Investigation of Segmental SPAFR Under Static Lateral Load Using Stress Mapping Pre-scale Film Techniques." In 2022 Engineering and Technology for Sustainable Architectural and Interior Design Environments (ETSAIDE). IEEE, 2022. http://dx.doi.org/10.1109/etsaide53569.2022.9906336.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Chijioke, Clinton, Arturo Rodriguez, Andres Enriquez, Vinod Kumar, Vivek Tandon, Jose Terrazas, Daniel Villanueva, and V. M. Krushnarao Kotteda. "FSI of a Cantilever Beam: FVM-FEM and Neural Network Analysis." In ASME 2022 Fluids Engineering Division Summer Meeting. American Society of Mechanical Engineers, 2022. http://dx.doi.org/10.1115/fedsm2022-87636.

Full text
Abstract:
Abstract Fluid-structure interaction (FSI) problems are becoming highly complex, as they have a wide range of applications and help model many real-world problems. The finite volume method (FVM) enforces conservation of the governing physics over the designed control volume. Presently, the FVM is the most common discretization technique used within the computational fluid dynamics (CFD) domain. Today, multiple techniques for numerically simulating strongly coupled fluid-structure systems are constantly being researched as the CFD analysis approach evolves swiftly. We introduce a segmented neural-network-based approach for learning FSI problems. The FSI simulation domain is discretized into two smaller sub-domains, i.e., fluid (FVM) and solid (FEM) domains, and an autonomous neural network is utilized for each. A Python-based scientific library couples the two networks and takes care of boundary data communication, data mapping, and equation coupling. The coupled Ansys Fluent-transient structural analysis data will be used for training the two neural networks. Changes in the geometrical and material properties of a solid structure, such as bluff/curved corners/surfaces, physical dimensions, Young's modulus of elasticity, and mass moment of inertia, will affect the dynamics of the structure.
APA, Harvard, Vancouver, ISO, and other styles
8

Anderson, Walter, Christoph Haberland, and Mohammad Elahinia. "A Minimally Invasive Cage for Spinal Fusion Surgery Utilizing Superelastic Hinges." In ASME 2013 Conference on Smart Materials, Adaptive Structures and Intelligent Systems. American Society of Mechanical Engineers, 2013. http://dx.doi.org/10.1115/smasis2013-3144.

Full text
Abstract:
A prototype cage implant for spinal fusion surgery has been designed and developed. Spinal fusion surgery is sometimes performed to alleviate low back pain. The cage implant is a spacer that sits in between two vertebrae to allow for bone growth and fusion, all while relieving compression of the spinal cord. The cage implant is minimally invasive in nature, utilizing embedded nitinol hinges as dual-purpose actuators and assembly structural elements. The cage implant utilizes elliptically shaped nitinol hinge pins as actuators that allow the cage to be in a straightened configuration before deployment and to manipulate its shape into an oblong octagon once within the disc space. A new modeling technique was developed to aid the design of the nitinol ellipses. The model is MATLAB-based and accounts for the non-Mises behavior of nitinol through a correction factor for mapping the effective stress and strain. A nitinol rod and an elliptical geometry were examined experimentally and showed the robustness of the developed model. These experiments were conducted to design the nitinol hinges for the cage implant. The cage implant is made of two different materials, nitinol hinge actuators and the containing titanium structural segments. The nitinol hinge actuators are completely enclosed within the medical-grade titanium segments through the use of selective laser sintering.
APA, Harvard, Vancouver, ISO, and other styles
9

Fallorina, Salvador, Helen Boussalis, Charles Liu, Khosrow Rad, Jane Dong, Dani Nasser, and Paul Thienphrapa. "A Generic Pipelined Task Scheduling Algorithm for Fault-Tolerant Decentralized Control of a Segmented Telescope Testbed." In ASME 2004 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2004. http://dx.doi.org/10.1115/detc2004-57701.

Full text
Abstract:
Control of complex structures requires high computational power to achieve real-time performance. Through decentralized techniques, a complex structure can be controlled by multiple lower-order local controllers, leading to reduced computational complexities. Furthermore, a decentralized approach can both simplify the development of parallel controllers and facilitate fault-tolerant designs. In our research, multiple digital signal processors are employed in a NASA-sponsored segmented telescope testbed to increase the throughput of control tasks. Although increased performance is realized when subsystems are statically mapped to specific processors for control, inefficiency arises if the number of subsystems M is not an integer multiple of the number of processors P (M > P) because (M mod P) processors are necessarily controlling more subsystems than others. Optimality is sacrificed because processors with lighter loads wait for processors with heavier loads. Furthermore, this mechanism does not lend itself favorably towards fault tolerance because the failure of a single processor will result in the failure of its subsystem. This paper describes the design and implementation of a pipelined task mapping approach for the decentralized control of a segmented reflector telescope testbed. In our pipelined processing implementation only four of the six subsystems are processed in any given control cycle; the two unprocessed subsystems in each cycle propagate about the system in a round-robin fashion, so processors are never idle. Fault tolerance is facilitated because processors are no longer tied to specific subsystems. Instead, control computations are distributed dynamically such that the pipeline flow structure is maintained. The implementation of a watchdog technology is presented for detecting possible processor failures. Experimental results are shown comparing the performance of the pipelined and straightforward approaches. 
The throughput of the system has also been estimated on a system with a larger number of processors. Such estimation shows the linearity of speedup achieved by using the pipelined approach.
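The round-robin pipelined mapping described above (six subsystems, four processed per control cycle, the skipped pair rotating) can be sketched as a schedule generator. The parameter values mirror the testbed figures quoted in the abstract; the fault-tolerance and watchdog machinery are omitted.

```python
def pipeline_schedule(M=6, P=4, cycles=6):
    """Each control cycle processes P of the M subsystems; the skipped
    M - P subsystems rotate round-robin so no processor idles and no
    subsystem is starved."""
    schedule, start = [], 0
    for _ in range(cycles):
        schedule.append([(start + i) % M for i in range(P)])
        start = (start + P) % M
    return schedule
```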
APA, Harvard, Vancouver, ISO, and other styles
10

Morris, Lloyd, Homero Murzi, Hernan Espejo, Olga Jamin Salazar De Morris, and Juan Luis Arias Vargas. "Big Data Analysis in Vehicular Market Forecasts for Business Management." In 13th International Conference on Applied Human Factors and Ergonomics (AHFE 2022). AHFE International, 2022. http://dx.doi.org/10.54941/ahfe1002299.

Full text
Abstract:
Information in various markets constitutes the primary basis for making the right decisions in a modern and globalized world. Opportunities therefore grow with the availability of data and with how the data is structured to obtain information that supports decision-making processes (Ogrean, 2018; Neubert, 2018), and even more so when business dynamics revolve around satisfying the demand for the products or services offered (Jacobs and Chase, 2009). This article proposes an analysis of the new-vehicle market through operational research techniques, addressing the behavior of vehicle sales for medium- and long-term projections for business management. The analysis is developed through Markov chains and time series analysis techniques, so a complementary approach is used to obtain predictions in future scenarios, such as analysis of sales levels related to market shares. Choi et al. (2018) indicate that one of the important applications of Big Data in business management is in the field of demand forecasting, which has become one of the common alternatives for prediction from data series over time. The data are taken from statistics of the National Association of Sustainable Mobility, from 2016 to 2019, for new vehicles in the Colombian market (Andemos, 2021). 
Merkuryeba (2019) proposes combining techniques to allow a comprehensive approach to forecasting, in which the methods complement each other. Here, the methodology of Markov chain models (Kiral and Uzun, 2017) is combined with time series analysis (Stevenson et al., 2015); this complementary approach can reach a more detailed and comprehensive level of analysis for statements about the future of the variable of interest: vehicle market sales for business management. The results showed that Markov chains were very useful in long-term analysis for sales forecasting and for its analysis by market segmentation; for this, the sales levels were ranked according to the Pareto technique. Another important contribution of the Markov chain to business management corresponds to the analysis disaggregated by sales rankings; for example, in ranking 1 (the first five brands), an expected value of 67.1% of the total sales level was obtained, and an internal analysis of this percentage ranking was also carried out. Complementarily, for the time series alternative, we start from the analysis of demand, where a seasonal behavior of vehicle sales is detected. Rockwell and Davis (2016) and Stevenson et al. (2015) establish a procedure for estimating and eliminating seasonal components by using the seasonal index. 
Additionally, Weller and Crone (2012) and Lau et al. (2018) recommend two common alternatives for measuring forecast error and selecting the most adequate technique for business management: mean absolute deviation (MAD) and mean absolute percentage error (MAPE). Of the three techniques developed (moving average, weighted moving average, and exponential smoothing), simple exponential smoothing optimized through MAPE minimization is the selected technique, with which short- and medium-term forecasts are defined. This study contributes directly to decision-making in the context of the marketing of new vehicles, as well as in academic settings, in relation to research processes on data series under a big data configuration. In this sense, it was demonstrated that the behavior of sales, segmented by market levels according to the participating brands, can be transformed into estimates of future behavior that establish an orienting mapping of business objectives with respect to the possible level of participation in market shares. Finally, the methodological scheme, under an epistemological perspective supported by technical decisions, represents an academic contribution of great relevance for business management: time series techniques are recommended for short- and medium-term forecasts, while Markov chains are recommended for the prediction and analysis of the sales structure in medium- to long-term forecasts.
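The forecasting selection described above can be illustrated with simple exponential smoothing whose constant is chosen by MAPE minimization. A minimal sketch (the grid of smoothing constants is an assumption of this illustration, not taken from the study):

```python
import numpy as np

def ses_forecast(y, alpha):
    """One-step-ahead forecasts from simple exponential smoothing."""
    f = [float(y[0])]
    for t in range(1, len(y)):
        f.append(alpha * y[t - 1] + (1 - alpha) * f[-1])
    return np.array(f)

def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    actual = np.asarray(actual, float)
    return float(np.mean(np.abs((actual - forecast) / actual)) * 100)

def best_alpha(y, grid=None):
    """Pick the smoothing constant that minimizes in-sample MAPE."""
    grid = np.linspace(0.05, 0.95, 19) if grid is None else grid
    return min(grid, key=lambda a: mape(y[1:], ses_forecast(y, a)[1:]))
```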
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Segment mapping technique"

1

Bano, Masooda. Low-Fee Private-Tuition Providers in Developing Countries: An Under-Appreciated and Under- Studied Market—Supply-Side Dynamics in Pakistan. Research on Improving Systems of Education (RISE), August 2022. http://dx.doi.org/10.35489/bsg-rise-wp_2022/107.

Full text
Abstract:
Although low-income parents’ dependence on low-fee private schools has been actively documented in the past decade, existing research and policy discussions have failed to recognise their heavy reliance on low-fee tuition providers in order to ensure that their children complete the primary cycle. By mapping a vibrant supply of low-fee tuition providers in two neighbourhoods in the twin cities of Rawalpindi and Islamabad in Pakistan, this paper argues for understanding the supply-side dynamics of this segment of the education market with the aim of designing better-informed policies, making better use of public spending on supporting private-sector players to reach the poor. Contrary to what is assumed in studies of the private tuition market, the low-fee tuition providers offering services in the Pakistani urban neighbourhoods are not teachers in government schools trying to make extra money by offering afternoon tutorial to children from their schools. Working from their homes, the tutors featured in this paper are mostly women who often have no formal teacher training but are imaginative in their use of a diverse set of teaching techniques to ensure that children from low-income households who cannot get support for education at home cope with their daily homework assignments and pass the annual exams to transition to the next grade. These tutors were motivated to offer tuition by a combination of factors ranging from the need to earn a living, a desire to stay productively engaged, and for some a commitment to help poor children. Arguing that parents expect them to take full responsibility for their children’s educational attainment, these providers view the poor quality of education in schools, the weak maternal involvement in children’s education, and changing cultural norms, whereby children no longer respect authority, as being key to explaining the prevailing low educational levels. 
The paper presents evidence that the private tuition providers, who may be viewed as education entrepreneurs, have the potential to be used by the state and development agencies to provide better quality education to children from low-income families.
APA, Harvard, Vancouver, ISO, and other styles
2

Mapping the Spatial Distribution of Poverty Using Satellite Imagery in the Philippines. Asian Development Bank, March 2021. http://dx.doi.org/10.22617/spr210076-2.

Full text
Abstract:
The “leave no one behind” principle of the 2030 Agenda for Sustainable Development requires appropriate indicators for different segments of a country’s population. This entails detailed, granular data on population groups that extend beyond national trends and averages. The Asian Development Bank, in collaboration with the Philippine Statistics Authority and the World Data Lab, conducted a feasibility study to enhance the granularity, cost-effectiveness, and compilation of high-quality poverty statistics in the Philippines. This report documents the results of the study, which capitalized on satellite imagery, geospatial data, and powerful machine-learning algorithms to augment conventional data collection and sample survey techniques.
APA, Harvard, Vancouver, ISO, and other styles
3

Mapping the Spatial Distribution of Poverty Using Satellite Imagery in Thailand. Asian Development Bank, April 2021. http://dx.doi.org/10.22617/tcs210112-2.

Full text
Abstract:
The “leave no one behind” principle of the 2030 Agenda for Sustainable Development requires appropriate indicators for different segments of a country’s population. This entails detailed, granular data on population groups that extend beyond national trends and averages. The Asian Development Bank (ADB), in collaboration with the National Statistical Office of Thailand and the World Data Lab, conducted a feasibility study to enhance the granularity, cost-effectiveness, and compilation of high-quality poverty statistics in Thailand. This report documents the results of the study, providing insights on data collection requirements, advanced algorithmic techniques, and validation of poverty estimates using artificial intelligence to complement traditional data sources and conventional survey methods.
APA, Harvard, Vancouver, ISO, and other styles
4

A Guidebook on Mapping Poverty through Data Integration and Artificial Intelligence. Asian Development Bank, May 2021. http://dx.doi.org/10.22617/spr210131-2.

Full text
Abstract:
The “leave no one behind” principle of the 2030 Agenda for Sustainable Development requires appropriate indicators to be estimated for different segments of a country’s population. The Asian Development Bank, in collaboration with the Philippine Statistics Authority, the National Statistical Office of Thailand, and the World Data Lab, conducted a feasibility study that aimed to enhance the granularity, cost-effectiveness, and compilation of high-quality poverty statistics in the Philippines and Thailand. This accompanying guide to the Key Indicators for Asia and the Pacific 2020 special supplement is based on the study, capitalizing on satellite imagery, geospatial data, and powerful machine-learning algorithms to augment conventional data collection and sample survey techniques.
APA, Harvard, Vancouver, ISO, and other styles