Academic literature on the topic 'Weighting technique'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Weighting technique.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Weighting technique"

1

Adams, Catherine M., and David W. Biers. "Effect of Paired Comparison Weighting and Weighting Context." Proceedings of the Human Factors and Ergonomics Society Annual Meeting 44, no. 37 (July 2000): 610–13. http://dx.doi.org/10.1177/154193120004403717.

Full text
Abstract:
The present study compared three subjective workload measurement techniques—a unidimensional scale (UW) and two multidimensional scales (TLX and ModSWAT)—using a continuous memory task. ModSWAT is a technique that uses the sum of the SWAT dimensions to form the workload composite rather than the standard conjoint measurement procedure. Both weighted and unweighted workload measures were obtained for the two multidimensional scales. The paired-comparison technique used by TLX was employed to weight the three SWAT dimensions. For both multidimensional scales, weighting context was also varied. Paired comparison of the workload dimensions was made either prior to the task in a general context or within the context of the task just performed. The major findings were that: (1) ModSWAT and UW were more sensitive than TLX to the task demands (i.e., difficulty) of the continuous recognition task; (2) weighting context affected the weights that were assigned to the workload dimensions for both TLX and ModSWAT; (3) weighting context did not affect the weighted workload scores; and (4) the weighted and unweighted workload composites were very similar. Results were interpreted within the context of averaging across multiple dimensions when several dimensions are insensitive to the task demands. Variability in the weights assigned contributed to the failure to find an effect of weighting.
APA, Harvard, Vancouver, ISO, and other styles
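The paired-comparison weighting procedure the abstract above describes (each dimension weighted by how often it is judged more important than the others, TLX-style) can be sketched in a few lines. The dimension names and preference judgements below are invented for illustration:

```python
from itertools import combinations

def tlx_weights(dimensions, choices):
    """Weight workload dimensions by the fraction of paired
    comparisons each one wins (the TLX-style procedure)."""
    wins = {d: 0 for d in dimensions}
    pairs = list(combinations(dimensions, 2))
    for pair in pairs:
        wins[choices[frozenset(pair)]] += 1
    return {d: wins[d] / len(pairs) for d in dimensions}

# Invented example: three dimensions, three pairwise judgements.
dims = ["mental", "temporal", "effort"]
prefs = {
    frozenset(("mental", "temporal")): "mental",
    frozenset(("mental", "effort")): "mental",
    frozenset(("temporal", "effort")): "effort",
}
weights = tlx_weights(dims, prefs)  # mental wins 2 of 3 comparisons
```

With n dimensions there are n(n-1)/2 comparisons, so the weights always sum to one.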
2

Blyth, W. G., and L. J. Marchant. "A Self-weighting Random Sampling Technique." Market Research Society. Journal. 38, no. 4 (July 1996): 1–5. http://dx.doi.org/10.1177/147078539603800411.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Ying Han, Pang, Andrew Teoh Beng Jin, and Lim Heng Siong. "Eigenvector Weighting Function in Face Recognition." Discrete Dynamics in Nature and Society 2011 (2011): 1–15. http://dx.doi.org/10.1155/2011/521935.

Full text
Abstract:
Graph-based subspace learning is a class of dimensionality reduction techniques in face recognition. The technique reveals the local manifold structure of face data that is hidden in the image space via a linear projection. However, real-world face data may be too complex to measure due to both external imaging noises and the intra-class variations of the face images. Hence, features which are extracted by the graph-based technique could be noisy. An appropriate weight should be imposed on the data features for better data discrimination. In this paper, a piecewise weighting function, known as the Eigenvector Weighting Function (EWF), is proposed and implemented in two graph-based subspace learning techniques, namely Locality Preserving Projection and Neighbourhood Preserving Embedding. Specifically, the computed projection subspace of the learning approach is decomposed into three partitions: a subspace due to intra-class variations, an intrinsic face subspace, and a subspace which is attributed to imaging noises. Projected data features are weighted differently in these subspaces to emphasize the intrinsic face subspace while penalizing the other two subspaces. Experiments on the FERET and FRGC databases are conducted to show the promising performance of the proposed technique.
APA, Harvard, Vancouver, ISO, and other styles
4

Niederlöhner, D., J. Karg, J. Giersch, and G. Anton. "The energy weighting technique: measurements and simulations." Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment 546, no. 1-2 (July 2005): 37–41. http://dx.doi.org/10.1016/j.nima.2005.03.037.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Imran, Muhammad, Abdul Ghafoor, and Victor Sreeram. "Frequency Weighted Model Order Reduction Technique and Error Bounds for Discrete Time Systems." Mathematical Problems in Engineering 2014 (2014): 1–8. http://dx.doi.org/10.1155/2014/498453.

Full text
Abstract:
Model reduction is a process of approximating higher order original models by comparatively lower order models with reasonable accuracy, in order to provide ease in design, modeling and simulation for large complex systems. Generally, model reduction techniques approximate the higher order systems over the whole frequency range. However, certain applications (like controller reduction) require frequency weighted approximation, which introduces the concept of using frequency weights in model reduction techniques. Limitations of some existing frequency weighted model reduction techniques include the lack of stability of reduced order models (for the two sided weighting case) and the absence of frequency response error bounds. A new frequency weighted technique for balanced model reduction of discrete time systems is proposed. The proposed technique guarantees stable reduced order models even for the case when two sided weightings are present. An efficient technique for computing frequency weighted Gramians is also proposed. Results are compared with other existing frequency weighted model reduction techniques for discrete time systems. Moreover, the proposed technique yields frequency response error bounds.
APA, Harvard, Vancouver, ISO, and other styles
6

Moarefi, Alireza, Rateb J. Sweis, Seyyed Mahmoud Hoseini Amiri, and Wassim A. AlBalkhy. "Shannon entropy weighting technique as a practical weighting decision-making tool in project management." International Journal of Management Concepts and Philosophy 11, no. 4 (2018): 377. http://dx.doi.org/10.1504/ijmcp.2018.096054.

Full text
APA, Harvard, Vancouver, ISO, and other styles
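The Shannon entropy weighting technique named in the two entries above has a standard, compact formulation: criteria whose scores vary more across alternatives carry more information and therefore receive larger weights. A minimal sketch, with an invented decision matrix:

```python
import math

def entropy_weights(matrix):
    """Shannon entropy weighting for an m x n decision matrix of
    non-negative scores (rows = alternatives, columns = criteria).
    Criteria whose scores vary more across alternatives get more weight."""
    m, n = len(matrix), len(matrix[0])
    k = 1.0 / math.log(m)  # normalises entropy into [0, 1]
    diversification = []
    for j in range(n):
        col = [row[j] for row in matrix]
        s = sum(col)
        p = [x / s for x in col]
        e = -k * sum(pi * math.log(pi) for pi in p if pi > 0.0)
        diversification.append(1.0 - e)  # degree of diversification
    total = sum(diversification)
    return [d / total for d in diversification]

# Invented scores: criterion 0 does not discriminate, criterion 1 does.
w = entropy_weights([[5.0, 1.0], [5.0, 9.0]])
```

A criterion with identical scores for every alternative has maximum entropy and hence zero weight, which is the method's defining property.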
7

Hoseini Amiri, Seyyed Mahmoud, Wassim A. AlBalkhy, Alireza Moarefi, and Rateb J. Sweis. "Shannon entropy weighting technique as a practical weighting decision-making tool in project management." International Journal of Management Concepts and Philosophy 11, no. 4 (2018): 377. http://dx.doi.org/10.1504/ijmcp.2018.10017262.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Zhang, Pengfei, Hong Zhao, Jin Huang, Bingcai Liu, and Fen Gao. "Suppression of stitching edge artifacts with weighting technique." Optical Engineering 53, no. 5 (May 21, 2014): 054106. http://dx.doi.org/10.1117/1.oe.53.5.054106.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Lu, George Y., and David W. Wong. "An adaptive inverse-distance weighting spatial interpolation technique." Computers & Geosciences 34, no. 9 (September 2008): 1044–55. http://dx.doi.org/10.1016/j.cageo.2007.07.010.

Full text
APA, Harvard, Vancouver, ISO, and other styles
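The inverse-distance weighting that Lu and Wong adapt can be sketched in its basic (non-adaptive) form: each sample contributes in proportion to an inverse power of its distance from the query point. The sample coordinates below are invented:

```python
def idw(points, values, query, power=2.0):
    """Basic inverse-distance-weighted interpolation at a 2-D query
    point: nearer samples get larger weights w_i = d_i ** -power."""
    num = den = 0.0
    for (x, y), v in zip(points, values):
        d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
        if d2 == 0.0:
            return v  # query coincides with a sample point
        w = d2 ** (-power / 2.0)
        num += w * v
        den += w
    return num / den

# Invented samples: the midpoint of two equidistant samples
# gets the average of their values.
est = idw([(0.0, 0.0), (2.0, 0.0)], [0.0, 10.0], (1.0, 0.0))
```

The adaptive variant in the article varies the distance-decay parameter by location rather than fixing `power` globally.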
10

Odu, G. O. "Weighting methods for multi-criteria decision making technique." Journal of Applied Sciences and Environmental Management 23, no. 8 (September 11, 2019): 1449. http://dx.doi.org/10.4314/jasem.v23i8.7.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Dissertations / Theses on the topic "Weighting technique"

1

Zakos, John. "A Novel Concept and Context-Based Approach for Web Information Retrieval." Griffith University. School of Information and Communication Technology, 2005. http://www4.gu.edu.au:8080/adt-root/public/adt-QGU20060303.104937.

Full text
Abstract:
Web information retrieval is a relatively new research area that has attracted a significant amount of interest from researchers around the world since the emergence of the World Wide Web in the early 1990s. The problems facing successful web information retrieval are a combination of challenges that stem from traditional information retrieval and challenges characterised by the nature of the World Wide Web. The goal of any information retrieval system is to fulfil a user's information need. In a web setting, this means retrieving as many relevant web documents as possible in response to an input query that is typically limited to only a few terms expressive of the user's information need. This thesis is primarily concerned with firstly reviewing pertinent literature related to various aspects of web information retrieval research and secondly proposing and investigating a novel concept and context-based approach. The approach consists of techniques that can be used together or independently and aims to improve retrieval accuracy over other approaches. A novel concept-based term weighting technique is proposed as a new method of deriving query term significance from ontologies that can be used for the weighting of input queries. A technique that dynamically determines the significance of terms occurring in documents based on the matching of contexts is also proposed. Other contributions of this research include techniques for the combination of document and query term weights for the ranking of retrieved documents. All techniques were implemented and tested on benchmark data. This provides a basis for performing comparisons with previous top performing web information retrieval systems. High retrieval accuracy is reported as a result of utilising the proposed approach. This is supported through comprehensive experimental evidence and favourable comparisons against previously published results.
APA, Harvard, Vancouver, ISO, and other styles
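The thesis above derives query term weights from ontologies; the classical baseline it improves on is TF-IDF term weighting, which can be sketched briefly (this is the standard baseline, not the thesis's concept-based method; the two-document corpus is invented):

```python
import math

def tfidf(docs):
    """Classical TF-IDF weights per document for tokenised `docs`:
    term frequency scaled by log inverse document frequency."""
    n = len(docs)
    df = {}
    for doc in docs:
        for term in set(doc):
            df[term] = df.get(term, 0) + 1
    out = []
    for doc in docs:
        tf = {}
        for term in doc:
            tf[term] = tf.get(term, 0) + 1
        out.append({t: (c / len(doc)) * math.log(n / df[t])
                    for t, c in tf.items()})
    return out

# Invented corpus: a term occurring in every document gets zero weight.
w = tfidf([["web", "retrieval"], ["web", "ontology"]])
```

Terms common to all documents are weighted to zero, which is exactly the discrimination behaviour concept-based schemes try to refine.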
2

Zakos, John. "A Novel Concept and Context-Based Approach for Web Information Retrieval." Thesis, Griffith University, 2005. http://hdl.handle.net/10072/365878.

Full text
Abstract:
Web information retrieval is a relatively new research area that has attracted a significant amount of interest from researchers around the world since the emergence of the World Wide Web in the early 1990s. The problems facing successful web information retrieval are a combination of challenges that stem from traditional information retrieval and challenges characterised by the nature of the World Wide Web. The goal of any information retrieval system is to fulfil a user's information need. In a web setting, this means retrieving as many relevant web documents as possible in response to an input query that is typically limited to only a few terms expressive of the user's information need. This thesis is primarily concerned with firstly reviewing pertinent literature related to various aspects of web information retrieval research and secondly proposing and investigating a novel concept and context-based approach. The approach consists of techniques that can be used together or independently and aims to improve retrieval accuracy over other approaches. A novel concept-based term weighting technique is proposed as a new method of deriving query term significance from ontologies that can be used for the weighting of input queries. A technique that dynamically determines the significance of terms occurring in documents based on the matching of contexts is also proposed. Other contributions of this research include techniques for the combination of document and query term weights for the ranking of retrieved documents. All techniques were implemented and tested on benchmark data. This provides a basis for performing comparisons with previous top performing web information retrieval systems. High retrieval accuracy is reported as a result of utilising the proposed approach. This is supported through comprehensive experimental evidence and favourable comparisons against previously published results.
Thesis (PhD Doctorate)
Doctor of Philosophy (PhD)
School of Information and Communication Technology
Full Text
APA, Harvard, Vancouver, ISO, and other styles
3

Albqmi, Aisha Rashed M. "Integrating three-way decisions framework with multiple support vector machines for text classification." Thesis, Queensland University of Technology, 2022. https://eprints.qut.edu.au/235898/7/Aisha_Rashed_Albqmi_Thesis_.pdf.

Full text
Abstract:
Identifying the boundary between relevant and irrelevant objects in text classification is a significant challenge due to the numerous uncertainties in text documents. Most existing binary text classifiers cannot deal effectively with this problem due to the issue of over-fitting. This thesis proposes a three-way decision model for dealing with the uncertain boundary to improve binary text classification performance by integrating the distinct aspects of three-way decision theory and the capacities of the Support Vector Machine. The experimental results show that the proposed models outperform baseline models on the RCV1, Reuters-21578, and R65CO datasets.
APA, Harvard, Vancouver, ISO, and other styles
4

Finnerman, Erik, and Carl Robin Kirchmann. "Evaluation of Alternative Weighting Techniques on the Swedish Stock Market." Thesis, KTH, Matematisk statistik, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-168294.

Full text
Abstract:
The aim of this thesis is to evaluate how the stock index SIX30RX compares against portfolios based on the same stock selection but with alternative weighting techniques. Eleven alternative weighting techniques are used, divided into three categories: heuristic, optimisation and momentum based ones. These are evaluated from 1990-01-01 until 2014-12-31. The results show that heuristic based weighting techniques outperform the SIX30RX index and show similar risk characteristics. Optimisation based weighting techniques show strong outperformance but have different risk characteristics, manifested in higher portfolio concentration and tracking error. Momentum based weighting techniques have slightly better performance and risk-adjusted performance, while their risk concentration and average annual turnover are higher than for all other techniques used. Minimum variance is the overall best performing weighting technique in terms of return and risk-adjusted return. Additionally, the equal weighted portfolio outperforms and has similar characteristics as the SIX30RX index despite its simple heuristic approach. In conclusion, all studied alternative weighting techniques except the momentum based ones clearly outperform the SIX30RX index.
APA, Harvard, Vancouver, ISO, and other styles
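Minimum variance, the best-performing technique in the abstract above, has a well-known closed form: the unconstrained weights are proportional to the inverse covariance matrix applied to a vector of ones. A minimal sketch with an invented two-asset covariance matrix:

```python
def min_variance_weights(cov):
    """Unconstrained minimum-variance portfolio: solve cov * x = 1
    by Gauss-Jordan elimination, then normalise x to sum to one
    (short positions are allowed in this formulation)."""
    n = len(cov)
    a = [row[:] + [1.0] for row in cov]  # augment with the ones vector
    for i in range(n):
        pivot = max(range(i, n), key=lambda r: abs(a[r][i]))
        a[i], a[pivot] = a[pivot], a[i]
        p = a[i][i]
        a[i] = [v / p for v in a[i]]
        for r in range(n):
            if r != i:
                f = a[r][i]
                a[r] = [v - f * u for v, u in zip(a[r], a[i])]
    x = [row[n] for row in a]
    s = sum(x)
    return [xi / s for xi in x]

# Invented covariances: the lower-variance asset gets the larger weight.
w = min_variance_weights([[0.04, 0.0], [0.0, 0.01]])
```

With uncorrelated assets the weights are simply proportional to the inverse variances, as the example shows.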
5

Boman, Trotte, and Samuel Jangenstål. "Beating the MSCI USA Index by Using Other Weighting Techniques." Thesis, KTH, Matematisk statistik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-209258.

Full text
Abstract:
In this thesis various portfolio weighting strategies are tested. Their performance is determined by their average annual return, Sharpe ratio, tracking error, information ratio and annual standard deviation. The data used is provided by Öhman from Bloomberg and consists of monthly data between 1996-2016 of all stocks that were in the MSCI USA Index at any time between 2002-2016. For any given month we use the last five years of data as a basis for the analysis. Each time the MSCI USA Index changes portfolio constituents we update which constituents are in our portfolio. The traditional weighting strategies used in this thesis are market capitalization, equal, risk-adjusted alpha, fundamental and minimum variance weighting. On top of that, the weighting strategies are used in a cluster framework where the clusters are constructed by using K-means clustering on the stocks each month. The clusters are assigned equal weight and then the traditional weighting strategies are applied within each cluster. Additionally, a GARCH-estimated covariance matrix of the clusters is used to determine the minimum variance optimized weights of the clusters, where the constituents within each cluster are equally weighted. We conclude in this thesis that the market capitalization weighting strategy is the one that earns the least of all traditional strategies. From the results we can conclude that there are weighting strategies with higher Sharpe ratio and lower standard deviation. The risk-adjusted alpha in a traditional framework performed best out of all strategies. All cluster weighting strategies with the exception of risk-adjusted alpha outperform their traditional counterpart in terms of return.
APA, Harvard, Vancouver, ISO, and other styles
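The cluster framework described above (equal weight across clusters, then a weighting strategy within each cluster) reduces, in its simplest equal-weighted form, to a few lines once cluster labels are available. The k-means step itself is assumed to have been run already; the label vector below is invented:

```python
def cluster_equal_weights(labels):
    """Equal weight per cluster, then equal weight per stock inside
    each cluster; `labels[i]` is the cluster of stock i."""
    clusters = {}
    for i, c in enumerate(labels):
        clusters.setdefault(c, []).append(i)
    k = len(clusters)
    weights = [0.0] * len(labels)
    for members in clusters.values():
        for i in members:
            weights[i] = 1.0 / (k * len(members))
    return weights

# Invented assignment: two clusters, so each gets half the portfolio,
# shared equally among its members.
w = cluster_equal_weights([0, 0, 1])
```

Swapping the inner equal weighting for any of the traditional strategies recovers the thesis's other cluster variants.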
6

Nilubol, Chanin. "Two-dimensional HMM classifier with density perturbation and data weighting techniques for pattern recognition problems." Diss., Georgia Institute of Technology, 2000. http://hdl.handle.net/1853/13538.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Shah, Kashif. "Model adaptation techniques in machine translation." Phd thesis, Université du Maine, 2012. http://tel.archives-ouvertes.fr/tel-00718226.

Full text
Abstract:
Nowadays several indicators suggest that the statistical approach to machine translation is the most promising. It allows fast development of systems for any language pair provided that sufficient training data is available. Statistical Machine Translation (SMT) systems use parallel texts, also called bitexts, as training material for creation of the translation model, and monolingual corpora for target language modeling. The performance of an SMT system heavily depends upon the quality and quantity of available data. In order to train the translation model, the parallel texts are collected from various sources and domains. These corpora are usually concatenated, word alignments are calculated and phrases are extracted. However, parallel data is quite inhomogeneous in many practical applications with respect to several factors like data source, alignment quality, appropriateness to the task, etc. This means that the corpora are not weighted according to their importance to the domain of the translation task. Therefore, it is the domain of the training resources that influences the translations that are selected among several choices. This is in contrast to the training of the language model, for which well-known techniques are used to weight the various sources of texts. We have proposed novel methods to automatically weight the heterogeneous data to adapt the translation model. In a first approach, this is achieved with a resampling technique. A weight is assigned to each bitext to select the proportion of data from that corpus. The alignments coming from each bitext are resampled based on these weights. The weights of the corpora are directly optimized on the development data using a numerical method. Moreover, an alignment score of each aligned sentence pair is used as a confidence measurement. In an extended work, we obtain such a weighting by resampling alignments using weights that decrease with the temporal distance of bitexts to the test set. By these means, we can use all the available bitexts and still put an emphasis on the most recent ones. The main idea of our approach is to use a parametric form, or meta-weights, for the weighting of the different parts of the bitexts. This ensures that our approach has only a few parameters to optimize. In another work, we have proposed a generic framework which takes into account the corpus and sentence level "goodness scores" during the calculation of the phrase table, which results in a better distribution of the probability mass of the individual phrase pairs.
APA, Harvard, Vancouver, ISO, and other styles
8

Sigweni, Boyce B. "An investigation of feature weighting algorithms and validation techniques using blind analysis for analogy-based estimation." Thesis, Brunel University, 2016. http://bura.brunel.ac.uk/handle/2438/12797.

Full text
Abstract:
Context: Software effort estimation is a very important component of the software development life cycle. It underpins activities such as planning, maintenance and bidding. Therefore, it has triggered much research over the past four decades, including many machine learning approaches. One popular approach, which has the benefit of accessible reasoning, is analogy-based estimation. Machine learning including analogy is known to significantly benefit from feature selection/weighting. Unfortunately feature weighting search is an NP-hard problem, and therefore computationally very demanding, if not intractable. Objective: Therefore, one objective of this research is to develop an efficient and effective feature weighting algorithm for estimation by analogy. However, a major challenge for the effort estimation research community is that experimental results tend to be contradictory and also lack reliability. This has been paralleled by a recent awareness of how bias can impact research results. This is a contributory reason why software effort estimation is still an open problem. Consequently the second objective is to investigate research methods that might lead to more reliable results, with a focus on blinding methods to reduce researcher bias. Method: In order to build on the most promising feature weighting algorithms I conduct a systematic literature review. From this I develop a novel and efficient feature weighting algorithm. This is experimentally evaluated, comparing three feature weighting approaches with a naive benchmark using two industrial data sets. Using these experiments, I explore blind analysis as a technique to reduce bias. Results: The systematic literature review identified 19 relevant primary studies. Results from the meta-analysis of selected studies using a one-sample sign test (p = 0.0003) show a positive effect for feature weighting in general compared with ordinary analogy-based estimation (ABE); that is, feature weighting is a worthwhile technique to improve ABE. Nevertheless the results remain imperfect, so there is still much scope for improvement. My experience shows that blinding can be a relatively straightforward procedure. I also highlight various statistical analysis decisions which ought not to be guided by the hunt for statistical significance, and show that results can be inverted merely through a seemingly inconsequential statistical nicety. After analysing results from 483 software projects from two separate industrial data sets, I conclude that the proposed technique improves accuracy over standard feature subset selection (FSS) and traditional case-based reasoning (CBR) when using pseudo time-series validation. Interestingly, there is no strong evidence for superior performance of the new technique when traditional validation techniques (jackknifing) are used, but it is more efficient. Conclusion: There are two main findings: (i) Feature weighting techniques are promising for software effort estimation, but they need to be tailored to the target case for their potential to be adequately exploited. Despite research findings showing that assuming weights differ in different parts of the instance space ('local' regions) may improve effort estimation results, the majority of studies in software effort estimation (SEE) do not take this into consideration. (ii) Whilst there are minor challenges and some limits to the degree of blinding possible, blind analysis is a very practical and easy-to-implement method that supports more objective analysis of experimental results. Therefore I argue that blind analysis should be the norm for analysing software engineering experiments.
APA, Harvard, Vancouver, ISO, and other styles
9

Leary, Emily Vanessa. "A comparison of sampling, weighting, and variance estimation techniques for the Oklahoma oral health needs assessment." Oklahoma City : [s.n.], 2006.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
10

Örn, Henrik. "Accuracy and precision of bedrock surface prediction using geophysics and geostatistics." Thesis, KTH, Mark- och vattenteknik, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-171859.

Full text
Abstract:
In underground construction and foundation engineering, uncertainties associated with subsurface properties are inevitable to deal with. Site investigations are expensive to perform, but a limited understanding of the subsurface may result in major problems, which often lead to an unexpected increase in the overall cost of the construction project. This study aims to optimize the pre-investigation program to get as much correct information as possible out of a limited input of resources, thus making it as cost effective as possible. To optimize site investigation using soil-rock sounding, three different sampling techniques, a varying number of sample points and two different interpolation methods (inverse distance weighting and point kriging) were tested on four modeled reference surfaces. The accuracy of rock surface predictions was evaluated using a 3D gridding and modeling computer software (Surfer 8.02®). Samples with continuously distributed data, resembling profile lines from geophysical surveys, were used to evaluate how this could improve the accuracy of the prediction compared to adding additional sampling points. The study explains the correlation between the number of sampling points and the accuracy of the prediction obtained using different interpolators. Most importantly, it shows how continuous data significantly improves the accuracy of the rock surface predictions, and therefore concludes that geophysical measurements should be used combined with traditional soil-rock sounding to optimize the pre-investigation program.
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Weighting technique"

1

An investigation of automatic term weighting techniques. Ann Arbor, Mich: University Microfilms International, 1985.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Nimmo, Graham, and Ben Shippey. Clinical skills in critical care. Oxford University Press, 2016. http://dx.doi.org/10.1093/med/9780199600830.003.0013.

Full text
Abstract:
This chapter provides a framework for the learning and teaching of both technical and non-technical skills. There is a deliberate weighting towards decision-making and behavioural skills because of their prevalence in practice, the importance of delivering them reliably, and the need to increase their profile in our wards, classrooms, skills centres, and curricula. The practice of clinical intensive care requires the application of a huge range of clinical skills each of which has its own knowledge base and where each necessitates the acquisition of a technique. It is necessary to consider the application of these skills in the ‘messy’, sometimes chaotic environment of the intensive care unit where multiple critically-ill patients are simultaneously requiring individual input and at the same time relatives require support, learners need teaching, and time and attention are invested in the crucial processes of audit, quality improvement and research.
APA, Harvard, Vancouver, ISO, and other styles
3

Levin, Ines, and Betsy Sinclair. Causal Inference with Complex Survey Designs. Edited by Lonna Rae Atkeson and R. Michael Alvarez. Oxford University Press, 2016. http://dx.doi.org/10.1093/oxfordhb/9780190213299.013.4.

Full text
Abstract:
This article discusses methods that combine survey weighting and propensity score matching to estimate population average treatment effects. Beginning with an overview of causal inference techniques that incorporate data from complex surveys and the usefulness of survey weights, it then considers approaches for incorporating survey weights into three matching algorithms, along with their respective methodologies: nearest-neighbor matching, subclassification matching, and propensity score weighting. It also presents the results of a Monte Carlo simulation study that illustrates the benefits of incorporating survey weights into propensity score matching procedures, as well as the problems that arise when survey weights are ignored. Finally, it explores the differences between population-based inferences and sample-based inferences using real-world data from the 2012 panel of The American Panel Survey (TAPS). The article highlights the impact of social media usage on political participation, when such impact is not actually apparent in the target population.
APA, Harvard, Vancouver, ISO, and other styles
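The propensity score weighting discussed in the chapter above rests on the inverse-probability-weighted (IPW) estimator of the average treatment effect; combining it with survey weights amounts to multiplying the two weights together. A minimal sketch with invented toy data:

```python
def ipw_ate(y, t, e, sw=None):
    """Inverse-propensity-weighted estimate of the average treatment
    effect for outcomes y, treatment indicators t, and propensity
    scores e; optional survey weights `sw` fold the sampling design
    into the estimator, as the chapter discusses."""
    sw = sw if sw is not None else [1.0] * len(y)
    total = sum(sw)
    treated = sum(w * yi * ti / ei
                  for w, yi, ti, ei in zip(sw, y, t, e))
    control = sum(w * yi * (1 - ti) / (1 - ei)
                  for w, yi, ti, ei in zip(sw, y, t, e))
    return treated / total - control / total

# Invented toy data: one treated and one control unit, both with
# propensity 0.5, so the estimate is the difference in outcomes.
ate = ipw_ate([3.0, 1.0], [1, 0], [0.5, 0.5])
```

Passing per-respondent survey weights through `sw` shifts the estimate from a sample-based to a population-based quantity, which is the distinction the chapter illustrates with the TAPS data.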
4

Franzese, Robert J., and Jude C. Hays. Empirical Models of Spatial Inter‐Dependence. Edited by Janet M. Box-Steffensmeier, Henry E. Brady, and David Collier. Oxford University Press, 2009. http://dx.doi.org/10.1093/oxfordhb/9780199286546.003.0025.

Full text
Abstract:
This article discusses the role of 'spatial interdependence' between units of analysis by using a symmetric weighting matrix for the units of observation, whose elements reflect the relative connectivity between unit i and unit j. It starts by addressing spatial interdependence in political science. There are two workhorse regression models in empirical spatial analysis: spatial lag and spatial error models. The article then addresses OLS estimation and specification testing under the null hypothesis of no spatial dependence. It turns to the topic of assessing spatial lag models, and a discussion of spatial error models. Moreover, it reports the calculation of spatial multipliers. Furthermore, it presents several newer applications of spatial techniques in empirical political science research: SAR models with multiple lags, SAR models for binary dependent variables, and spatio-temporal autoregressive (STAR) models for panel data.
APA, Harvard, Vancouver, ISO, and other styles
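The spatial weighting matrix described in the chapter above is conventionally row-standardised, so that the spatial lag Wy gives each unit the weighted average of its neighbours' outcomes. A minimal sketch with an invented three-unit contiguity structure:

```python
def row_standardise(w):
    """Row-standardise a connectivity matrix so each row sums to one
    (isolated units, with an all-zero row, keep zero weights)."""
    return [[x / sum(row) if sum(row) else 0.0 for x in row]
            for row in w]

def spatial_lag(w, y):
    """Spatial lag Wy: each unit gets the weighted average of its
    neighbours' outcomes under the row-standardised matrix."""
    return [sum(wij * yj for wij, yj in zip(row, y)) for row in w]

# Invented structure: unit 0 neighbours units 1 and 2;
# units 1 and 2 each neighbour only unit 0.
W = row_standardise([[0, 1, 1], [1, 0, 0], [1, 0, 0]])
lag = spatial_lag(W, [2.0, 4.0, 6.0])
```

In a spatial lag model this Wy term enters the regression as a right-hand-side variable, which is what makes OLS estimation problematic under spatial dependence.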

Book chapters on the topic "Weighting technique"

1

Chang, Joong Hyuk, and Nam-Hun Park. "A Novel Weighting Technique for Mining Sequence Data Streams." In IT Convergence and Security 2012, 929–36. Dordrecht: Springer Netherlands, 2012. http://dx.doi.org/10.1007/978-94-007-5860-5_112.

2

Soumeya, Belabbas, and Addou Djamel. "Weighting Schemes Based Discriminative Model Combination Technique for Robust Speech Recognition." In Artificial Intelligence and Heuristics for Smart Energy Efficiency in Smart Cities, 430–38. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-92038-8_43.

3

Gazzah, Leïla, and Leïla Najjar. "Cooperative Localization Algorithm Based on Reference Selection of Selective Weighting ILS Technique." In Lecture Notes in Computer Science, 122–33. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-13174-0_10.

4

Marsh, Kevin, Praveen Thokala, Axel Mühlbacher, and Tereza Lanitis. "Incorporating Preferences and Priorities into MCDA: Selecting an Appropriate Scoring and Weighting Technique." In Multi-Criteria Decision Analysis to Support Healthcare Decisions, 47–66. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-47540-0_4.

5

Lu, Hongqian, Wuneng Zhou, and Xingping Liu. "A Free-Weighting Matrices Technique for Stochastic Stability Analysis of Uncertain Singular Hybrid System." In Lecture Notes in Electrical Engineering, 657–65. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-38524-7_72.

6

Zhang, Ying, and Mei-Jie Zhang. "Inference of Transition Probabilities in Multi-State Models Using Adaptive Inverse Probability Censoring Weighting Technique." In Statistical Modeling in Biomedical Research, 449–81. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-33416-1_19.

7

Baltrunas, Linas, and Francesco Ricci. "Item Weighting Techniques for Collaborative Filtering." In Knowledge Discovery Enhanced with Semantic and Social Information, 109–26. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-01891-6_7.

8

Sunder, R. "Advances in Test Techniques to Characterize Fatigue and Fracture Properties for Safety Critical Applications." In Light Weighting for Defense, Aerospace, and Transportation, 103–20. Singapore: Springer Singapore, 2019. http://dx.doi.org/10.1007/978-981-15-1263-6_8.

9

Rui, Duan, Wang Xuegang, and Chen Zhuming. "Time-Varying Weighting Techniques for Airborne Bistatic Radar Clutter Suppression." In Communications in Computer and Information Science, 171–78. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-02342-2_24.

10

Çakır, Esra, Mehmet Ali Taş, and Emre Demircioğlu. "A New Weighting Method in Fuzzy Multi-criteria Decision Making: Selected Element Reduction Approach (SERA)." In Applications of Fuzzy Techniques, 20–30. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-16038-7_3.


Conference papers on the topic "Weighting technique"

1

Sandor, Ivo, Stephan Staudacher, and Gernot Hertweck. "Micro Jet Engine Oil Consumption Measurement Technique." In ASME Turbo Expo 2009: Power for Land, Sea, and Air. ASMEDC, 2009. http://dx.doi.org/10.1115/gt2009-59800.

Abstract:
Oil consumption of micro gas turbine engines plays a significant role with regard to their practical application in aerospace. In this context, an oil consumption measurement device was developed on behalf of Daimler AG for application to vehicle turbochargers. This device has been used to measure the oil consumption of an in-house-designed micro jet engine of 400 N thrust. The design of the device is based on the principle of gravimetric weighing. In the past, volumetric principles were applied to engine oil consumption measurements. Technical advances in the field of piezoelectricity have improved the accuracy of gravimetric weighing to the point that it is now comparable to volumetric gauging. Moreover, unlike volumetric gauging, gravimetric weighing is not influenced by the density of the oil or the amount of gas emulsified in it. Hence, gravimetric weighing represents a more robust and more efficient way to evaluate oil consumption in micro gas turbine engines. Unlike non-conventional measurement strategies such as emission measurement and tracer techniques, gravimetric weighing allows very simple and convenient oil consumption measurements [3]. The device was validated using defined laboratory measurements, and experimental results are shown.
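The gravimetric principle reduces to simple arithmetic: the consumption rate is the measured loss in oil mass divided by the run time. A minimal sketch with illustrative numbers (not taken from the paper):

```python
# Gravimetric principle: oil consumption is the measured loss in oil mass
# over the run time. Numbers below are illustrative, not from the paper.
m_start_g = 1525.40     # oil mass before the run [g]
m_end_g = 1498.15       # oil mass after the run [g]
duration_min = 30.0     # run time [min]

rate_g_per_min = (m_start_g - m_end_g) / duration_min
print(round(rate_g_per_min, 3))  # 0.908
```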
2

Hawley, Robert W., and Wendy L. Garber. "Aperture weighting technique for video synthetic aperture radar." In SPIE Defense, Security, and Sensing. SPIE, 2011. http://dx.doi.org/10.1117/12.887648.

3

Fernandez, Charles, Arun Kr Dev, Rose Norman, Wai Lok Woo, and Shashi Bhushan Kumar. "Dynamic Positioning System: Systematic Weight Assignment for DP Sub-Systems Using Multi-Criteria Evaluation Technique Analytic Hierarchy Process and Validation Using DP-RI Tool With Deep Learning Algorithm." In ASME 2019 38th International Conference on Ocean, Offshore and Arctic Engineering. American Society of Mechanical Engineers, 2019. http://dx.doi.org/10.1115/omae2019-95485.

Abstract:
The Dynamic Positioning (DP) system of a vessel involves complex interactions between a large number of sub-systems. Each sub-system plays a unique role in the continuous overall DP function for safe and reliable operation of the vessel. Rating the significance of, or assigning weightings to, the DP sub-systems in different operating conditions is a complex task that requires input from many stakeholders. The weighting assignment is a critical step in determining the reliability of the DP system during complex marine and offshore operations, so an accurate weighting assignment is crucial: it in turn influences the operator's decision-making concerning DP system functionality. DP operators often prefer to rely on intuition when assigning the weightings; however, this introduces inherent uncertainty and inconsistency into the decision making. The systematic assignment of weightings requires a clear definition of criteria and objectives, along with data collection while the DP system operates continuously in different environmental conditions. The sub-systems of the overall DP system are characterized by multiple attributes, resulting in a high number of comparisons and thereby making the weighting distribution complicated. If the weighting distribution were performed by simplifying the attributes, excluding some of them from the decision, or otherwise compromising on the cognitive effort involved, the result could be inaccurate decision making. Multi-Criteria Decision Making (MCDM) methods have evolved over several decades and have been used in various applications within the maritime and oil and gas industries. DP, being a complex system, naturally lends itself to the implementation of MCDM techniques to assign weight distribution among its sub-systems. In this paper, the Analytic Hierarchy Process (AHP) methodology is used for weight assignment among the DP sub-systems.
An AHP model is effective in obtaining domain knowledge from numerous experts and representing knowledge-guided indexing. The approach involved examination of several criteria in terms of both quantitative and qualitative variables. A state-of-the-art advisory decision-making tool, the Dynamic Positioning Reliability Index (DP-RI), is used to validate the results from AHP. The weighting assignments from AHP are close to reality and are verified using the tool in real-life scenarios.
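The AHP weight-assignment step described above can be sketched as follows: a pairwise-comparison matrix is reduced to weights via its principal eigenvector, with Saaty's consistency ratio as a sanity check. The matrix below is a hypothetical 3x3 example; the paper's actual criteria and expert judgments differ.

```python
import numpy as np

# Hypothetical 3x3 pairwise-comparison matrix for three DP sub-systems on
# Saaty's 1-9 scale (reciprocal by construction); illustrative values only.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

# AHP weights: normalized principal eigenvector of the comparison matrix.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1); CR = CI / RI, with
# random index RI = 0.58 for n = 3. CR < 0.1 is conventionally acceptable.
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58
print(np.round(w, 3), round(cr, 3))
```

The eigenvector step is what makes the large number of pairwise comparisons tractable: each expert supplies only relative judgments, and the weights and a consistency measure fall out of the matrix.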
4

Al-Mubaid, Hisham, and Duong B. Nguyen. "New Feature Weighting Technique for Predicting Protein Subcellular Localization." In 2014 IEEE International Conference on Bioinformatics and Bioengineering (BIBE). IEEE, 2014. http://dx.doi.org/10.1109/bibe.2014.35.

5

Li, Yifeng, and George A. Lampropoulos. "A new adaptive band weighting technique for hydrocarbon detection." In 2009 16th International Conference on Digital Signal Processing (DSP). IEEE, 2009. http://dx.doi.org/10.1109/icdsp.2009.5201136.

6

Kim, Haesik. "Selective Mapping Technique Using Sliding Weighting Factor in Frequency Domain." In 2013 IEEE 77th Vehicular Technology Conference (VTC Spring). IEEE, 2013. http://dx.doi.org/10.1109/vtcspring.2013.6692747.

7

Puntheeranurak, Sutheera, and Pongpan Pitakpaisarnsin. "Time-aware Recommender System Using Naive Bayes Classifier Weighting Technique." In 2nd International Symposium on Computer, Communication, Control and Automation. Paris, France: Atlantis Press, 2013. http://dx.doi.org/10.2991/3ca-13.2013.66.

8

Tzeng, Huan-Wen. "The fuzzy decomposition technique for weighting analysis of skill assessment." In 2009 39th IEEE Frontiers in Education Conference (FIE). IEEE, 2009. http://dx.doi.org/10.1109/fie.2009.5350678.

9

Sen, Anindya, Hsiang-Hsin Hsiung, Maqbool Patel, Beth A. Schueler, James E. Holte, and Xiaoping Hu. "Exact technique for weighting function calculation in 3D cone-beam reconstruction." In Medical Imaging 1995, edited by Richard L. Van Metter and Jacob Beutel. SPIE, 1995. http://dx.doi.org/10.1117/12.208384.

10

Wang, Lutao, Jin Gang, and Wei Wang. "Adaptive Medical Ultrasound Imaging with Data Dependent Weighting Spatial Smoothing Technique." In 2018 3rd International Conference on Control, Automation and Artificial Intelligence (CAAI 2018). Paris, France: Atlantis Press, 2018. http://dx.doi.org/10.2991/caai-18.2018.8.


Reports on the topic "Weighting technique"

1

Dutra, Lauren M., Matthew C. Farrelly, Brian Bradfield, Jamie Ridenhour, and Jamie Guillory. Modeling the Probability of Fraud in Social Media in a National Cannabis Survey. RTI Press, September 2021. http://dx.doi.org/10.3768/rtipress.2021.mr.0046.2109.

Abstract:
Cannabis legalization has spread rapidly in the United States. Although national surveys provide robust information on the prevalence of cannabis use, cannabis disorders, and related outcomes, information on knowledge, attitudes, and beliefs (KABs) about cannabis is lacking. To inform the relationship between cannabis legalization and cannabis-related KABs, RTI International launched the National Cannabis Climate Survey (NCCS) in 2016. The survey sampled US residents 18 years or older via mail (n = 2,102), mail-to-web (n = 1,046), and two social media data collections (n = 11,957). This report outlines two techniques that we used to address challenges with the resulting data: (1) developing a model for detecting fraudulent cases among social media completes after standard fraud detection measures proved insufficient, and (2) designing a weighting scheme to pool multiple probability and nonprobability samples. We also describe our approach to validating the pooled dataset. The fraud prevention and detection processes, the predictive model of fraud, and the methods used to weight the probability and nonprobability samples can be applied to current and future complex data collections and to the analysis of existing datasets.
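As a generic illustration of pooling weighted samples (not necessarily the report's actual scheme), one simple approach rescales each sample's weights to a chosen share of an assumed population total before concatenating. All numbers below, including the share parameter, are hypothetical.

```python
import numpy as np

# Hypothetical weights: a probability sample (design-based weights) and a
# nonprobability sample (e.g., pseudo-weights from a propensity model).
w_prob = np.array([1.2, 0.8, 1.5, 1.0])
w_nonprob = np.array([1.0, 1.0, 1.0, 1.0, 1.0])

lam = 0.6    # share allotted to the probability sample (an analyst's choice)
N = 10000    # assumed population total the pooled weights should sum to

# Rescale each sample's weights to its share of N, then concatenate.
w_prob_pooled = w_prob / w_prob.sum() * lam * N
w_nonprob_pooled = w_nonprob / w_nonprob.sum() * (1 - lam) * N
pooled = np.concatenate([w_prob_pooled, w_nonprob_pooled])
print(int(round(pooled.sum())))  # 10000
```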