Academic literature on the topic 'Decision tree'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Decision tree.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Journal articles on the topic "Decision tree"

1

Tofan, Cezarina Adina. "Optimization Techniques of Decision Making - Decision Tree." Advances in Social Sciences Research Journal 1, no. 5 (September 30, 2014): 142–48. http://dx.doi.org/10.14738/assrj.15.437.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Naylor, Mike. "Decision Tree." Mathematics Teacher: Learning and Teaching PK-12 113, no. 7 (July 2020): 612. http://dx.doi.org/10.5951/mtlt.2020.0081.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Oo, Aung Nway, and Thin Naing. "Decision Tree Models for Medical Diagnosis." International Journal of Trend in Scientific Research and Development 3, no. 3 (April 30, 2019): 1697–99. http://dx.doi.org/10.31142/ijtsrd23510.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Breslow, Leonard A., and David W. Aha. "Simplifying decision trees: A survey." Knowledge Engineering Review 12, no. 1 (January 1997): 1–40. http://dx.doi.org/10.1017/s0269888997000015.

Full text
Abstract:
Induced decision trees are an extensively-researched solution to classification tasks. For many practical tasks, the trees produced by tree-generation algorithms are not comprehensible to users due to their size and complexity. Although many tree induction algorithms have been shown to produce simpler, more comprehensible trees (or data structures derived from trees) with good classification accuracy, tree simplification has usually been of secondary concern relative to accuracy, and no attempt has been made to survey the literature from the perspective of simplification. We present a framework that organizes the approaches to tree simplification and summarize and critique the approaches within this framework. The purpose of this survey is to provide researchers and practitioners with a concise overview of tree-simplification approaches and insight into their relative capabilities. In our final discussion, we briefly describe some empirical findings and discuss the application of tree induction algorithms to case retrieval in case-based reasoning systems.
APA, Harvard, Vancouver, ISO, and other styles
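As a concrete illustration of the post-pruning family this survey covers, here is a minimal sketch of reduced-error pruning in Python; the dict-based tree representation and the toy data are invented for illustration and are not taken from the paper:

```python
# Minimal reduced-error post-pruning sketch. Internal nodes are dicts with
# "feature", "threshold", "left", "right"; leaves are class labels.

def predict(node, x):
    """Route a sample down the tree until a leaf label is reached."""
    while isinstance(node, dict):
        node = node["left"] if x[node["feature"]] <= node["threshold"] else node["right"]
    return node

def accuracy(node, data):
    return sum(predict(node, x) == y for x, y in data) / len(data)

def majority_label(data):
    labels = [y for _, y in data]
    return max(set(labels), key=labels.count)

def prune(node, val_data):
    """Collapse a subtree to a leaf whenever that does not hurt validation accuracy."""
    if not isinstance(node, dict) or not val_data:
        return node
    f, t = node["feature"], node["threshold"]
    node["left"] = prune(node["left"], [(x, y) for x, y in val_data if x[f] <= t])
    node["right"] = prune(node["right"], [(x, y) for x, y in val_data if x[f] > t])
    leaf = majority_label(val_data)
    if accuracy(leaf, val_data) >= accuracy(node, val_data):
        return leaf  # the simpler tree is at least as accurate here
    return node

# Toy tree: a useful split on feature 0, plus a redundant split on feature 1.
tree = {"feature": 0, "threshold": 0.5,
        "left": {"feature": 1, "threshold": 0.5, "left": "A", "right": "A"},
        "right": "B"}
val = [((0.2, 0.1), "A"), ((0.3, 0.9), "A"), ((0.8, 0.4), "B")]
pruned = prune(tree, val)
print(pruned)  # {'feature': 0, 'threshold': 0.5, 'left': 'A', 'right': 'B'}
```

The survey's framework also covers pre-pruning and structure-changing simplifications; this sketch shows only the simplest validation-set variant.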
5

Zantema, Hans, and Hans L. Bodlaender. "Sizes of Ordered Decision Trees." International Journal of Foundations of Computer Science 13, no. 3 (June 2002): 445–58. http://dx.doi.org/10.1142/s0129054102001205.

Full text
Abstract:
Decision tables provide a natural framework for knowledge acquisition and representation in the area of knowledge based information systems. Decision trees provide a standard method for inductive inference in the area of machine learning. In this paper we show how decision tables can be considered as ordered decision trees: decision trees satisfying an ordering restriction on the nodes. Every decision tree can be represented by an equivalent ordered decision tree, but we show that doing so may exponentially blow up sizes, even if the choice of the order is left free. Our main result states that finding an ordered decision tree of minimal size that represents the same function as a given ordered decision tree is an NP-hard problem; in earlier work we obtained a similar result for unordered decision trees.
APA, Harvard, Vancouver, ISO, and other styles
6

Tofan, Cezarina Adina. "Method of decision tree applied in adopting the decision for promoting a company." Annals of "Spiru Haret". Economic Series 15, no. 3 (September 30, 2015): 47. http://dx.doi.org/10.26458/1535.

Full text
Abstract:
The decision can be defined as the way chosen from several possible to achieve an objective. An important role in the functioning of the decisional-informational system is held by the decision-making methods. Decision trees prove to be very useful tools for making financial or numerical decisions where a large amount of complex information must be considered. They provide an effective structure in which alternative decisions and the implications of their choice can be assessed, and help to form a correct and balanced view of the risks and rewards that may result from a certain choice. For these reasons, this communication reviews a series of decision-making criteria. It also analyses the benefits of using the decision tree method in the decision-making process by providing a numerical example. On this basis, it can be concluded that the procedure may prove useful in making decisions for companies operating on markets where competition intensity is differentiated.
APA, Harvard, Vancouver, ISO, and other styles
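The backward-induction ("rollback") computation behind a numerical decision-tree example of this kind can be sketched as follows; the scenario, payoffs, and probabilities are invented for illustration and are not the paper's example:

```python
# Decision nodes pick the alternative with the highest expected value; chance
# nodes average their outcomes by probability. All figures are made up.

def rollback(node):
    """Evaluate a decision tree by backward induction, returning its expected value."""
    kind = node[0]
    if kind == "payoff":
        return node[1]
    if kind == "chance":
        return sum(p * rollback(child) for p, child in node[1])
    if kind == "decision":
        return max(rollback(child) for _, child in node[1])

# Promote the company via an ad campaign (uncertain demand) or do nothing.
tree = ("decision", [
    ("campaign", ("chance", [
        (0.6, ("payoff", 120_000)),   # high demand
        (0.4, ("payoff", -30_000)),   # low demand
    ])),
    ("do nothing", ("payoff", 20_000)),
])
print(rollback(tree))  # campaign EMV = 0.6*120000 + 0.4*(-30000) = 60000
```

Under the expected-monetary-value criterion the campaign branch wins; other criteria the paper reviews (e.g. risk-averse ones) could pick differently on the same tree.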
7

Cockett, J. R. B. "Decision Expression Optimization." Fundamenta Informaticae 10, no. 1 (January 1, 1987): 93–114. http://dx.doi.org/10.3233/fi-1987-10107.

Full text
Abstract:
A basic concern when using decision trees for the solution of taxonomic or similar problems is their efficiency. Often the information that is required to completely optimize a tree is simply not available. This is especially the case when a criterion based on probabilities is used. It is shown how it is often possible, despite the absence of this information, to improve the design of the tree. The approach is based on algebraic methods for manipulating decision trees and the identification of some particularly desirable forms.
APA, Harvard, Vancouver, ISO, and other styles
8

Yun, Jooyeol, Jun won Seo, and Taeseon Yoon. "Fuzzy Decision Tree." International Journal of Fuzzy Logic Systems 4, no. 3 (July 31, 2014): 7–11. http://dx.doi.org/10.5121/ijfls.2014.4302.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Manwani, N., and P. S. Sastry. "Geometric Decision Tree." IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics) 42, no. 1 (February 2012): 181–92. http://dx.doi.org/10.1109/tsmcb.2011.2163392.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Zhou, Zhi-Hua, and Zhao-Qian Chen. "Hybrid decision tree." Knowledge-Based Systems 15, no. 8 (November 2002): 515–28. http://dx.doi.org/10.1016/s0950-7051(02)00038-2.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Dissertations / Theses on the topic "Decision tree"

1

Shi, Haijian. "Best-first Decision Tree Learning." The University of Waikato, 2007. http://hdl.handle.net/10289/2317.

Full text
Abstract:
In best-first top-down induction of decision trees, the best split is added in each step (e.g. the split that maximally reduces the Gini index). This is in contrast to the standard depth-first traversal of a tree. The resulting tree is the same; only the order in which it is built differs. The objective of this project is to investigate whether it is possible to determine an appropriate tree size on practical datasets by combining best-first decision tree growth with cross-validation-based selection of the number of expansions that are performed. Pre-pruning, post-pruning, and CART pruning can be performed this way and compared.
APA, Harvard, Vancouver, ISO, and other styles
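A minimal sketch of the best-first growth strategy the abstract describes, assuming a Gini-based split criterion and a fixed expansion budget; the tuple-based dataset format and all names are illustrative, not from the thesis:

```python
import heapq

def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(data):
    """Best (impurity_reduction, feature, threshold) over midpoint thresholds."""
    parent = gini([y for _, y in data])
    best = (0.0, None, None)
    for f in range(len(data[0][0])):
        values = sorted({x[f] for x, _ in data})
        for lo, hi in zip(values, values[1:]):
            t = (lo + hi) / 2
            left = [y for x, y in data if x[f] <= t]
            right = [y for x, y in data if x[f] > t]
            weighted = (len(left) * gini(left) + len(right) * gini(right)) / len(data)
            if parent - weighted > best[0]:
                best = (parent - weighted, f, t)
    return best

def best_first_tree(data, max_expansions):
    """Expand, at each step, the frontier leaf whose best split reduces Gini most."""
    root = {"data": data}
    frontier, counter = [], 0  # max-heap via negated reduction; counter breaks ties

    def push(leaf):
        nonlocal counter
        reduction, f, t = best_split(leaf["data"])
        if f is not None:
            heapq.heappush(frontier, (-reduction, counter, leaf, f, t))
            counter += 1

    push(root)
    for _ in range(max_expansions):
        if not frontier:
            break  # no leaf can be usefully split any further
        _, _, leaf, f, t = heapq.heappop(frontier)
        d = leaf.pop("data")
        leaf.update(feature=f, threshold=t,
                    left={"data": [(x, y) for x, y in d if x[f] <= t]},
                    right={"data": [(x, y) for x, y in d if x[f] > t]})
        push(leaf["left"])
        push(leaf["right"])
    return root

data = [((0,), "A"), ((1,), "A"), ((2,), "B"), ((3,), "B")]
tree = best_first_tree(data, max_expansions=1)
print(tree["feature"], tree["threshold"])  # 0 1.5
```

Capping `max_expansions` is exactly the knob that cross-validation would tune in the approach the abstract outlines: grow with a budget, score each budget on held-out folds, keep the best.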
2

Vella, Alan. "Hyper-heuristic decision tree induction." Thesis, Heriot-Watt University, 2012. http://hdl.handle.net/10399/2540.

Full text
Abstract:
A hyper-heuristic is any algorithm that searches or operates in the space of heuristics as opposed to the space of solutions. Hyper-heuristics are increasingly used in function and combinatorial optimization. Rather than attempt to solve a problem using a fixed heuristic, a hyper-heuristic approach attempts to find a combination of heuristics that solve a problem (and in turn may be directly suitable for a class of problem instances). Hyper-heuristics have been little explored in data mining. This work presents novel hyper-heuristic approaches to data mining, by searching a space of attribute selection criteria for a decision tree building algorithm. The search is conducted by a genetic algorithm. The result of the hyper-heuristic search in this case is a strategy for selecting attributes while building decision trees. Most hyper-heuristics work by trying to adapt the heuristic to the state of the problem being solved. Our hyper-heuristic is no different. It employs a strategy for adapting the heuristic used to build decision tree nodes according to some set of features of the training set it is working on. We introduce, explore and evaluate five different ways in which this problem state can be represented for a hyper-heuristic that operates within a decision-tree building algorithm. In each case, the hyper-heuristic is guided by a rule set that tries to map features of the data set to be split by the decision tree building algorithm to a heuristic to be used for splitting the same data set. We also explore and evaluate three different sets of low-level heuristics that could be employed by such a hyper-heuristic. This work also makes a distinction between specialist hyper-heuristics and generalist hyper-heuristics. The main difference between these two hyper-heuristics is the number of training sets used by the hyper-heuristic genetic algorithm.
Specialist hyper-heuristics are created using a single data set from a particular domain for evolving the hyper-heuristic rule set. Such algorithms are expected to outperform standard algorithms on the kind of data set used by the hyper-heuristic genetic algorithm. Generalist hyper-heuristics are trained on multiple data sets from different domains and are expected to deliver a robust and competitive performance over these data sets when compared to standard algorithms. We evaluate both approaches for each kind of hyper-heuristic presented in this thesis. We use both real data sets as well as synthetic data sets. Our results suggest that none of the hyper-heuristics presented in this work are suited for specialization – in most cases, the hyper-heuristic's performance on the data set it was specialized for was not significantly better than that of the best performing standard algorithm. On the other hand, the generalist hyper-heuristics delivered results that were very competitive with the best standard methods. In some cases we even achieved a significantly better overall performance than all of the standard methods.
APA, Harvard, Vancouver, ISO, and other styles
3

Vukobratović, Bogdan. "Hardware Acceleration of Nonincremental Algorithms for the Induction of Decision Trees and Decision Tree Ensembles." PhD thesis, Univerzitet u Novom Sadu, Fakultet tehničkih nauka u Novom Sadu, 2017. https://www.cris.uns.ac.rs/record.jsf?recordId=102520&source=NDLTD&language=en.

Full text
Abstract:
The thesis proposes novel full decision tree and decision tree ensemble induction algorithms EFTI and EEFTI, and various possibilities for their implementations are explored. The experiments show that the proposed EFTI algorithm is able to infer much smaller DTs on average, without significant loss in accuracy, when compared to the top-down incremental DT inducers. On the other hand, when compared to other full tree induction algorithms, it was able to produce more accurate DTs, with similar sizes, in shorter times. Also, the hardware architectures for acceleration of these algorithms (EFTIP and EEFTIP) are proposed and it is shown in experiments that they can offer substantial speedups.
APA, Harvard, Vancouver, ISO, and other styles
4

Qureshi, Taimur. "Contributions to decision tree based learning." Thesis, Lyon 2, 2010. http://www.theses.fr/2010LYO20051/document.

Full text
Abstract:
Advances in data collection methods, storage and processing technology are providing a unique challenge and opportunity for automated data learning techniques which aim at producing high-level information, or models, from data. A typical knowledge discovery process consists of data selection, data preparation, data transformation, data mining and interpretation/validation of the results. Thus, we develop automatic learning techniques which contribute to the data preparation, transformation and mining tasks of knowledge discovery. In doing so, we try to improve the prediction accuracy of the overall learning process. Our work focuses on decision tree based learning and thus, we introduce various preprocessing and transformation techniques such as discretization, fuzzy partitioning and dimensionality reduction to improve this type of learning. However, these techniques can be used in other learning methods, e.g. discretization can also be used for naive Bayes classifiers. The data preparation step represents almost 80 percent of the problem and is both time consuming and critical for the quality of modeling. Discretization of continuous features is an important problem that has effects on accuracy, complexity, variance and understandability of the induction models. In this thesis, we propose and develop resampling-based aggregation techniques that improve the quality of discretization. Later, we validate by comparing with other discretization techniques and with an optimal partitioning method on 10 benchmark data sets. The second part of our thesis concerns automatic fuzzy partitioning for soft decision tree induction. A soft or fuzzy decision tree is an extension of classical crisp tree induction such that fuzzy logic is embedded into the induction process, with the effect of more accurate models and reduced variance, while still interpretable and autonomous. We modify the above resampling-based partitioning method to generate fuzzy partitions.
In addition, we propose, develop and validate another fuzzy partitioning method that improves the accuracy of the decision tree. Finally, we adopt a topological learning scheme and perform non-linear dimensionality reduction. We modify an existing manifold-learning-based technique and see whether it can enhance the predictive power and interpretability of classification.
APA, Harvard, Vancouver, ISO, and other styles
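As a baseline for the discretization problem discussed above, here is a sketch of simple equal-frequency discretization; the thesis's resampling-based aggregation techniques are more elaborate, and this only illustrates the kind of preprocessing being improved:

```python
def equal_frequency_bins(values, n_bins):
    """Cut points that put roughly the same number of samples in each bin."""
    ordered = sorted(values)
    n = len(ordered)
    cuts = []
    for k in range(1, n_bins):
        i = k * n // n_bins
        cuts.append((ordered[i - 1] + ordered[i]) / 2)  # midpoint between neighbours
    return cuts

def discretize(value, cuts):
    """Index of the bin that `value` falls into."""
    return sum(value > c for c in cuts)

# Three natural clusters; three bins recover them.
values = [0.1, 0.2, 0.3, 1.1, 1.2, 1.3, 5.0, 5.1, 5.2]
cuts = equal_frequency_bins(values, 3)   # two cut points, near 0.7 and 3.15
bins = [discretize(v, cuts) for v in values]
print(bins)  # [0, 0, 0, 1, 1, 1, 2, 2, 2]
```

Supervised discretizers (and the resampling schemes the thesis studies) instead place cut points using the class labels, which is what makes the choice of cuts affect tree accuracy and variance.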
5

Ardeshir, G. "Decision tree simplification for classifier ensembles." Thesis, University of Surrey, 2002. http://epubs.surrey.ac.uk/843022/.

Full text
Abstract:
Design of ensemble classifiers involves three factors: 1) a learning algorithm to produce a classifier (base classifier), 2) an ensemble method to generate diverse classifiers, and 3) a combining method to combine decisions made by base classifiers. With regard to the first factor, a good choice for constructing a classifier is a decision tree learning algorithm. However, a possible problem with this learning algorithm is its complexity, which has only been addressed previously in the context of pruning methods for individual trees. Furthermore, the ensemble method may require the learning algorithm to produce a complex classifier. Considering the fact that the performance of simplification methods as well as ensemble methods changes from one domain to another, our main contribution is to address a simplification method (post-pruning) in the context of ensemble methods including Bagging, Boosting and Error-Correcting Output Code (ECOC). Using a statistical test, the performance of ensembles made by Bagging, Boosting and ECOC, as well as five pruning methods in the context of ensembles, is compared. In addition to the implementation, a supporting theory called Margin is discussed, and the relationship of pruning to bias and variance is explained. For ECOC, the effect of parameters such as code length and size of the training set on the performance of pruning methods is also studied. Decomposition methods such as ECOC are considered as a solution to reduce the complexity of multi-class problems in many real problems such as face recognition. Focusing on the decomposition methods, AdaBoost.OC, which is a combination of Boosting and ECOC, is compared with the pseudo-loss based version of Boosting, AdaBoost.M2. In addition, the influence of pruning on the performance of ensembles is studied. Motivated by the result that both pruned and unpruned ensembles made by AdaBoost.OC have similar accuracy, pruned ensembles are compared with ensembles of single-node decision trees.
This results in the hypothesis that ensembles of simple classifiers may give better performance, as shown for AdaBoost.OC on the identification problem in face recognition. The implication is that in some problems, to achieve the best accuracy of an ensemble, it is necessary to select the base classifier complexity.
APA, Harvard, Vancouver, ISO, and other styles
6

Ahmad, Amir. "Data Transformation for Decision Tree Ensembles." Thesis, University of Manchester, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.508528.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Cai, Jingfeng. "Decision Tree Pruning Using Expert Knowledge." University of Akron / OhioLINK, 2006. http://rave.ohiolink.edu/etdc/view?acc_num=akron1158279616.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Wu, Shuning. "Optimal instance selection for improved decision tree." [Ames, Iowa : Iowa State University], 2007.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Sinnamon, Roslyn M. "Binary decision diagrams for fault tree analysis." Thesis, Loughborough University, 1996. https://dspace.lboro.ac.uk/2134/7424.

Full text
Abstract:
This thesis develops a new approach to fault tree analysis, namely the Binary Decision Diagram (BDD) method. Conventional qualitative fault tree analysis techniques such as the "top-down" or "bottom-up" approaches are now so well developed that further refinement is unlikely to result in vast improvements in terms of their computational capability. The BDD method has exhibited potential gains to be made in terms of speed and efficiency in determining the minimal cut sets. Further, the nature of the binary decision diagram is such that it is more suited to Boolean manipulation. The BDD method has been programmed and successfully applied to a number of benchmark fault trees. The analysis capabilities of the technique have been extended such that all quantitative fault tree top event parameters, which can be determined by conventional Kinetic Tree Theory, can now be derived directly from the BDD. Parameters such as the top event probability, frequency of occurrence and expected number of occurrences can be calculated exactly using this method, removing the need for the approximations previously required. Thus the BDD method is proven to have advantages in terms of both accuracy and efficiency. Initiator/enabler event analysis and importance measures have been incorporated to extend this method into a full analysis procedure.
APA, Harvard, Vancouver, ISO, and other styles
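The exact quantification the abstract mentions rests on Shannon decomposition over the BDD. A minimal sketch, assuming independent basic events and an illustrative node encoding (real implementations also memoize shared subgraphs rather than recursing naively):

```python
def top_event_probability(node, prob):
    """Exact top-event probability via Shannon decomposition over a BDD.

    A node is either a terminal (0 or 1) or a tuple (var, low, high), where
    `low` is the branch for var = 0 (event absent) and `high` for var = 1
    (event occurred). `prob` maps each basic event to its failure probability.
    """
    if node in (0, 1):
        return float(node)
    var, low, high = node
    p = prob[var]
    return (1 - p) * top_event_probability(low, prob) + p * top_event_probability(high, prob)

# Fault tree TOP = A AND (B OR C), encoded as a BDD with variable order A < B < C.
bdd = ("A", 0, ("B", ("C", 0, 1), 1))
probs = {"A": 0.1, "B": 0.2, "C": 0.3}
print(top_event_probability(bdd, probs))  # P(A) * (P(B) + (1 - P(B)) * P(C)) ≈ 0.044
```

Because the two branches at each node are disjoint, this sum is exact, which is why the BDD method needs none of the rare-event approximations used with minimal cut sets.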
10

Ho, Colin Kok Meng. "Discretization and defragmentation for decision tree learning." Thesis, University of Essex, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.299072.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Decision tree"

1

Gladwin, Christina. Ethnographic decision tree modeling. Newbury Park: Sage, 1989.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Gladwin, Christina H. Ethnographic decision tree modeling. Newbury Park: Sage, 1989.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Euler, Bryan L. EDDT: Emotional Disturbance Decision Tree. Lutz, FL: Psychological Assessment Resources, 2007.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

Friedman, Ken. The decision tree: A novel. Rainier, Wash.: Heart Pub., 1996.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Grąbczewski, Krzysztof. Meta-Learning in Decision Tree Induction. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-00960-5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Association, American Bankers. Analyzing financial statements: A decision tree approach. Washington, D.C: American Bankers Association, 2013.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Barros, Rodrigo C., André C. P. L. F. de Carvalho, and Alex A. Freitas. Automatic Design of Decision-Tree Induction Algorithms. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-14231-9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Landgrebe, David, and United States National Aeronautics and Space Administration, eds. A survey of decision tree classifier methodology. West Lafayette, Ind.: School of Electrical Engineering, Purdue University, 1990.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Medical conditions and massage therapy: A decision tree approach. Philadelphia: Wolters Kluwer/Lippincott Williams & Wilkins Health, 2010.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
10

National Flood Proofing Committee (U.S.), ed. Flood proofing: How to evaluate your options : decision tree. [Fort Belvoir, Va.?]: US Army Corps of Engineers, National Flood Proofing Committee, 1995.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Decision tree"

1

Ayyadevara, V. Kishore. "Decision Tree." In Pro Machine Learning Algorithms, 71–103. Berkeley, CA: Apress, 2018. http://dx.doi.org/10.1007/978-1-4842-3564-5_4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Nahler, Gerhard. "decision tree." In Dictionary of Pharmaceutical Medicine, 48. Vienna: Springer Vienna, 2009. http://dx.doi.org/10.1007/978-3-211-89836-9_366.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Webb, Geoffrey I., Johannes Fürnkranz, Geoffrey Hinton, Claude Sammut, Joerg Sander, et al. "Decision Tree." In Encyclopedia of Machine Learning, 263–67. Boston, MA: Springer US, 2011. http://dx.doi.org/10.1007/978-0-387-30164-8_204.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Berrar, Daniel, and Werner Dubitzky. "Decision Tree." In Encyclopedia of Systems Biology, 551–55. New York, NY: Springer New York, 2013. http://dx.doi.org/10.1007/978-1-4419-9863-7_611.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Panda, Rajendra Mohan, and B. S. Daya Sagar. "Decision Tree." In Encyclopedia of Mathematical Geosciences, 1–7. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-26050-7_81-2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Panda, Rajendra Mohan, and B. S. Daya Sagar. "Decision Tree." In Encyclopedia of Mathematical Geosciences, 1–6. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-26050-7_81-1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Jo, Taeho. "Decision Tree." In Machine Learning Foundations, 141–65. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-65900-4_7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Fürnkranz, Johannes. "Decision Tree." In Encyclopedia of Machine Learning and Data Mining, 1–5. Boston, MA: Springer US, 2016. http://dx.doi.org/10.1007/978-1-4899-7502-7_66-1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Fürnkranz, Johannes. "Decision Tree." In Encyclopedia of Machine Learning and Data Mining, 330–35. Boston, MA: Springer US, 2017. http://dx.doi.org/10.1007/978-1-4899-7687-1_66.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Bandyopadhyay, Susmita. "Decision Tree." In Decision Support System, 7–22. Boca Raton: CRC Press, 2023. http://dx.doi.org/10.1201/9781003307655-2.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Decision tree"

1

Yawata, Koichiro, Yoshihiro Osakabe, Takuya Okuyama, and Akinori Asahara. "QUBO Decision Tree: Annealing Machine Extends Decision Tree Splitting." In 2022 IEEE International Conference on Knowledge Graph (ICKG). IEEE, 2022. http://dx.doi.org/10.1109/ickg55886.2022.00052.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Desai, Ankit, and Sanjay Chaudhary. "Distributed Decision Tree." In ACM COMPUTE '16: Ninth Annual ACM India Conference. New York, NY, USA: ACM, 2016. http://dx.doi.org/10.1145/2998476.2998478.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Gavankar, Sachin S., and Sudhirkumar D. Sawarkar. "Eager decision tree." In 2017 2nd International Conference for Convergence in Technology (I2CT). IEEE, 2017. http://dx.doi.org/10.1109/i2ct.2017.8226246.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Nowozin, Sebastian, Carsten Rother, Shai Bagon, Toby Sharp, Bangpeng Yao, and Pushmeet Kohli. "Decision tree fields." In 2011 IEEE International Conference on Computer Vision (ICCV). IEEE, 2011. http://dx.doi.org/10.1109/iccv.2011.6126429.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Meng, Qing-wu, Qiang He, Ning Li, Xiang-ran Du, and Li-na Su. "Crisp Decision Tree Induction Based on Fuzzy Decision Tree Algorithm." In 2009 First International Conference on Information Science and Engineering. IEEE, 2009. http://dx.doi.org/10.1109/icise.2009.440.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Huang, Sieh-Chuen, Hsuan-Lei Shao, and Robert B. Leflar. "Applying decision tree analysis to family court decisions." In ICAIL '21: Eighteenth International Conference for Artificial Intelligence and Law. New York, NY, USA: ACM, 2021. http://dx.doi.org/10.1145/3462757.3466076.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Couvreur, Jean-Michel, and Duy-Tung Nguyen. "Tree Data Decision Diagrams." In Second International Workshop on Verification and Evaluation of Computer and Communication Systems (VECoS 2008). BCS Learning & Development, 2008. http://dx.doi.org/10.14236/ewic/vecos2008.3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Ali, Mohd Mahmood, M. S. Qaseem, Lakshmi Rajamani, and A. Govardhan. "Improved decision tree induction." In the Second International Conference. New York, New York, USA: ACM Press, 2012. http://dx.doi.org/10.1145/2393216.2393346.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Jia, Wenguang, and LiJing Huang. "Improved C4.5 Decision Tree." In 2010 International Conference on Internet Technology and Applications (iTAP 2010). IEEE, 2010. http://dx.doi.org/10.1109/itapp.2010.5566133.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Manapragada, Chaitanya, Geoffrey I. Webb, and Mahsa Salehi. "Extremely Fast Decision Tree." In KDD '18: The 24th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York, NY, USA: ACM, 2018. http://dx.doi.org/10.1145/3219819.3220005.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Decision tree"

1

Hamilton, Jill, and Tuan Nguyen. Asbestos Inspection/Reinspection Decision Tree. Fort Belvoir, VA: Defense Technical Information Center, October 1999. http://dx.doi.org/10.21236/ada370454.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Narlikar, Girija J. A Parallel, Multithreaded Decision Tree Builder. Fort Belvoir, VA: Defense Technical Information Center, December 1998. http://dx.doi.org/10.21236/ada363531.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Quiller, Ryan. Decision Tree Technique for Particle Identification. Office of Scientific and Technical Information (OSTI), September 2003. http://dx.doi.org/10.2172/815649.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Mughal, Mohamed. Biological Weapons Response Template and Decision Tree. Fort Belvoir, VA: Defense Technical Information Center, April 2001. http://dx.doi.org/10.21236/ada385897.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Dakin, Gordon, and Sankar Virdhagriswaran. Misleading Information Detection Through Probabilistic Decision Tree Classifiers. Fort Belvoir, VA: Defense Technical Information Center, September 2002. http://dx.doi.org/10.21236/ada406823.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Kwon, Theresa Hyunjin, Erin Cho, and Youn-Kyung Kim. Identifying Sustainable Style Consumers with Decision Tree Predictive Model. Ames: Iowa State University, Digital Repository, November 2016. http://dx.doi.org/10.31274/itaa_proceedings-180814-1366.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Eccleston, C. H. The decision - identification tree: A new EIS scoping tool. Office of Scientific and Technical Information (OSTI), April 1997. http://dx.doi.org/10.2172/16876.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Mikulski, Dariusz G. Rough Set Based Splitting Criterion for Binary Decision Tree Classifiers. Fort Belvoir, VA: Defense Technical Information Center, September 2006. http://dx.doi.org/10.21236/ada489077.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Song, So Young, Erin Cho, Youn-Kyung Kim, and Theresa Hyunjin Kwon. Clothing Communication via Social Media: A Decision Tree Predictive Model. Ames: Iowa State University, Digital Repository, November 2015. http://dx.doi.org/10.31274/itaa_proceedings-180814-102.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Zaman, Md Mostafa, Theresa Hyunjin Kwon, Katrina Laemmerhirt, and Youn-Kyung Kim. Profiling Second-hand Clothing Shoppers with Decision Tree Predictive Model. Ames: Iowa State University, Digital Repository, 2017. http://dx.doi.org/10.31274/itaa_proceedings-180814-407.

Full text
APA, Harvard, Vancouver, ISO, and other styles