Academic literature on the topic "Decision tree"

Create a precise citation in APA, MLA, Chicago, Harvard, and other styles


Consult the thematic lists of articles, books, theses, conference papers, and other academic sources on the topic "Decision tree".

Next to each source in the reference list there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Decision tree"

1. Tofan, Cezarina Adina. "Optimization Techniques of Decision Making - Decision Tree". Advances in Social Sciences Research Journal 1, no. 5 (September 30, 2014): 142–48. http://dx.doi.org/10.14738/assrj.15.437.
2. Naylor, Mike. "Decision Tree". Mathematics Teacher: Learning and Teaching PK-12 113, no. 7 (July 2020): 612. http://dx.doi.org/10.5951/mtlt.2020.0081.
3. Oo, Aung Nway and Thin Naing. "Decision Tree Models for Medical Diagnosis". International Journal of Trend in Scientific Research and Development 3, no. 3 (April 30, 2019): 1697–99. http://dx.doi.org/10.31142/ijtsrd23510.
4. Breslow, Leonard A. and David W. Aha. "Simplifying decision trees: A survey". Knowledge Engineering Review 12, no. 1 (January 1997): 1–40. http://dx.doi.org/10.1017/s0269888997000015.

Abstract:
Induced decision trees are an extensively-researched solution to classification tasks. For many practical tasks, the trees produced by tree-generation algorithms are not comprehensible to users due to their size and complexity. Although many tree induction algorithms have been shown to produce simpler, more comprehensible trees (or data structures derived from trees) with good classification accuracy, tree simplification has usually been of secondary concern relative to accuracy, and no attempt has been made to survey the literature from the perspective of simplification. We present a framework that organizes the approaches to tree simplification and summarize and critique the approaches within this framework. The purpose of this survey is to provide researchers and practitioners with a concise overview of tree-simplification approaches and insight into their relative capabilities. In our final discussion, we briefly describe some empirical findings and discuss the application of tree induction algorithms to case retrieval in case-based reasoning systems.
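One family of approaches covered by such surveys, post-pruning, can be sketched briefly: reduced-error pruning replaces a subtree by a leaf whenever the leaf classifies a held-out pruning set at least as well. The tree encoding, feature indices, and data below are hypothetical illustrations, not taken from the survey.

```python
def classify(tree, x):
    """Walk the tree: internal nodes test a feature index against a threshold."""
    if not isinstance(tree, dict):
        return tree  # leaf: a class label
    branch = "left" if x[tree["feature"]] <= tree["threshold"] else "right"
    return classify(tree[branch], x)

def majority_label(examples):
    labels = [y for _, y in examples]
    return max(set(labels), key=labels.count)

def accuracy(tree, examples):
    return sum(classify(tree, x) == y for x, y in examples) / len(examples)

def prune(tree, examples):
    """Bottom-up reduced-error pruning against a pruning set."""
    if not isinstance(tree, dict) or not examples:
        return tree
    left = [(x, y) for x, y in examples if x[tree["feature"]] <= tree["threshold"]]
    right = [(x, y) for x, y in examples if x[tree["feature"]] > tree["threshold"]]
    tree["left"] = prune(tree["left"], left)
    tree["right"] = prune(tree["right"], right)
    leaf = majority_label(examples)
    # Collapse the subtree if a single leaf does at least as well on the pruning set.
    if accuracy(leaf, examples) >= accuracy(tree, examples):
        return leaf
    return tree

tree = {"feature": 0, "threshold": 5,
        "left": {"feature": 1, "threshold": 2, "left": "A", "right": "B"},
        "right": "B"}
pruning_set = [((3, 1), "A"), ((4, 3), "A"), ((7, 0), "B")]
pruned = prune(tree, pruning_set)  # the left subtree collapses to the leaf "A"
```

Simplification here trades a little training-set fit for a smaller, more comprehensible tree, which is exactly the tension the survey organizes.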
5. Zantema, Hans and Hans L. Bodlaender. "Sizes of Ordered Decision Trees". International Journal of Foundations of Computer Science 13, no. 3 (June 2002): 445–58. http://dx.doi.org/10.1142/s0129054102001205.

Abstract:
Decision tables provide a natural framework for knowledge acquisition and representation in the area of knowledge based information systems. Decision trees provide a standard method for inductive inference in the area of machine learning. In this paper we show how decision tables can be considered as ordered decision trees: decision trees satisfying an ordering restriction on the nodes. Every decision tree can be represented by an equivalent ordered decision tree, but we show that doing so may exponentially blow up sizes, even if the choice of the order is left free. Our main result states that finding an ordered decision tree of minimal size that represents the same function as a given ordered decision tree is an NP-hard problem; in earlier work we obtained a similar result for unordered decision trees.
6. Tofan, Cezarina Adina. "Method of decision tree applied in adopting the decision for promoting a company". Annals of "Spiru Haret". Economic Series 15, no. 3 (September 30, 2015): 47. http://dx.doi.org/10.26458/1535.

Abstract:
A decision can be defined as the way chosen from several possible ones to achieve an objective. An important role in the functioning of the decisional-informational system is held by the decision-making methods. Decision trees prove to be very useful tools for taking financial or numerical decisions, where a large amount of complex information must be considered. They provide an effective structure in which alternative decisions and the implications of their choice can be assessed, and they help to form a correct and balanced view of the risks and rewards that may result from a certain choice. For these reasons, this communication reviews a series of decision-making criteria. It also analyses the benefits of using the decision tree method in the decision-making process by providing a numerical example. On this basis, it can be concluded that the procedure may prove useful in making decisions for companies operating on markets where the intensity of competition is differentiated.
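The expected-value criterion that underlies such decision-tree analyses can be illustrated with a tiny sketch. The alternatives and figures below are invented for illustration; they are not the paper's numerical example.

```python
def expected_value(outcomes):
    """Probability-weighted payoff of one decision alternative.

    outcomes: list of (probability, payoff) pairs at the chance node."""
    return sum(p * payoff for p, payoff in outcomes)

# Hypothetical alternatives for a promotion decision.
alternatives = {
    "aggressive campaign": [(0.6, 120_000), (0.4, -30_000)],  # expected value 60,000
    "moderate campaign":   [(0.7, 60_000), (0.3, 10_000)],    # expected value 45,000
    "no campaign":         [(1.0, 20_000)],                   # expected value 20,000
}

# The decision branch with the highest expected value is chosen.
best = max(alternatives, key=lambda a: expected_value(alternatives[a]))
```

Other criteria the paper's family of methods compares (maximin, minimax regret) would rank the same chance nodes differently; only the scoring function changes.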
7. Cockett, J. R. B. "Decision Expression Optimization". Fundamenta Informaticae 10, no. 1 (January 1, 1987): 93–114. http://dx.doi.org/10.3233/fi-1987-10107.

Abstract:
A basic concern when using decision trees for the solution of taxonomic or similar problems is their efficiency. Often the information that is required to completely optimize a tree is simply not available. This is especially the case when a criterion based on probabilities is used. It is shown how it is often possible, despite the absence of this information, to improve the design of the tree. The approach is based on algebraic methods for manipulating decision trees and the identification of some particularly desirable forms.
8. Yun, Jooyeol, Jun won Seo and Taeseon Yoon. "Fuzzy Decision Tree". International Journal of Fuzzy Logic Systems 4, no. 3 (July 31, 2014): 7–11. http://dx.doi.org/10.5121/ijfls.2014.4302.
9. Manwani, N. and P. S. Sastry. "Geometric Decision Tree". IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics) 42, no. 1 (February 2012): 181–92. http://dx.doi.org/10.1109/tsmcb.2011.2163392.
10. Zhou, Zhi-Hua and Zhao-Qian Chen. "Hybrid decision tree". Knowledge-Based Systems 15, no. 8 (November 2002): 515–28. http://dx.doi.org/10.1016/s0950-7051(02)00038-2.

Theses on the topic "Decision tree"

1. Shi, Haijian. "Best-first Decision Tree Learning". The University of Waikato, 2007. http://hdl.handle.net/10289/2317.

Abstract:
In best-first top-down induction of decision trees, the best split is added in each step (e.g. the split that maximally reduces the Gini index). This is in contrast to the standard depth-first traversal of a tree. The resulting tree is the same; only the order in which it is built differs. The objective of this project is to investigate whether it is possible to determine an appropriate tree size on practical datasets by combining best-first decision tree growth with cross-validation-based selection of the number of expansions that are performed. Pre-pruning, post-pruning and CART pruning can be performed this way and compared.
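The best-first growth loop described above can be sketched minimally with Gini reduction as the split criterion. The data structures and toy dataset here are hypothetical illustrations, not the thesis's implementation.

```python
import heapq

def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(data):
    """Best (gain, feature, threshold, left, right) over all binary splits."""
    base = gini([y for _, y in data])
    best = None
    for f in range(len(data[0][0])):
        for t in sorted({x[f] for x, _ in data}):
            left = [(x, y) for x, y in data if x[f] <= t]
            right = [(x, y) for x, y in data if x[f] > t]
            if not left or not right:
                continue
            w = len(left) / len(data)
            gain = (base - w * gini([y for _, y in left])
                    - (1 - w) * gini([y for _, y in right]))
            if best is None or gain > best[0]:
                best = (gain, f, t, left, right)
    return best

def best_first_tree(data, max_expansions):
    """Repeatedly expand the frontier node whose split reduces Gini the most."""
    root = {"data": data}
    frontier, counter = [], 0  # max-heap via negated gain; counter breaks ties

    def push(node):
        nonlocal counter
        s = best_split(node["data"])
        if s and s[0] > 1e-12:  # only impure, splittable nodes join the frontier
            heapq.heappush(frontier, (-s[0], counter, node, s))
            counter += 1

    push(root)
    for _ in range(max_expansions):
        if not frontier:
            break
        _, _, node, (gain, f, t, left, right) = heapq.heappop(frontier)
        node.update(feature=f, threshold=t,
                    left={"data": left}, right={"data": right})
        push(node["left"])
        push(node["right"])
    return root
```

Selecting `max_expansions` by cross-validation would then realize the pre-pruning idea the abstract describes: the number of best-first expansions directly controls tree size.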
2. Vella, Alan. "Hyper-heuristic decision tree induction". Thesis, Heriot-Watt University, 2012. http://hdl.handle.net/10399/2540.

Abstract:
A hyper-heuristic is any algorithm that searches or operates in the space of heuristics as opposed to the space of solutions. Hyper-heuristics are increasingly used in function and combinatorial optimization. Rather than attempt to solve a problem using a fixed heuristic, a hyper-heuristic approach attempts to find a combination of heuristics that solve a problem (and in turn may be directly suitable for a class of problem instances). Hyper-heuristics have been little explored in data mining. This work presents novel hyper-heuristic approaches to data mining, by searching a space of attribute selection criteria for a decision tree building algorithm. The search is conducted by a genetic algorithm. The result of the hyper-heuristic search in this case is a strategy for selecting attributes while building decision trees. Most hyper-heuristics work by trying to adapt the heuristic to the state of the problem being solved. Our hyper-heuristic is no different. It employs a strategy for adapting the heuristic used to build decision tree nodes according to some set of features of the training set it is working on. We introduce, explore and evaluate five different ways in which this problem state can be represented for a hyper-heuristic that operates within a decision tree building algorithm. In each case, the hyper-heuristic is guided by a rule set that tries to map features of the data set to be split by the decision tree building algorithm to a heuristic to be used for splitting the same data set. We also explore and evaluate three different sets of low-level heuristics that could be employed by such a hyper-heuristic. This work also makes a distinction between specialist hyper-heuristics and generalist hyper-heuristics. The main difference between these two hyper-heuristics is the number of training sets used by the hyper-heuristic genetic algorithm. Specialist hyper-heuristics are created using a single data set from a particular domain for evolving the hyper-heuristic rule set. Such algorithms are expected to outperform standard algorithms on the kind of data set used by the hyper-heuristic genetic algorithm. Generalist hyper-heuristics are trained on multiple data sets from different domains and are expected to deliver a robust and competitive performance over these data sets when compared to standard algorithms. We evaluate both approaches for each kind of hyper-heuristic presented in this thesis. We use both real data sets as well as synthetic data sets. Our results suggest that none of the hyper-heuristics presented in this work are suited for specialization: in most cases, the hyper-heuristic's performance on the data set it was specialized for was not significantly better than that of the best performing standard algorithm. On the other hand, the generalist hyper-heuristics delivered results that were very competitive with the best standard methods. In some cases we even achieved a significantly better overall performance than all of the standard methods.
3. Vukobratović, Bogdan. "Hardware Acceleration of Nonincremental Algorithms for the Induction of Decision Trees and Decision Tree Ensembles". PhD thesis, Univerzitet u Novom Sadu, Fakultet tehničkih nauka u Novom Sadu, 2017. https://www.cris.uns.ac.rs/record.jsf?recordId=102520&source=NDLTD&language=en.

Abstract:
The thesis proposes novel full decision tree and decision tree ensemble induction algorithms EFTI and EEFTI, and various possibilities for their implementations are explored. The experiments show that the proposed EFTI algorithm is able to infer much smaller DTs on average, without significant loss in accuracy, when compared to the top-down incremental DT inducers. On the other hand, when compared to other full tree induction algorithms, it was able to produce more accurate DTs, with similar sizes, in shorter times. Also, the hardware architectures for acceleration of these algorithms (EFTIP and EEFTIP) are proposed and it is shown in experiments that they can offer substantial speedups.
4. Qureshi, Taimur. "Contributions to decision tree based learning". Thesis, Lyon 2, 2010. http://www.theses.fr/2010LYO20051/document.

Abstract:
Advances in data collection methods, storage and processing technology are providing a unique challenge and opportunity for automated data learning techniques which aim at producing high-level information, or models, from data. A typical knowledge discovery process consists of data selection, data preparation, data transformation, data mining and interpretation/validation of the results. Thus, we develop automatic learning techniques which contribute to the data preparation, transformation and mining tasks of knowledge discovery. In doing so, we try to improve the prediction accuracy of the overall learning process. Our work focuses on decision tree based learning and thus, we introduce various preprocessing and transformation techniques such as discretization, fuzzy partitioning and dimensionality reduction to improve this type of learning. However, these techniques can be used in other learning methods; e.g. discretization can also be used for naive-Bayes classifiers. The data preparation step represents almost 80 percent of the problem and is both time consuming and critical for the quality of modeling. Discretization of continuous features is an important problem that affects the accuracy, complexity, variance and understandability of the induction models. In this thesis, we propose and develop resampling based aggregation techniques that improve the quality of discretization. We then validate them by comparing with other discretization techniques and with an optimal partitioning method on 10 benchmark data sets. The second part of our thesis concerns automatic fuzzy partitioning for soft decision tree induction. A soft or fuzzy decision tree is an extension of classical crisp tree induction in which fuzzy logic is embedded into the induction process, yielding more accurate models and reduced variance while remaining interpretable and autonomous. We modify the above resampling based partitioning method to generate fuzzy partitions. In addition, we propose, develop and validate another fuzzy partitioning method that improves the accuracy of the decision tree. Finally, we adopt a topological learning scheme and perform non-linear dimensionality reduction. We modify an existing manifold learning based technique and examine whether it can enhance the predictive power and interpretability of classification.
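The discretization step discussed above can be sketched with a simple unsupervised baseline, equal-frequency binning; the thesis's resampling-based methods refine such partitions, and this snippet and its data are only an illustration.

```python
def equal_frequency_cuts(values, k):
    """Cut points splitting the sorted values into k bins of roughly equal count."""
    ordered = sorted(values)
    n = len(ordered)
    return [ordered[(i * n) // k] for i in range(1, k)]

def discretize(value, cuts):
    """Bin index of a continuous value: the number of cut points it exceeds."""
    return sum(value > c for c in cuts)
```

Each continuous feature is mapped to a small set of ordered intervals before induction, which is what makes the choice of cut points so influential on the accuracy and size of the resulting tree.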
5. Ardeshir, G. "Decision tree simplification for classifier ensembles". Thesis, University of Surrey, 2002. http://epubs.surrey.ac.uk/843022/.

Abstract:
Design of ensemble classifiers involves three factors: 1) a learning algorithm to produce a classifier (base classifier), 2) an ensemble method to generate diverse classifiers, and 3) a combining method to combine decisions made by base classifiers. With regard to the first factor, a good choice for constructing a classifier is a decision tree learning algorithm. However, a possible problem with this learning algorithm is its complexity, which has previously been addressed only in the context of pruning methods for individual trees. Furthermore, the ensemble method may require the learning algorithm to produce a complex classifier. Considering the fact that the performance of simplification methods as well as ensemble methods changes from one domain to another, our main contribution is to address a simplification method (post-pruning) in the context of ensemble methods including Bagging, Boosting and Error-Correcting Output Code (ECOC). Using a statistical test, the performance of ensembles made by Bagging, Boosting and ECOC as well as five pruning methods in the context of ensembles is compared. In addition to the implementation, a supporting theory called Margin is discussed and the relationship of pruning to bias and variance is explained. For ECOC, the effect of parameters such as code length and size of training set on the performance of pruning methods is also studied. Decomposition methods such as ECOC are considered as a solution to reduce the complexity of multi-class problems in many real problems such as face recognition. Focusing on the decomposition methods, AdaBoost.OC, which is a combination of Boosting and ECOC, is compared with the pseudo-loss based version of Boosting, AdaBoost.M2. In addition, the influence of pruning on the performance of ensembles is studied. Motivated by the result that both pruned and unpruned ensembles made by AdaBoost.OC have similar accuracy, pruned ensembles are compared with ensembles of single node decision trees. This results in the hypothesis that ensembles of simple classifiers may give better performance, as shown for AdaBoost.OC on the identification problem in face recognition. The implication is that in some problems, to achieve the best accuracy of an ensemble, it is necessary to select the base classifier complexity.
6. Ahmad, Amir. "Data Transformation for Decision Tree Ensembles". Thesis, University of Manchester, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.508528.
7. Cai, Jingfeng. "Decision Tree Pruning Using Expert Knowledge". University of Akron / OhioLINK, 2006. http://rave.ohiolink.edu/etdc/view?acc_num=akron1158279616.
8. Wu, Shuning. "Optimal instance selection for improved decision tree". [Ames, Iowa: Iowa State University], 2007.
9. Sinnamon, Roslyn M. "Binary decision diagrams for fault tree analysis". Thesis, Loughborough University, 1996. https://dspace.lboro.ac.uk/2134/7424.

Abstract:
This thesis develops a new approach to fault tree analysis, namely the Binary Decision Diagram (BDD) method. Conventional qualitative fault tree analysis techniques such as the "top-down" or "bottom-up" approaches are now so well developed that further refinement is unlikely to result in vast improvements in terms of their computational capability. The BDD method has exhibited potential gains to be made in terms of speed and efficiency in determining the minimal cut sets. Further, the nature of the binary decision diagram is such that it is more suited to Boolean manipulation. The BDD method has been programmed and successfully applied to a number of benchmark fault trees. The analysis capabilities of the technique have been extended such that all quantitative fault tree top event parameters, which can be determined by conventional Kinetic Tree Theory, can now be derived directly from the BDD. Parameters such as the top event probability, frequency of occurrence and expected number of occurrences can be calculated exactly using this method, removing the need for the approximations previously required. Thus the BDD method is proven to have advantages in terms of both accuracy and efficiency. Initiator/enabler event analysis and importance measures have been incorporated to extend this method into a full analysis procedure.
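The exact quantification described above rests on Shannon decomposition over the basic events, which is what a BDD traversal evaluates. The brute-force sketch below (the fault tree encoding, event names, and probabilities are invented for illustration) computes the top-event probability exactly; a real BDD obtains the same number without enumerating every assignment, by ordering the events and sharing subgraphs.

```python
# Fault tree as nested tuples: ("AND", ...), ("OR", ...), or a basic-event name.
TOP = ("OR", ("AND", "pump_fails", "valve_fails"), "control_fails")
PROB = {"pump_fails": 0.1, "valve_fails": 0.2, "control_fails": 0.05}

def evaluate(gate, state):
    """Truth value of the top event under an assignment of basic events."""
    if isinstance(gate, str):
        return state[gate]
    op, *children = gate
    results = [evaluate(c, state) for c in children]
    return all(results) if op == "AND" else any(results)

def top_probability(gate, events, state=None):
    """Shannon decomposition: branch on each event with its probability."""
    state = state or {}
    if len(state) == len(events):
        return 1.0 if evaluate(gate, state) else 0.0
    e = events[len(state)]
    p = PROB[e]
    return (p * top_probability(gate, events, {**state, e: True})
            + (1 - p) * top_probability(gate, events, {**state, e: False}))
```

For this toy tree the exact result is 1 - (1 - 0.1*0.2)*(1 - 0.05) = 0.069, with no rare-event or minimal-cut-set approximation involved.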
10. Ho, Colin Kok Meng. "Discretization and defragmentation for decision tree learning". Thesis, University of Essex, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.299072.

Books on the topic "Decision tree"

1. Gladwin, Christina. Ethnographic decision tree modeling. Newbury Park: Sage, 1989.
2. Gladwin, Christina H. Ethnographic decision tree modeling. Newbury Park: Sage, 1989.
3. Euler, Bryan L. EDDT: Emotional Disturbance Decision Tree. Lutz, FL: Psychological Assessment Resources, 2007.
4. Friedman, Ken. The decision tree: A novel. Rainier, Wash.: Heart Pub., 1996.
5. Grąbczewski, Krzysztof. Meta-Learning in Decision Tree Induction. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-00960-5.
6. American Bankers Association. Analyzing financial statements: A decision tree approach. Washington, D.C.: American Bankers Association, 2013.
7. Barros, Rodrigo C., André C. P. L. F. de Carvalho and Alex A. Freitas. Automatic Design of Decision-Tree Induction Algorithms. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-14231-9.
8. Landgrebe, David and United States National Aeronautics and Space Administration, eds. A survey of decision tree classifier methodology. West Lafayette, Ind.: School of Electrical Engineering, Purdue University, 1990.
9. Medical conditions and massage therapy: A decision tree approach. Philadelphia: Wolters Kluwer/Lippincott Williams & Wilkins Health, 2010.
10. National Flood Proofing Committee (U.S.), ed. Flood proofing: How to evaluate your options: decision tree. [Fort Belvoir, Va.?]: US Army Corps of Engineers, National Flood Proofing Committee, 1995.

Book chapters on the topic "Decision tree"

1. Ayyadevara, V. Kishore. "Decision Tree". In Pro Machine Learning Algorithms, 71–103. Berkeley, CA: Apress, 2018. http://dx.doi.org/10.1007/978-1-4842-3564-5_4.
2. Nahler, Gerhard. "decision tree". In Dictionary of Pharmaceutical Medicine, 48. Vienna: Springer Vienna, 2009. http://dx.doi.org/10.1007/978-3-211-89836-9_366.
3. Webb, Geoffrey I., Johannes Fürnkranz, Geoffrey Hinton, Claude Sammut, Joerg Sander et al. "Decision Tree". In Encyclopedia of Machine Learning, 263–67. Boston, MA: Springer US, 2011. http://dx.doi.org/10.1007/978-0-387-30164-8_204.
4. Berrar, Daniel and Werner Dubitzky. "Decision Tree". In Encyclopedia of Systems Biology, 551–55. New York, NY: Springer New York, 2013. http://dx.doi.org/10.1007/978-1-4419-9863-7_611.
5. Panda, Rajendra Mohan and B. S. Daya Sagar. "Decision Tree". In Encyclopedia of Mathematical Geosciences, 1–7. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-26050-7_81-2.
6. Panda, Rajendra Mohan and B. S. Daya Sagar. "Decision Tree". In Encyclopedia of Mathematical Geosciences, 1–6. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-26050-7_81-1.
7. Jo, Taeho. "Decision Tree". In Machine Learning Foundations, 141–65. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-65900-4_7.
8. Fürnkranz, Johannes. "Decision Tree". In Encyclopedia of Machine Learning and Data Mining, 1–5. Boston, MA: Springer US, 2016. http://dx.doi.org/10.1007/978-1-4899-7502-7_66-1.
9. Fürnkranz, Johannes. "Decision Tree". In Encyclopedia of Machine Learning and Data Mining, 330–35. Boston, MA: Springer US, 2017. http://dx.doi.org/10.1007/978-1-4899-7687-1_66.
10. Bandyopadhyay, Susmita. "Decision Tree". In Decision Support System, 7–22. Boca Raton: CRC Press, 2023. http://dx.doi.org/10.1201/9781003307655-2.

Conference proceedings on the topic "Decision tree"

1. Yawata, Koichiro, Yoshihiro Osakabe, Takuya Okuyama and Akinori Asahara. "QUBO Decision Tree: Annealing Machine Extends Decision Tree Splitting". In 2022 IEEE International Conference on Knowledge Graph (ICKG). IEEE, 2022. http://dx.doi.org/10.1109/ickg55886.2022.00052.
2. Desai, Ankit and Sanjay Chaudhary. "Distributed Decision Tree". In ACM COMPUTE '16: Ninth Annual ACM India Conference. New York, NY, USA: ACM, 2016. http://dx.doi.org/10.1145/2998476.2998478.
3. Gavankar, Sachin S. and Sudhirkumar D. Sawarkar. "Eager decision tree". In 2017 2nd International Conference for Convergence in Technology (I2CT). IEEE, 2017. http://dx.doi.org/10.1109/i2ct.2017.8226246.
4. Nowozin, Sebastian, Carsten Rother, Shai Bagon, Toby Sharp, Bangpeng Yao and Pushmeet Kohli. "Decision tree fields". In 2011 IEEE International Conference on Computer Vision (ICCV). IEEE, 2011. http://dx.doi.org/10.1109/iccv.2011.6126429.
5. Meng, Qing-wu, Qiang He, Ning Li, Xiang-ran Du and Li-na Su. "Crisp Decision Tree Induction Based on Fuzzy Decision Tree Algorithm". In 2009 First International Conference on Information Science and Engineering. IEEE, 2009. http://dx.doi.org/10.1109/icise.2009.440.
6. Huang, Sieh-Chuen, Hsuan-Lei Shao and Robert B. Leflar. "Applying decision tree analysis to family court decisions". In ICAIL '21: Eighteenth International Conference for Artificial Intelligence and Law. New York, NY, USA: ACM, 2021. http://dx.doi.org/10.1145/3462757.3466076.
7. Couvreur, Jean-Michel and Duy-Tung Nguyen. "Tree Data Decision Diagrams". In Second International Workshop on Verification and Evaluation of Computer and Communication Systems (VECoS 2008). BCS Learning & Development, 2008. http://dx.doi.org/10.14236/ewic/vecos2008.3.
8. Ali, Mohd Mahmood, M. S. Qaseem, Lakshmi Rajamani and A. Govardhan. "Improved decision tree induction". In the Second International Conference. New York, New York, USA: ACM Press, 2012. http://dx.doi.org/10.1145/2393216.2393346.
9. Jia, Wenguang and LiJing Huang. "Improved C4.5 Decision Tree". In 2010 International Conference on Internet Technology and Applications (iTAP 2010). IEEE, 2010. http://dx.doi.org/10.1109/itapp.2010.5566133.
10. Manapragada, Chaitanya, Geoffrey I. Webb and Mahsa Salehi. "Extremely Fast Decision Tree". In KDD '18: The 24th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York, NY, USA: ACM, 2018. http://dx.doi.org/10.1145/3219819.3220005.

Reports on the topic "Decision tree"

1. Hamilton, Jill and Tuan Nguyen. Asbestos Inspection/Reinspection Decision Tree. Fort Belvoir, VA: Defense Technical Information Center, October 1999. http://dx.doi.org/10.21236/ada370454.
2. Narlikar, Girija J. A Parallel, Multithreaded Decision Tree Builder. Fort Belvoir, VA: Defense Technical Information Center, December 1998. http://dx.doi.org/10.21236/ada363531.
3. Quiller, Ryan. Decision Tree Technique for Particle Identification. Office of Scientific and Technical Information (OSTI), September 2003. http://dx.doi.org/10.2172/815649.
4. Mughal, Mohamed. Biological Weapons Response Template and Decision Tree. Fort Belvoir, VA: Defense Technical Information Center, April 2001. http://dx.doi.org/10.21236/ada385897.
5. Dakin, Gordon and Sankar Virdhagriswaran. Misleading Information Detection Through Probabilistic Decision Tree Classifiers. Fort Belvoir, VA: Defense Technical Information Center, September 2002. http://dx.doi.org/10.21236/ada406823.
6. Kwon, Theresa Hyunjin, Erin Cho and Youn-Kyung Kim. Identifying Sustainable Style Consumers with Decision Tree Predictive Model. Ames: Iowa State University, Digital Repository, November 2016. http://dx.doi.org/10.31274/itaa_proceedings-180814-1366.
7. Eccleston, C. H. The decision-identification tree: A new EIS scoping tool. Office of Scientific and Technical Information (OSTI), April 1997. http://dx.doi.org/10.2172/16876.
8. Mikulski, Dariusz G. Rough Set Based Splitting Criterion for Binary Decision Tree Classifiers. Fort Belvoir, VA: Defense Technical Information Center, September 2006. http://dx.doi.org/10.21236/ada489077.
9. Song, So Young, Erin Cho, Youn-Kyung Kim and Theresa Hyunjin Kwon. Clothing Communication via Social Media: A Decision Tree Predictive Model. Ames: Iowa State University, Digital Repository, November 2015. http://dx.doi.org/10.31274/itaa_proceedings-180814-102.
10. Zaman, Md Mostafa, Theresa Hyunjin Kwon, Katrina Laemmerhirt and Youn-Kyung Kim. Profiling Second-hand Clothing Shoppers with Decision Tree Predictive Model. Ames: Iowa State University, Digital Repository, 2017. http://dx.doi.org/10.31274/itaa_proceedings-180814-407.