Ready-made bibliography on the topic "Decision tree"

Create an accurate reference in APA, MLA, Chicago, Harvard, and many other citation styles


Browse lists of current articles, books, theses, abstracts, and other scholarly sources on the topic "Decision tree".

An "Add to bibliography" button is available next to every work in the bibliography. Use it, and we will automatically create a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication as a ".pdf" file and read its abstract online, whenever the relevant parameters are available in the work's metadata.

Journal articles on the topic "Decision tree"

1. TOFAN, Cezarina Adina. "Optimization Techniques of Decision Making - Decision Tree". Advances in Social Sciences Research Journal 1, no. 5 (September 30, 2014): 142–48. http://dx.doi.org/10.14738/assrj.15.437.

2. Naylor, Mike. "Decision Tree". Mathematics Teacher: Learning and Teaching PK-12 113, no. 7 (July 2020): 612. http://dx.doi.org/10.5951/mtlt.2020.0081.

3. Oo, Aung Nway, and Thin Naing. "Decision Tree Models for Medical Diagnosis". International Journal of Trend in Scientific Research and Development Volume-3, Issue-3 (April 30, 2019): 1697–99. http://dx.doi.org/10.31142/ijtsrd23510.

4. BRESLOW, LEONARD A., and DAVID W. AHA. "Simplifying decision trees: A survey". Knowledge Engineering Review 12, no. 01 (January 1997): 1–40. http://dx.doi.org/10.1017/s0269888997000015.

Abstract:
Induced decision trees are an extensively-researched solution to classification tasks. For many practical tasks, the trees produced by tree-generation algorithms are not comprehensible to users due to their size and complexity. Although many tree induction algorithms have been shown to produce simpler, more comprehensible trees (or data structures derived from trees) with good classification accuracy, tree simplification has usually been of secondary concern relative to accuracy, and no attempt has been made to survey the literature from the perspective of simplification. We present a framework that organizes the approaches to tree simplification and summarize and critique the approaches within this framework. The purpose of this survey is to provide researchers and practitioners with a concise overview of tree-simplification approaches and insight into their relative capabilities. In our final discussion, we briefly describe some empirical findings and discuss the application of tree induction algorithms to case retrieval in case-based reasoning systems.
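
One classic simplification approach in the surveyed family is reduced-error pruning: a subtree is collapsed into a leaf whenever the leaf classifies a held-out pruning set no worse than the subtree does. A minimal Python sketch, assuming a hypothetical Node structure; this illustrates the general technique, not the paper's own algorithm:

from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Node:
    majority: int                   # majority training class at this node
    feature: Optional[int] = None   # None means the node is a leaf
    threshold: float = 0.0          # test: x[feature] <= threshold
    left: Optional["Node"] = None
    right: Optional["Node"] = None

def predict(node: Node, x: List[float]) -> int:
    while node.feature is not None:
        node = node.left if x[node.feature] <= node.threshold else node.right
    return node.majority

def subtree_errors(node: Node, data: List[Tuple[List[float], int]]) -> int:
    return sum(predict(node, x) != y for x, y in data)

def reduced_error_prune(node: Node, val: List[Tuple[List[float], int]]) -> Node:
    """Bottom-up pass: collapse a subtree into a leaf whenever the leaf
    makes no more errors on the held-out pruning set than the subtree."""
    if node.feature is None or not val:
        return node
    node.left = reduced_error_prune(
        node.left, [(x, y) for x, y in val if x[node.feature] <= node.threshold])
    node.right = reduced_error_prune(
        node.right, [(x, y) for x, y in val if x[node.feature] > node.threshold])
    leaf_errors = sum(y != node.majority for _, y in val)
    if leaf_errors <= subtree_errors(node, val):
        return Node(majority=node.majority)   # simpler tree, no worse error
    return node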
5. ZANTEMA, HANS, and HANS L. BODLAENDER. "SIZES OF ORDERED DECISION TREES". International Journal of Foundations of Computer Science 13, no. 03 (June 2002): 445–58. http://dx.doi.org/10.1142/s0129054102001205.

Abstract:
Decision tables provide a natural framework for knowledge acquisition and representation in the area of knowledge based information systems. Decision trees provide a standard method for inductive inference in the area of machine learning. In this paper we show how decision tables can be considered as ordered decision trees: decision trees satisfying an ordering restriction on the nodes. Every decision tree can be represented by an equivalent ordered decision tree, but we show that doing so may exponentially blow up sizes, even if the choice of the order is left free. Our main result states that finding an ordered decision tree of minimal size that represents the same function as a given ordered decision tree is an NP-hard problem; in earlier work we obtained a similar result for unordered decision trees.
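
The ordering restriction described in the abstract can be checked mechanically: along every root-to-leaf path, the indices of the tested variables must respect one fixed order. A minimal Python sketch with a hypothetical node encoding, not the paper's formalism:

from dataclasses import dataclass
from typing import Union

@dataclass
class Leaf:
    value: bool

@dataclass
class Node:
    var: int                      # index of the (boolean) decision variable
    low: Union["Node", Leaf]      # subtree for var = 0
    high: Union["Node", Leaf]     # subtree for var = 1

def is_ordered(tree: Union[Node, Leaf], last: int = -1) -> bool:
    """True iff variable indices strictly increase on every path."""
    if isinstance(tree, Leaf):
        return True
    return (tree.var > last
            and is_ordered(tree.low, tree.var)
            and is_ordered(tree.high, tree.var))

# Example: tests x0, then x1 on one branch; swapping the order would fail.
t = Node(0, Leaf(False), Node(1, Leaf(False), Leaf(True)))   # x0 AND x1
assert is_ordered(t)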
6. TOFAN, Cezarina Adina. "Method of decision tree applied in adopting the decision for promoting a company". Annals of "Spiru Haret". Economic Series 15, no. 3 (September 30, 2015): 47. http://dx.doi.org/10.26458/1535.

Abstract:
The decision can be defined as the way chosen from several possible ways to achieve an objective. An important role in the functioning of the decisional-informational system is held by the decision-making methods. Decision trees prove to be very useful tools for making financial or quantitative decisions, where a large amount of complex information must be considered. They provide an effective structure in which alternative decisions and the implications of their choice can be assessed, and they help to form a correct and balanced view of the risks and rewards that may result from a certain choice. For these reasons, this communication reviews a series of decision-making criteria and analyses the benefits of using the decision tree method in the decision-making process through a numerical example. On this basis, it can be concluded that the procedure may prove useful in making decisions for companies operating on markets where the intensity of competition is differentiated.
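
The numerical evaluation behind such a decision tree reduces to computing an expected value for each alternative and choosing the largest. A small worked sketch; the alternatives, probabilities and payoffs are hypothetical, not the paper's example:

# Expected-monetary-value evaluation of decision alternatives, the standard
# calculation behind decision-tree analysis of the kind the abstract
# describes. All figures below are made up for illustration.
alternatives = {
    "aggressive promotion": [(0.6, 120_000), (0.4, -30_000)],   # (prob, payoff)
    "modest promotion":     [(0.7, 50_000),  (0.3, 10_000)],
    "no promotion":         [(1.0, 0)],
}

def expected_value(outcomes):
    return sum(p * payoff for p, payoff in outcomes)

for name, outcomes in alternatives.items():
    print(f"{name}: EMV = {expected_value(outcomes):,.0f}")

best = max(alternatives, key=lambda a: expected_value(alternatives[a]))
print("choose:", best)   # aggressive promotion (EMV 60,000 vs 38,000 vs 0)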
7. Cockett, J. R. B. "Decision Expression Optimization". Fundamenta Informaticae 10, no. 1 (January 1, 1987): 93–114. http://dx.doi.org/10.3233/fi-1987-10107.

Abstract:
A basic concern when using decision trees for the solution of taxonomic or similar problems is their efficiency. Often the information that is required to completely optimize a tree is simply not available. This is especially the case when a criterion based on probabilities is used. It is shown how it is often possible, despite the absence of this information, to improve the design of the tree. The approach is based on algebraic methods for manipulating decision trees and the identification of some particularly desirable forms.
8. Yun, Jooyeol, Jun won Seo, and Taeseon Yoon. "Fuzzy Decision Tree". International Journal of Fuzzy Logic Systems 4, no. 3 (July 31, 2014): 7–11. http://dx.doi.org/10.5121/ijfls.2014.4302.

9. Manwani, N., and P. S. Sastry. "Geometric Decision Tree". IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics) 42, no. 1 (February 2012): 181–92. http://dx.doi.org/10.1109/tsmcb.2011.2163392.

10. Zhou, Zhi-Hua, and Zhao-Qian Chen. "Hybrid decision tree". Knowledge-Based Systems 15, no. 8 (November 2002): 515–28. http://dx.doi.org/10.1016/s0950-7051(02)00038-2.


Doctoral dissertations on the topic "Decision tree"

1. Shi, Haijian. "Best-first Decision Tree Learning". The University of Waikato, 2007. http://hdl.handle.net/10289/2317.

Abstract:
In best-first top-down induction of decision trees, the best split is added in each step (e.g. the split that maximally reduces the Gini index), in contrast to the standard depth-first traversal of a tree. The resulting tree is the same; only the order in which it is built differs. The objective of this project is to investigate whether an appropriate tree size can be determined on practical datasets by combining best-first decision tree growth with cross-validation-based selection of the number of expansions that are performed. Pre-pruning, post-pruning and CART-pruning can be performed in this framework and compared.
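
A minimal sketch of the expansion order the abstract describes: candidate splits wait in a priority queue keyed by Gini reduction, and the globally best node is expanded at each step, so growth can be cut off after a chosen number of expansions. The helper names are hypothetical; the thesis's actual implementation differs in detail:

import heapq
from collections import Counter

def gini(ys):
    n = len(ys)
    if n == 0:
        return 0.0
    return 1.0 - sum((c / n) ** 2 for c in Counter(ys).values())

def best_split(X, ys):
    """Return (gini_reduction, feature, threshold, left_idx, right_idx) or None."""
    best = None
    for f in range(len(X[0])):
        for t in sorted({x[f] for x in X}):
            li = [i for i, x in enumerate(X) if x[f] <= t]
            ri = [i for i, x in enumerate(X) if x[f] > t]
            if not li or not ri:
                continue
            g = gini(ys) - (len(li) * gini([ys[i] for i in li]) +
                            len(ri) * gini([ys[i] for i in ri])) / len(ys)
            if best is None or g > best[0]:
                best = (g, f, t, li, ri)
    return best

def best_first_tree(X, ys, max_expansions=5):
    root = {"idx": list(range(len(ys)))}          # leaves keep their row indices
    heap, tick = [], 0                            # tick breaks ties in the heap
    split = best_split(X, ys)
    if split:
        heapq.heappush(heap, (-split[0], tick, root, split))
    for _ in range(max_expansions):               # best-first: always pop the
        if not heap:                              # globally best pending split
            break
        _, _, node, (g, f, t, li, ri) = heapq.heappop(heap)
        node["feature"], node["threshold"] = f, t
        for side, local in (("left", li), ("right", ri)):
            child = {"idx": [node["idx"][i] for i in local]}
            node[side] = child
            cx = [X[i] for i in child["idx"]]
            cy = [ys[i] for i in child["idx"]]
            sub = best_split(cx, cy)
            if sub:
                tick += 1
                heapq.heappush(heap, (-sub[0], tick, child, sub))
    return root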
2. Vella, Alan. "Hyper-heuristic decision tree induction". Thesis, Heriot-Watt University, 2012. http://hdl.handle.net/10399/2540.

Abstract:
A hyper-heuristic is any algorithm that searches or operates in the space of heuristics, as opposed to the space of solutions. Hyper-heuristics are increasingly used in function and combinatorial optimization. Rather than attempt to solve a problem using a fixed heuristic, a hyper-heuristic approach attempts to find a combination of heuristics that solves a problem (and in turn may be directly suitable for a class of problem instances). Hyper-heuristics have been little explored in data mining. This work presents novel hyper-heuristic approaches to data mining, searching a space of attribute selection criteria for decision tree building algorithms. The search is conducted by a genetic algorithm. The result of the hyper-heuristic search in this case is a strategy for selecting attributes while building decision trees. Most hyper-heuristics work by trying to adapt the heuristic to the state of the problem being solved, and ours is no different: it employs a strategy for adapting the heuristic used to build decision tree nodes according to a set of features of the training set it is working on. We introduce, explore and evaluate five different ways in which this problem state can be represented for a hyper-heuristic that operates within a decision tree building algorithm. In each case, the hyper-heuristic is guided by a rule set that tries to map features of the data set to be split by the decision tree building algorithm to a heuristic to be used for splitting the same data set. We also explore and evaluate three different sets of low-level heuristics that could be employed by such a hyper-heuristic. This work also makes a distinction between specialist hyper-heuristics and generalist hyper-heuristics; the main difference between the two is the number of training sets used by the hyper-heuristic genetic algorithm. Specialist hyper-heuristics are created using a single data set from a particular domain for evolving the hyper-heuristic rule set, and are expected to outperform standard algorithms on the kind of data set used by the genetic algorithm. Generalist hyper-heuristics are trained on multiple data sets from different domains and are expected to deliver robust, competitive performance over these data sets when compared to standard algorithms. We evaluate both approaches for each kind of hyper-heuristic presented in this thesis, using both real and synthetic data sets. Our results suggest that none of the hyper-heuristics presented in this work are suited for specialization: in most cases, the hyper-heuristic's performance on the data set it was specialized for was not significantly better than that of the best performing standard algorithm. On the other hand, the generalist hyper-heuristics delivered results that were very competitive with the best standard methods, in some cases even achieving a significantly better overall performance than all of the standard methods.
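
A minimal sketch of the core mechanism: a rule set maps features of the data set about to be split to one of several low-level split heuristics. The concrete rules and thresholds below are hand-written, hypothetical placeholders; in the thesis such rule sets are evolved by a genetic algorithm:

import math
from collections import Counter

def gini(ys):
    n = len(ys)
    return 1.0 - sum((c / n) ** 2 for c in Counter(ys).values())

def entropy(ys):
    n = len(ys)
    return -sum((c / n) * math.log2(c / n) for c in Counter(ys).values())

LOW_LEVEL_HEURISTICS = {"gini": gini, "entropy": entropy}

def choose_heuristic(X, ys):
    """Map the problem state (features of the data reaching a node) to a
    low-level split heuristic. These rules are hypothetical; the thesis
    evolves such rule sets with a genetic algorithm."""
    if len(ys) < 100:
        return LOW_LEVEL_HEURISTICS["gini"]      # small node: cheap criterion
    if len(set(ys)) > 5:
        return LOW_LEVEL_HEURISTICS["entropy"]   # many classes present
    return LOW_LEVEL_HEURISTICS["gini"]

# Inside an otherwise standard inducer, each node's split criterion is then
# looked up from the state of the data that reaches it:
#     impurity = choose_heuristic(X_node, y_node)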
3. Vukobratović, Bogdan. "Hardware Acceleration of Nonincremental Algorithms for the Induction of Decision Trees and Decision Tree Ensembles". PhD thesis, Univerzitet u Novom Sadu, Fakultet tehničkih nauka u Novom Sadu, 2017. https://www.cris.uns.ac.rs/record.jsf?recordId=102520&source=NDLTD&language=en.

Abstract:
The thesis proposes novel full decision tree and decision tree ensemble induction algorithms, EFTI and EEFTI, and explores various possibilities for their implementation. The experiments show that the proposed EFTI algorithm is able to infer much smaller DTs on average, without significant loss in accuracy, when compared to top-down incremental DT inducers. On the other hand, when compared to other full tree induction algorithms, it was able to produce more accurate DTs, with similar sizes, in shorter times. Hardware architectures for the acceleration of these algorithms (EFTIP and EEFTIP) are also proposed, and the experiments show that they can offer substantial speedups.
4. Qureshi, Taimur. "Contributions to decision tree based learning". Thesis, Lyon 2, 2010. http://www.theses.fr/2010LYO20051/document.

Abstract:
Advances in data collection methods, storage and processing technology are providing a unique challenge and opportunity for automated data learning techniques, which aim at producing high-level information, or models, from data. A typical knowledge discovery process consists of data selection, data preparation, data transformation, data mining and interpretation/validation of the results. Thus, we develop automatic learning techniques which contribute to the data preparation, transformation and mining tasks of knowledge discovery. In doing so, we try to improve the prediction accuracy of the overall learning process. Our work focuses on decision tree based learning, and we introduce various preprocessing and transformation techniques such as discretization, fuzzy partitioning and dimensionality reduction to improve this type of learning. These techniques can also be used in other learning methods, e.g. discretization can serve naive-Bayes classifiers. The data preparation step represents almost 80 percent of the problem and is both time consuming and critical for the quality of modeling. Discretization of continuous features is an important problem that affects the accuracy, complexity, variance and understandability of the induction models. In this thesis, we propose and develop resampling based aggregation techniques that improve the quality of discretization, and we validate them by comparing with other discretization techniques and with an optimal partitioning method on 10 benchmark data sets. The second part of our thesis concerns automatic fuzzy partitioning for soft decision tree induction. A soft or fuzzy decision tree is an extension of classical crisp tree induction in which fuzzy logic is embedded into the induction process, yielding more accurate models with reduced variance that remain interpretable and autonomous. We modify the above resampling based partitioning method to generate fuzzy partitions. In addition, we propose, develop and validate another fuzzy partitioning method that improves the accuracy of the decision tree. Finally, we adopt a topological learning scheme and perform non-linear dimensionality reduction, modifying an existing manifold learning based technique to see whether it can enhance the predictive power and interpretability of classification.
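
A minimal sketch of the resampling-based aggregation idea for discretization: cut points are computed on bootstrap resamples and then aggregated, here by averaging equal-frequency boundaries. The base discretizer and the averaging rule are illustrative assumptions, not the thesis's method:

import random

def equal_frequency_cuts(values, n_bins):
    """Cut points that put roughly the same number of values in each bin."""
    s = sorted(values)
    return [s[(i * len(s)) // n_bins] for i in range(1, n_bins)]

def resampled_cuts(values, n_bins=4, n_resamples=50, seed=0):
    """Aggregate cut points over bootstrap resamples for a more stable
    partition than a single pass over the data."""
    rng = random.Random(seed)
    all_cuts = [
        equal_frequency_cuts(rng.choices(values, k=len(values)), n_bins)
        for _ in range(n_resamples)
    ]
    # Aggregate: average the i-th cut point across all resamples.
    return [sum(cs[i] for cs in all_cuts) / n_resamples
            for i in range(n_bins - 1)]

rng = random.Random(1)
values = [rng.gauss(0.0, 1.0) for _ in range(200)]
print(resampled_cuts(values))   # three aggregated cut points for 4 bins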
5. Ardeshir, G. "Decision tree simplification for classifier ensembles". Thesis, University of Surrey, 2002. http://epubs.surrey.ac.uk/843022/.

Abstract:
Design of ensemble classifiers involves three factors: 1) a learning algorithm to produce a classifier (base classifier), 2) an ensemble method to generate diverse classifiers, and 3) a combining method to combine the decisions made by base classifiers. With regard to the first factor, a good choice for constructing a classifier is a decision tree learning algorithm. However, a possible problem with this learning algorithm is its complexity, which has previously been addressed only in the context of pruning methods for individual trees; furthermore, the ensemble method may require the learning algorithm to produce a complex classifier. Considering the fact that the performance of simplification methods as well as ensemble methods changes from one domain to another, our main contribution is to address a simplification method (post-pruning) in the context of ensemble methods including Bagging, Boosting and Error-Correcting Output Codes (ECOC). Using a statistical test, the performance of ensembles made by Bagging, Boosting and ECOC, as well as five pruning methods in the context of ensembles, is compared. In addition to the implementation, a supporting theory, called margin, is discussed, and the relationship of pruning to bias and variance is explained. For ECOC, the effect of parameters such as code length and size of the training set on the performance of pruning methods is also studied. Decomposition methods such as ECOC are considered a solution for reducing the complexity of multi-class problems in many real applications such as face recognition. Focusing on the decomposition methods, AdaBoost.OC, which is a combination of Boosting and ECOC, is compared with the pseudo-loss based version of Boosting, AdaBoost.M2. In addition, the influence of pruning on the performance of ensembles is studied. Motivated by the result that both pruned and unpruned ensembles made by AdaBoost.OC have similar accuracy, pruned ensembles are compared with ensembles of single-node decision trees. This results in the hypothesis that ensembles of simple classifiers may give better performance, as shown for AdaBoost.OC on the identification problem in face recognition. The implication is that in some problems, to achieve the best accuracy of an ensemble, it is necessary to select the base classifier complexity.
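
A minimal sketch of the ECOC decomposition discussed in the abstract: each class receives a binary codeword, one binary classifier is trained per code bit, and a prediction is decoded by nearest Hamming distance. The codebook and the stub classifiers below are hypothetical:

# Error-Correcting Output Codes (ECOC) decoding. A test point is labeled
# with the class whose codeword is nearest, in Hamming distance, to the
# outputs of the per-bit classifiers (in the thesis, decision trees).
CODEBOOK = {          # class -> 5-bit codeword (hypothetical)
    "A": (0, 0, 1, 1, 0),
    "B": (1, 0, 0, 1, 1),
    "C": (0, 1, 1, 0, 1),
}

def hamming(u, v):
    return sum(a != b for a, b in zip(u, v))

def ecoc_predict(bit_classifiers, x):
    """bit_classifiers: one per code bit, each mapping x -> 0/1."""
    output = tuple(clf(x) for clf in bit_classifiers)
    return min(CODEBOOK, key=lambda cls: hamming(CODEBOOK[cls], output))

# Toy usage: stubs that jointly output (1, 0, 0, 0, 1), which lies at
# Hamming distance 1 from class "B"'s codeword.
stubs = [lambda x: 1, lambda x: 0, lambda x: 0, lambda x: 0, lambda x: 1]
print(ecoc_predict(stubs, x=None))   # -> "B"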
6. Ahmad, Amir. "Data Transformation for Decision Tree Ensembles". Thesis, University of Manchester, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.508528.

7. Cai, Jingfeng. "Decision Tree Pruning Using Expert Knowledge". University of Akron / OhioLINK, 2006. http://rave.ohiolink.edu/etdc/view?acc_num=akron1158279616.

8. Wu, Shuning. "Optimal instance selection for improved decision tree". [Ames, Iowa: Iowa State University], 2007.

9. Sinnamon, Roslyn M. "Binary decision diagrams for fault tree analysis". Thesis, Loughborough University, 1996. https://dspace.lboro.ac.uk/2134/7424.

Abstract:
This thesis develops a new approach to fault tree analysis, namely the Binary Decision Diagram (BDD) method. Conventional qualitative fault tree analysis techniques such as the "top-down" or "bottom-up" approaches are now so well developed that further refinement is unlikely to result in vast improvements in terms of their computational capability. The BDD method has exhibited potential gains to be made in terms of speed and efficiency in determining the minimal cut sets. Further, the nature of the binary decision diagram is such that it is more suited to Boolean manipulation. The BDD method has been programmed and successfully applied to a number of benchmark fault trees. The analysis capabilities of the technique have been extended such that all quantitative fault tree top event parameters, which can be determined by conventional Kinetic Tree Theory, can now be derived directly from the BDD. Parameters such as the top event probability, frequency of occurrence and expected number of occurrences can be calculated exactly using this method, removing the need for the approximations previously required. Thus the BDD method is proven to have advantages in terms of both accuracy and efficiency. Initiator/enabler event analysis and importance measures have been incorporated to extend this method into a full analysis procedure.
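
A minimal sketch of the exact quantification the abstract refers to: the top event probability follows from a Shannon decomposition over the BDD, with no cut-set approximations. The tuple encoding of nodes and the component probabilities are hypothetical:

# Exact top-event probability from a BDD via Shannon decomposition:
#   P(node) = p(x) * P(high child) + (1 - p(x)) * P(low child)
# Terminals are True/False; internal nodes are (var, low, high) tuples.

# Basic-event failure probabilities for a toy fault tree TOP = A AND (B OR C).
P = {"A": 0.01, "B": 0.05, "C": 0.02}

# BDD with variable order A < B < C for the toy tree above.
bdd = ("A", False, ("B", ("C", False, True), True))

def top_event_probability(node):
    if node is True:
        return 1.0
    if node is False:
        return 0.0
    var, low, high = node
    p = P[var]
    return p * top_event_probability(high) + (1 - p) * top_event_probability(low)

# Exact result: P(A) * (P(B) + P(C) - P(B)P(C)) = 0.01 * 0.069 = 0.00069
print(top_event_probability(bdd))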
10. Ho, Colin Kok Meng. "Discretization and defragmentation for decision tree learning". Thesis, University of Essex, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.299072.


Books on the topic "Decision tree"

1. Gladwin, Christina. Ethnographic decision tree modeling. Newbury Park: Sage, 1989.

2. Gladwin, Christina H. Ethnographic decision tree modeling. Newbury Park: Sage, 1989.

3. Euler, Bryan L. EDDT: Emotional Disturbance Decision Tree. Lutz, FL: Psychological Assessment Resources, 2007.

4. Friedman, Ken. The decision tree: A novel. Rainier, Wash.: Heart Pub., 1996.

5. Grąbczewski, Krzysztof. Meta-Learning in Decision Tree Induction. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-00960-5.

6. American Bankers Association. Analyzing financial statements: A decision tree approach. Washington, D.C.: American Bankers Association, 2013.

7. Barros, Rodrigo C., André C. P. L. F. de Carvalho, and Alex A. Freitas. Automatic Design of Decision-Tree Induction Algorithms. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-14231-9.

8. Landgrebe, David, and United States National Aeronautics and Space Administration, eds. A survey of decision tree classifier methodology. West Lafayette, Ind.: School of Electrical Engineering, Purdue University, 1990.

9. Medical conditions and massage therapy: A decision tree approach. Philadelphia: Wolters Kluwer/Lippincott Williams & Wilkins Health, 2010.

10. National Flood Proofing Committee (U.S.), ed. Flood proofing: How to evaluate your options: decision tree. [Fort Belvoir, Va.?]: US Army Corps of Engineers, National Flood Proofing Committee, 1995.


Book chapters on the topic "Decision tree"

1. Ayyadevara, V. Kishore. "Decision Tree". In Pro Machine Learning Algorithms, 71–103. Berkeley, CA: Apress, 2018. http://dx.doi.org/10.1007/978-1-4842-3564-5_4.

2. Nahler, Gerhard. "decision tree". In Dictionary of Pharmaceutical Medicine, 48. Vienna: Springer Vienna, 2009. http://dx.doi.org/10.1007/978-3-211-89836-9_366.

3. Webb, Geoffrey I., Johannes Fürnkranz, Geoffrey Hinton, Claude Sammut, Joerg Sander et al. "Decision Tree". In Encyclopedia of Machine Learning, 263–67. Boston, MA: Springer US, 2011. http://dx.doi.org/10.1007/978-0-387-30164-8_204.

4. Berrar, Daniel, and Werner Dubitzky. "Decision Tree". In Encyclopedia of Systems Biology, 551–55. New York, NY: Springer New York, 2013. http://dx.doi.org/10.1007/978-1-4419-9863-7_611.

5. Panda, Rajendra Mohan, and B. S. Daya Sagar. "Decision Tree". In Encyclopedia of Mathematical Geosciences, 1–7. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-26050-7_81-2.

6. Panda, Rajendra Mohan, and B. S. Daya Sagar. "Decision Tree". In Encyclopedia of Mathematical Geosciences, 1–6. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-26050-7_81-1.

7. Jo, Taeho. "Decision Tree". In Machine Learning Foundations, 141–65. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-65900-4_7.

8. Fürnkranz, Johannes. "Decision Tree". In Encyclopedia of Machine Learning and Data Mining, 1–5. Boston, MA: Springer US, 2016. http://dx.doi.org/10.1007/978-1-4899-7502-7_66-1.

9. Fürnkranz, Johannes. "Decision Tree". In Encyclopedia of Machine Learning and Data Mining, 330–35. Boston, MA: Springer US, 2017. http://dx.doi.org/10.1007/978-1-4899-7687-1_66.

10. Bandyopadhyay, Susmita. "Decision Tree". In Decision Support System, 7–22. Boca Raton: CRC Press, 2023. http://dx.doi.org/10.1201/9781003307655-2.


Conference papers on the topic "Decision tree"

1. Yawata, Koichiro, Yoshihiro Osakabe, Takuya Okuyama, and Akinori Asahara. "QUBO Decision Tree: Annealing Machine Extends Decision Tree Splitting". In 2022 IEEE International Conference on Knowledge Graph (ICKG). IEEE, 2022. http://dx.doi.org/10.1109/ickg55886.2022.00052.

2. Desai, Ankit, and Sanjay Chaudhary. "Distributed Decision Tree". In ACM COMPUTE '16: Ninth Annual ACM India Conference. New York, NY, USA: ACM, 2016. http://dx.doi.org/10.1145/2998476.2998478.

3. Gavankar, Sachin S., and Sudhirkumar D. Sawarkar. "Eager decision tree". In 2017 2nd International Conference for Convergence in Technology (I2CT). IEEE, 2017. http://dx.doi.org/10.1109/i2ct.2017.8226246.

4. Nowozin, Sebastian, Carsten Rother, Shai Bagon, Toby Sharp, Bangpeng Yao, and Pushmeet Kohli. "Decision tree fields". In 2011 IEEE International Conference on Computer Vision (ICCV). IEEE, 2011. http://dx.doi.org/10.1109/iccv.2011.6126429.

5. Meng, Qing-wu, Qiang He, Ning Li, Xiang-ran Du, and Li-na Su. "Crisp Decision Tree Induction Based on Fuzzy Decision Tree Algorithm". In 2009 First International Conference on Information Science and Engineering. IEEE, 2009. http://dx.doi.org/10.1109/icise.2009.440.

6. Huang, Sieh-Chuen, Hsuan-Lei Shao, and Robert B. Leflar. "Applying decision tree analysis to family court decisions". In ICAIL '21: Eighteenth International Conference for Artificial Intelligence and Law. New York, NY, USA: ACM, 2021. http://dx.doi.org/10.1145/3462757.3466076.

7. COUVREUR, Jean-Michel, and Duy-Tung NGUYEN. "Tree Data Decision Diagrams". In Second International Workshop on Verification and Evaluation of Computer and Communication Systems (VECoS 2008). BCS Learning & Development, 2008. http://dx.doi.org/10.14236/ewic/vecos2008.3.

8. Ali, Mohd Mahmood, M. S. Qaseem, Lakshmi Rajamani, and A. Govardhan. "Improved decision tree induction". In the Second International Conference. New York, New York, USA: ACM Press, 2012. http://dx.doi.org/10.1145/2393216.2393346.

9. Jia, Wenguang, and LiJing Huang. "Improved C4.5 Decision Tree". In 2010 International Conference on Internet Technology and Applications (iTAP 2010). IEEE, 2010. http://dx.doi.org/10.1109/itapp.2010.5566133.

10. Manapragada, Chaitanya, Geoffrey I. Webb, and Mahsa Salehi. "Extremely Fast Decision Tree". In KDD '18: The 24th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York, NY, USA: ACM, 2018. http://dx.doi.org/10.1145/3219819.3220005.


Organizational reports on the topic "Decision tree"

1. Hamilton, Jill, and Tuan Nguyen. Asbestos Inspection/Reinspection Decision Tree. Fort Belvoir, VA: Defense Technical Information Center, October 1999. http://dx.doi.org/10.21236/ada370454.

2. Narlikar, Girija J. A Parallel, Multithreaded Decision Tree Builder. Fort Belvoir, VA: Defense Technical Information Center, December 1998. http://dx.doi.org/10.21236/ada363531.

3. Quiller, Ryan. Decision Tree Technique for Particle Identification. Office of Scientific and Technical Information (OSTI), September 2003. http://dx.doi.org/10.2172/815649.

4. Mughal, Mohamed. Biological Weapons Response Template and Decision Tree. Fort Belvoir, VA: Defense Technical Information Center, April 2001. http://dx.doi.org/10.21236/ada385897.

5. Dakin, Gordon, and Sankar Virdhagriswaran. Misleading Information Detection Through Probabilistic Decision Tree Classifiers. Fort Belvoir, VA: Defense Technical Information Center, September 2002. http://dx.doi.org/10.21236/ada406823.

6. Kwon, Theresa Hyunjin, Erin Cho, and Youn-Kyung Kim. Identifying Sustainable Style Consumers with Decision Tree Predictive Model. Ames: Iowa State University, Digital Repository, November 2016. http://dx.doi.org/10.31274/itaa_proceedings-180814-1366.

7. Eccleston, C. H. The decision-identification tree: A new EIS scoping tool. Office of Scientific and Technical Information (OSTI), April 1997. http://dx.doi.org/10.2172/16876.

8. Mikulski, Dariusz G. Rough Set Based Splitting Criterion for Binary Decision Tree Classifiers. Fort Belvoir, VA: Defense Technical Information Center, September 2006. http://dx.doi.org/10.21236/ada489077.

9. Song, So Young, Erin Cho, Youn-Kyung Kim, and Theresa Hyunjin Kwon. Clothing Communication via Social Media: A Decision Tree Predictive Model. Ames: Iowa State University, Digital Repository, November 2015. http://dx.doi.org/10.31274/itaa_proceedings-180814-102.

10. Zaman, Md Mostafa, Theresa Hyunjin Kwon, Katrina Laemmerhirt, and Youn-Kyung Kim. Profiling Second-hand Clothing Shoppers with Decision Tree Predictive Model. Ames: Iowa State University, Digital Repository, 2017. http://dx.doi.org/10.31274/itaa_proceedings-180814-407.
