A selection of scholarly literature on the topic "DECISION TRESS"
Format your source in APA, MLA, Chicago, Harvard, and other citation styles
Consult lists of current articles, books, dissertations, conference papers, and other scholarly sources on the topic "DECISION TRESS."
Next to every work in the list of references there is an "Add to bibliography" button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the scholarly publication as a .pdf file and read its online abstract, when these are available in the metadata.
Journal articles on the topic "DECISION TRESS"
Koyuncugil, Ali Serhan, and Nermin Ozgulbas. "Detecting Road Maps for Capacity Utilization Decisions by Clustering Analysis and CHAID Decision Tress." Journal of Medical Systems 34, no. 4 (February 10, 2009): 459–69. http://dx.doi.org/10.1007/s10916-009-9258-9.
Arkin, Esther M., Henk Meijer, Joseph S. B. Mitchell, David Rappaport, and Steven S. Skiena. "Decision Trees for Geometric Models." International Journal of Computational Geometry & Applications 08, no. 03 (June 1998): 343–63. http://dx.doi.org/10.1142/s0218195998000175.
Reddy, M. R. S. Surya Narayana, T. Narayana Reddy, and C. Viswanatha Reddy. "Decision Tress Analysis on Employee Job Satisfaction and HRD Climate: Role of Demographics." International Journal of Management Studies VI, no. 2(1) (April 30, 2019): 22. http://dx.doi.org/10.18843/ijms/v6i2(1)/03.
Yu, Tianyu, Xuandong Mo, Mingjun Chen, and Changfeng Yao. "Machine-learning-assisted microstructure–property linkages of carbon nanotube-reinforced aluminum matrix nanocomposites produced by laser powder bed fusion." Nanotechnology Reviews 10, no. 1 (January 1, 2021): 1410–24. http://dx.doi.org/10.1515/ntrev-2021-0093.
Wawrzyk, Martyna. "Semi-supervised learning with the clustering and Decision Trees classifier for the task of cognitive workload study." Journal of Computer Sciences Institute 15 (June 30, 2020): 214–18. http://dx.doi.org/10.35784/jcsi.1725.
Zaborski, Daniel, Witold Stanisław Proskura, Katarzyna Wojdak-Maksymiec, and Wilhelm Grzesiak. "Identification of Cows Susceptible to Mastitis based on Selected Genotypes by Using Decision Trees and A Generalized Linear Model." Acta Veterinaria 66, no. 3 (September 1, 2016): 317–35. http://dx.doi.org/10.1515/acve-2016-0028.
Chauhan, Rahul. "Prediction of Employee Turnover based on Machine Learning Models." Mathematical Statistician and Engineering Applications 70, no. 2 (February 26, 2021): 1767–75. http://dx.doi.org/10.17762/msea.v70i2.2469.
Niapele, Sabaria, and Tamrin Salim. "Vegetation Analysis of the Tagafura Protected Forest in the City of Tidore Islands." Agrikan: Jurnal Agribisnis Perikanan 13, no. 2 (December 3, 2020): 426. http://dx.doi.org/10.29239/j.agrikan.13.2.426-434.
Azad, Mohammad, Igor Chikalov, and Mikhail Moshkov. "Representation of Knowledge by Decision Trees for Decision Tables with Multiple Decisions." Procedia Computer Science 176 (2020): 653–59. http://dx.doi.org/10.1016/j.procs.2020.09.037.
Mărginean, Nicolae, Janetta Sîrbu, and Dan Racoviţan. "Decision Trees – A Perspective Of Electronic Decisional Support." Annales Universitatis Apulensis Series Oeconomica 2, no. 12 (December 31, 2010): 631–37. http://dx.doi.org/10.29302/oeconomica.2010.12.2.15.
Повний текст джерелаДисертації з теми "DECISION TRESS"
Kustra, Rafal. "Soft decision trees." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1997. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp04/mq28745.pdf.
Máša, Petr. "Finding Optimal Decision Trees." Doctoral thesis, Vysoká škola ekonomická v Praze, 2006. http://www.nusl.cz/ntk/nusl-456.
Minguillón, Alfonso Julià. "On cascading small decision trees." Doctoral thesis, Universitat Autònoma de Barcelona, 2002. http://hdl.handle.net/10803/3027.
Повний текст джерелаEl nostre primer objectiu va ser desenvolupar un sistema capaç de reconèixer diferents tipus d'elements presents en un document com ara el fons, text, línies horitzontals i verticals, dibuixos esquemàtics i imatges. Aleshores, cada element pot ser tractat d'acord a les seves característiques. Per exemple, el fons s'elimina i no és processat, mentre que les altres regions serien comprimides usant l'algorisme apropiat, JPEG amb pèrdua per a les imatges i un mètode sense pèrdua per a la resta, per exemple. Els primers experiments usant arbres de decisió varen mostrar que els arbres de decisió construïts eren massa grans i que patien de sobre-entrenament. Aleshores, vàrem tractar d'aprofitar la redundància espacial present en les imatges, utilitzant una aproximació de resolució múltiple: si un bloc gran no pot ser correctament classificat, trencar-lo en quatre sub-blocs i repetir el procés recursivament per a cada sub-bloc, usant tot el coneixement que s'hagi calculat amb anterioritat. Els blocs que no poden ser processats per una mida de bloc donada s'etiqueten com a "mixed", pel que la paraula progressiu pren sentit: una primera versió de poca resolució de la imatge classificada és obtinguda amb el primer classificador, i és refinada pel segon, el tercer, etc., fins que una versió final és obtinguda amb l'últim classificador del muntatge. De fet, l'ús de l'esquema progressiu porta a l'ús d'arbres de decisió més petits, ja que ja no cal un classificador complex. En lloc de construir un classificador gran i complex per a classificar tot el conjunt d'entrenament, només provem de resoldre la part més fàcil del problema de classificació, retardant la resta per a un segon classificador, etc.
La idea bàsica d'aquesta tesi és, doncs, un compromís entre el cost i la precisió sota una restricció de confiança. Una primera classificació es efectuada a baix cost; si un element és classificat amb una confiança elevada, s'accepta, i si no ho és, es rebutja i s'efectua una segona classificació, etc. És bàsicament, una variació del paradigma de "cascading", on un primer classificador s'usa per a calcular informació addicional per a cada element d'entrada, que serà usada per a millorar la precisió de classificació d'un segon classificador, etc. El que presentem en aquesta tesi és, bàsicament, una extensió del paradigma de "cascading" i una avaluació empírica exhaustiva dels paràmetres involucrats en la creació d'arbres de decisió progressius. Alguns aspectes teòrics relacionats als arbres de decisió progressius com la complexitat del sistema, per exemple, també són tractats.
This thesis is about using small decision trees for classification and data mining. The intuitive idea behind this thesis is that a sequence of small decision trees may perform better than a large decision tree, reducing both training and exploitation costs.
Our first goal was to develop a system capable of recognizing several kinds of elements present in a document, such as background, text, horizontal and vertical lines, line drawings, and images. Each element would then be treated according to its characteristics. For example, background regions would be removed and not processed at all, while the other regions would be compressed with an appropriate algorithm, for instance the lossy JPEG standard operation mode for images and a lossless method for the rest. Our first experiments showed that the decision trees we built were too large and suffered from overfitting. We then tried to take advantage of the spatial redundancy present in images, using a multi-resolution approach: if a large block cannot be correctly classified, split it into four subblocks and repeat the process recursively for each subblock, using all knowledge previously computed about that block. Blocks that could not be processed at a given block size were labeled as mixed, which is where the word progressive comes from: a first low-resolution version of the classified image is obtained with the first classifier, and it is refined by the second one, the third one, and so on, until a final version is obtained with the last classifier in the ensemble. Furthermore, the progressive scheme leads to smaller decision trees, as a single complex classifier is no longer needed. Instead of building a large and complex classifier for the whole input training set, we only try to solve the easiest part of the classification problem, delaying the rest to a second classifier, and so on.
The basic idea in this thesis is, therefore, a trade-off between cost and accuracy under a confidence constraint. A first classification is performed at low cost; if an element is classified with high confidence, the result is accepted; if not, it is rejected and a second classification is performed, and so on. This is, basically, a variation of the cascading paradigm, where a first classifier computes additional information from each input sample, information that a second classifier then uses to improve classification accuracy, and so on. What we present in this thesis is essentially an extension of the cascading paradigm, together with an exhaustive empirical evaluation of the parameters involved in the creation of progressive decision trees. Some basic theoretical issues related to progressive decision trees, such as system complexity, are also addressed.
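The confidence-constrained cascade described in this abstract can be sketched in a few lines (a minimal illustration of the idea, not the thesis's exact algorithm; it assumes scikit-learn, and the class name `ConfidenceCascade`, the tree depth, stage count, and threshold are our hypothetical choices):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier


class ConfidenceCascade:
    """A chain of small trees: each stage accepts only predictions made with
    high confidence and defers the remaining 'hard' samples to the next stage."""

    def __init__(self, n_stages=3, max_depth=2, threshold=0.9):
        self.n_stages = n_stages
        self.max_depth = max_depth
        self.threshold = threshold
        self.stages = []

    def fit(self, X, y):
        self.stages = []
        idx = np.arange(len(X))  # samples still unresolved
        for s in range(self.n_stages):
            tree = DecisionTreeClassifier(max_depth=self.max_depth, random_state=s)
            tree.fit(X[idx], y[idx])
            self.stages.append(tree)
            # pass only the low-confidence samples on to the next stage
            conf = tree.predict_proba(X[idx]).max(axis=1)
            idx = idx[conf < self.threshold]
            if len(idx) == 0:
                break
        return self

    def predict(self, X):
        pred = np.empty(len(X), dtype=int)
        pending = np.arange(len(X))
        for i, tree in enumerate(self.stages):
            proba = tree.predict_proba(X[pending])
            labels = tree.classes_[proba.argmax(axis=1)]
            if i == len(self.stages) - 1:
                accept = np.ones(len(pending), dtype=bool)  # last stage decides everything
            else:
                accept = proba.max(axis=1) >= self.threshold
            pred[pending[accept]] = labels[accept]
            pending = pending[~accept]
            if len(pending) == 0:
                break
        return pred


X, y = make_classification(n_samples=600, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
cascade = ConfidenceCascade().fit(X_tr, y_tr)
accuracy = (cascade.predict(X_te) == y_te).mean()
```

Each stage is a depth-2 tree, so training and prediction stay cheap; only the samples that every earlier stage was unsure about reach the later, more specialized trees.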
Pisetta, Vincent. "New Insights into Decision Trees Ensembles." Thesis, Lyon 2, 2012. http://www.theses.fr/2012LYO20018/document.
Decision tree ensembles are among the most popular tools in machine learning. Nevertheless, their theoretical properties and empirical performance are still under active investigation. In this thesis, we propose to shed light on these methods. More precisely, after describing the current theoretical aspects of the three main ensemble schemes (chapter 1), we give an analysis supporting the existence of reasons common to the success of all three (chapter 2). This analysis identifies the first two moments of the margin as an essential ingredient for strong learning ability. Building on this insight, we propose a new ensemble algorithm called OSS (Oriented Sub-Sampling) whose steps follow directly from the point of view we introduce. The empirical performance of OSS is superior to that of currently popular algorithms such as Random Forests and AdaBoost. In the third chapter (chapter 3), we analyze Random Forests from a "kernel" point of view, which allows us to understand and observe the underlying regularization mechanism of these methods. The kernel point of view also enables us to improve the predictive performance of Random Forests using popular post-processing techniques such as SVM and multiple kernel learning. Combined with Random Forests, these show greatly improved performance and realize a pruning of the ensemble, retaining only a small fraction of the initial base learners.
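The kernel view of Random Forests mentioned in this abstract can be illustrated with the classical leaf-proximity kernel, in which two samples are similar when the trees route them to the same leaves (a sketch under our own assumptions, using scikit-learn; `forest_proximity_kernel` is our name, and this is one simple instance of the kernel perspective and of SVM post-processing, not the exact construction studied in the thesis):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC


def forest_proximity_kernel(forest, A, B):
    """K[i, j] = fraction of trees in which A[i] and B[j] land in the same leaf."""
    leaves_a = forest.apply(A)  # leaf indices, shape (n_a, n_trees)
    leaves_b = forest.apply(B)
    return (leaves_a[:, None, :] == leaves_b[None, :, :]).mean(axis=2)


X, y = make_classification(n_samples=400, n_features=10, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

forest = RandomForestClassifier(n_estimators=50, random_state=1).fit(X_tr, y_tr)
K_train = forest_proximity_kernel(forest, X_tr, X_tr)  # Gram matrix over training set
K_test = forest_proximity_kernel(forest, X_te, X_tr)   # rows: test, cols: train

svm = SVC(kernel="precomputed").fit(K_train, y_tr)     # post-process the forest with an SVM
accuracy = svm.score(K_test, y_te)
```

Because the Gram matrix is built purely from leaf co-occurrence, the SVM reweights the forest's partition of the input space rather than the raw features.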
Wickramarachchi, Darshana Chitraka. "Oblique decision trees in transformed spaces." Thesis, University of Canterbury. Mathematics and Statistics, 2015. http://hdl.handle.net/10092/11051.
Han, Qian. "Mining Shared Decision Trees between Datasets." Wright State University / OhioLINK, 2010. http://rave.ohiolink.edu/etdc/view?acc_num=wright1274807201.
Parkhe, Vidyamani. "Randomized decision trees for data mining." [Florida] : State University System of Florida, 2000. http://etd.fcla.edu/etd/uf/2000/ane5962/thesis.pdf.
Повний текст джерелаTitle from first page of PDF file. Document formatted into pages; contains vi, 54 p.; also contains graphics. Vita. Includes bibliographical references (p. 52-53).
Boujari, Tahereh. "Instance-based ontology alignment using decision trees." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-84918.
Lee, Hong, and 李匡. "Model-based decision trees for ranking data." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2010. http://hub.hku.hk/bib/B45149707.
Beck, Jason. "Implementation and Experimentation with C4.5 Decision Trees." Honors in the Major Thesis, University of Central Florida, 2007. http://digital.library.ucf.edu/cdm/ref/collection/ETH/id/1157.
Bachelor's thesis; Engineering and Computer Science; Computer Engineering.
Books on the topic "DECISION TRESS"
Alsolami, Fawaz, Mohammad Azad, Igor Chikalov, and Mikhail Moshkov. Decision and Inhibitory Trees and Rules for Decision Tables with Many-valued Decisions. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-12854-8.
Kustra, Rafal. Soft decision trees. Ottawa: National Library of Canada = Bibliothèque nationale du Canada, 1999.
McNellis, Ryan Thomas. Training Decision Trees for Optimal Decision-Making. [New York, N.Y.?]: [publisher not identified], 2020.
Azad, Mohammad, Igor Chikalov, Shahid Hussain, Mikhail Moshkov, and Beata Zielosko. Decision Trees with Hypotheses. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-08585-7.
Financial Services Authority. Stakeholder pensions decision trees. London: Financial Services Authority, 2000.
Financial Services Authority. Stakeholder Pensions and Decision Trees. London: Financial Services Authority, 2000.
Financial Services Authority, ed. Stakeholder pensions and decision trees. London: Financial Services Authority, 2000.
Financial Services Authority, ed. Stakeholder pensions and decision trees. London: Financial Services Authority, 2002.
Financial Services Authority, ed. Stakeholder pensions and decision trees. London: Financial Services Authority, 2000.
SpringerLink (Online service), ed. Average Time Complexity of Decision Trees. Berlin, Heidelberg: Springer-Verlag GmbH Berlin Heidelberg, 2011.
Знайти повний текст джерелаЧастини книг з теми "DECISION TRESS"
Grąbczewski, Krzysztof. "Validated Decision Trees versus Collective Decisions." In Computational Collective Intelligence. Technologies and Applications, 342–51. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-23938-0_35.
Murty, M. Narasimha, and V. Susheela Devi. "Decision Trees." In Undergraduate Topics in Computer Science, 123–46. London: Springer London, 2011. http://dx.doi.org/10.1007/978-0-85729-495-1_6.
Suzuki, Joe. "Decision Trees." In Statistical Learning with Math and R, 147–70. Singapore: Springer Singapore, 2020. http://dx.doi.org/10.1007/978-981-15-7568-6_8.
Zhou, Hong. "Decision Trees." In Learn Data Mining Through Excel, 125–48. Berkeley, CA: Apress, 2020. http://dx.doi.org/10.1007/978-1-4842-5982-5_9.
Pérez Castaño, Arnaldo. "Decision Trees." In Practical Artificial Intelligence, 367–410. Berkeley, CA: Apress, 2018. http://dx.doi.org/10.1007/978-1-4842-3357-3_10.
Jukna, Stasys. "Decision Trees." In Algorithms and Combinatorics, 405–37. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-24508-4_14.
Dobra, Alin. "Decision Trees." In Encyclopedia of Database Systems, 1–2. New York, NY: Springer New York, 2016. http://dx.doi.org/10.1007/978-1-4899-7993-3_553-2.
Grosan, Crina, and Ajith Abraham. "Decision Trees." In Intelligent Systems Reference Library, 269–80. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-21004-4_11.
Kubat, Miroslav. "Decision Trees." In An Introduction to Machine Learning, 113–35. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-20010-1_6.
Dobra, Alin. "Decision Trees." In Encyclopedia of Database Systems, 769. Boston, MA: Springer US, 2009. http://dx.doi.org/10.1007/978-0-387-39940-9_553.
Повний текст джерелаТези доповідей конференцій з теми "DECISION TRESS"
Waksman, Peter. "Isolating causes of yield excursions with decision tress and commonality." In Design, Process Integration, and Characterization for Microelectronics, edited by Alexander Starikov and Kenneth W. Tobin, Jr. SPIE, 2002. http://dx.doi.org/10.1117/12.475647.
Sokolov, Andrey Pavlovich. "On expressive abilities of ensembles of decision trees." In Academician O.B. Lupanov 14th International Scientific Seminar "Discrete Mathematics and Its Applications". Keldysh Institute of Applied Mathematics, 2022. http://dx.doi.org/10.20948/dms-2022-72.
Fleischer, Rudolf. "Decision trees." In the twenty-fifth annual ACM symposium. New York, New York, USA: ACM Press, 1993. http://dx.doi.org/10.1145/167088.167216.
Björner, Anders, László Lovász, and Andrew C. C. Yao. "Linear decision trees." In the twenty-fourth annual ACM symposium. New York, New York, USA: ACM Press, 1992. http://dx.doi.org/10.1145/129712.129730.
Ignatov, Dmitry, and Andrey Ignatov. "Decision Stream: Cultivating Deep Decision Trees." In 2017 IEEE 29th International Conference on Tools with Artificial Intelligence (ICTAI). IEEE, 2017. http://dx.doi.org/10.1109/ictai.2017.00140.
Vos, Daniël, and Sicco Verwer. "Optimal Decision Tree Policies for Markov Decision Processes." In Thirty-Second International Joint Conference on Artificial Intelligence {IJCAI-23}. California: International Joint Conferences on Artificial Intelligence Organization, 2023. http://dx.doi.org/10.24963/ijcai.2023/606.
Jimenez-Roa, Lisandro A., Tom Heskes, and Marielle Stoelinga. "Fault Trees, Decision Trees, And Binary Decision Diagrams: A Systematic Comparison." In Proceedings of the 31st European Safety and Reliability Conference. Singapore: Research Publishing Services, 2021. http://dx.doi.org/10.3850/978-981-18-2016-8_241-cd.
Abu-halaweh, Na'el M., and Robert W. Harrison. "Practical fuzzy decision trees." In 2009 IEEE Symposium on Computational Intelligence and Data Mining (CIDM). IEEE, 2009. http://dx.doi.org/10.1109/cidm.2009.4938651.
Struharik, R., V. Vranjkovic, S. Dautovic, and L. Novak. "Inducing oblique decision trees." In 2014 IEEE 12th International Symposium on Intelligent Systems and Informatics (SISY 2014). IEEE, 2014. http://dx.doi.org/10.1109/sisy.2014.6923596.
Gopalan, Parikshit, Adam Tauman Kalai, and Adam R. Klivans. "Agnostically learning decision trees." In the 40th annual ACM symposium. New York, New York, USA: ACM Press, 2008. http://dx.doi.org/10.1145/1374376.1374451.
Повний текст джерелаЗвіти організацій з теми "DECISION TRESS"
Wei, Yin-Loh. Decision Trees for Prediction and Data Mining. Fort Belvoir, VA: Defense Technical Information Center, February 2005. http://dx.doi.org/10.21236/ada430178.
Kegelmeyer, W. P. Jr, B. Groshong, M. Allmen, and K. Woods. Decision trees and integrated features for computer aided mammographic screening. Office of Scientific and Technical Information (OSTI), February 1997. http://dx.doi.org/10.2172/501540.
Kim, Joungbum, Sarah E. Schwarm, and Mari Ostendorf. Detecting Structural Metadata with Decision Trees and Transformation-Based Learning. Fort Belvoir, VA: Defense Technical Information Center, January 2004. http://dx.doi.org/10.21236/ada457891.
Aha, David W., and Leonard A. Breslow. Comparing Simplification Procedures for Decision Trees on an Economics Classification. Fort Belvoir, VA: Defense Technical Information Center, May 1998. http://dx.doi.org/10.21236/ada343512.
Linenger, Jerry M., William B. Long, and William J. Sacco. Combat Surgery: Medical Decision Trees for Treatment of Naval Combat Casualties. Fort Belvoir, VA: Defense Technical Information Center, February 1991. http://dx.doi.org/10.21236/ada374992.
Barber, James. Using Boosted Decision Trees to Separate Signal and Background in B to XsGamma Decays. Office of Scientific and Technical Information (OSTI), September 2006. http://dx.doi.org/10.2172/892609.
Edmunds, Thomas A., Jeffrey S. Garrett, and Craig R. Wuest. Decision Trees for Analysis of Strategies to Deter Limited Nuclear Use in a Generic Scenario. Office of Scientific and Technical Information (OSTI), October 2018. http://dx.doi.org/10.2172/1477147.
Zio, Enrico, and Nicola Pedroni. Uncertainty characterization in risk analysis for decision-making practice. Fondation pour une culture de sécurité industrielle, May 2012. http://dx.doi.org/10.57071/155chr.
Liu, Zhiyi. Measurement of single top quark production in the tau+jets channnel using boosted decision trees at D0. Office of Scientific and Technical Information (OSTI), December 2009. http://dx.doi.org/10.2172/970067.
Harter, Rachel M., Pinliang (Patrick) Chen, Joseph P. McMichael, Edgardo S. Cureg, Samson A. Adeshiyan, and Katherine B. Morton. Constructing Strata of Primary Sampling Units for the Residential Energy Consumption Survey. RTI Press, May 2017. http://dx.doi.org/10.3768/rtipress.2017.op.0041.1705.