Please use this identifier to cite or link to this item: http://hdl.handle.net/10603/340925
Full metadata record
DC Field: Value
dc.coverage.spatial: Tree Classifiers
dc.date.accessioned: 2021-09-17T09:02:31Z
dc.date.available: 2021-09-17T09:02:31Z
dc.identifier.uri: http://hdl.handle.net/10603/340925
dc.description.abstract: The digital revolution in an increasingly computerized world generates vast amounts of data. The challenge of analyzing this digital data automatically has driven the development of data mining models. Depending on user requirements, a variety of data mining techniques are used, such as classification, clustering, regression, summarization, association, and anomaly detection. Within data mining, classification plays a vital role in both predicting outputs and discovering patterns in data. Among the various classification techniques, the decision tree stands out as a simple, expressive, robust, and efficient classifier.
A decision tree is a flowchart-like structure that generates human-interpretable knowledge with high prediction accuracy. An ensemble of decision trees, called a decision tree forest, is more accurate and more robust to noise than a single decision tree. Various decision tree and decision tree forest induction algorithms have been proposed in the literature for accurate classification of data. However, some aspects of decision trees and decision tree forests leave room for further improvement. As data grows, a decision tree becomes large and complex. To tackle this limitation, a clustering-based preprocessing technique called Decision Tree based on Cluster Analysis Pre-processing is applied to reduce datasets. This approach finds informative instances using supervised and unsupervised clustering, which optimizes decision trees and yields higher prediction accuracy. This thesis proposes two novel methods for selecting representative instances from large datasets. These efficient algorithms can select representative instances from small, medium, and large datasets. Along with increasing classification accuracy, they reduce the size, number of leaves, and training time of decision trees.
The decision tree forest is based on the principle that combining several weak learners is better than creating a single strong learner. Some aspects of decision tree forests
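The cluster-based dataset reduction described in the abstract can be illustrated with a generic sketch. This is not the thesis's Decision Tree based on Cluster Analysis Pre-processing algorithm, whose details are not given here; it only shows the general idea of clustering a training set and keeping one representative instance per cluster, so a decision tree can be trained on fewer points. The function names, the plain k-means routine, and the nearest-to-centroid selection rule are all illustrative assumptions.

```python
# Illustrative sketch only (NOT the thesis's DTCAP algorithm): shrink a
# training set by clustering it and keeping the single instance nearest
# each cluster centroid, so a decision tree can be trained on fewer points.
import math
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means over a list of numeric tuples; returns (centroids, clusters)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)      # initial centroids: k random points
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:                   # assign each point to nearest centroid
            i = min(range(k), key=lambda c: math.dist(p, centroids[c]))
            clusters[i].append(p)
        centroids = [                      # recompute centroids (keep old if empty)
            tuple(sum(xs) / len(cl) for xs in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

def representatives(points, k):
    """Reduce the dataset: keep one instance per cluster, nearest its centroid."""
    centroids, clusters = kmeans(points, k)
    return [
        min(cl, key=lambda p: math.dist(p, c))
        for c, cl in zip(centroids, clusters)
        if cl
    ]

# Two well-separated blobs; the reduced set keeps one real point from each,
# and the reduced set could then be fed to any decision tree learner.
data = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1),
        (5.0, 5.0), (5.1, 4.9), (4.9, 5.1)]
reduced = representatives(data, k=2)
print(reduced)  # a subset of `data`, at most 2 points
```

Because the representatives are actual instances from the original data (not synthetic centroids), class labels carry over unchanged, which is what makes this kind of reduction usable as a preprocessing step before decision tree induction.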
dc.format.extent: 139p
dc.language: English
dc.relation: 219b
dc.rights: university
dc.title: Some Aspects of Decision Tree Classifiers
dc.title.alternative:
dc.creator.researcher: Panhalkar Archana Ramkisanrao
dc.subject.keyword: Computer Science
dc.subject.keyword: Computer Science Information Systems
dc.subject.keyword: Engineering and Technology
dc.description.note:
dc.contributor.guide: Doye Dharmpal D.
dc.publisher.place: Nanded
dc.publisher.university: Swami Ramanand Teerth Marathwada University
dc.publisher.institution: Department of Computer Science and Engineering
dc.date.registered: 2014
dc.date.completed: 2021
dc.date.awarded: 2021
dc.format.dimensions:
dc.format.accompanyingmaterial: None
dc.source.university: University
dc.type.degree: Ph.D.
Appears in Departments:Department of Computer Science and Engineering

Files in This Item:
File                        Size       Format
01_title.pdf                485.56 kB  Adobe PDF
02_certificate.pdf          244.53 kB  Adobe PDF
03_abstract.pdf             182.4 kB   Adobe PDF
04_declaration.pdf          70.74 kB   Adobe PDF
05_acknowledgement.pdf      179.16 kB  Adobe PDF
06_contents.pdf             46.98 kB   Adobe PDF
07_list_of_tables.pdf       186.66 kB  Adobe PDF
08_list_of_figures.pdf      187 kB     Adobe PDF
09_abbreviations.pdf        263.21 kB  Adobe PDF
10_chapter 1.pdf            143.31 kB  Adobe PDF
11_chapter 2.pdf            637.26 kB  Adobe PDF
12_chapter 3.pdf            939.31 kB  Adobe PDF
13_chapter 4.pdf            1.15 MB    Adobe PDF
14_chapter 5.pdf            951.61 kB  Adobe PDF
15_chapter 6.pdf            740.77 kB  Adobe PDF
16_conclusions.pdf          29.93 kB   Adobe PDF
17_bibliography.pdf         305.51 kB  Adobe PDF
80_recommendation.pdf       511.63 kB  Adobe PDF


Items in Shodhganga are licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International licence (CC BY-NC-SA 4.0).
