Please use this identifier to cite or link to this item:
http://hdl.handle.net/10603/341458
Title: | An enhanced approach for software defects prediction using intelligent techniques |
Researcher: | Punitha. K |
Guide(s): | Latha, B |
Keywords: | Engineering and Technology; Computer Science; Computer Science Information Systems; Software defect prediction; Machine learning |
University: | Anna University |
Completed Date: | 2020 |
Abstract: | A software defect is an error, flaw, bug, mistake, failure, or fault in a computer program that may produce an inaccurate or unexpected result. Software defects increase the cost and time of completing a software product with the expected quality. Moreover, identifying and rectifying defects is one of the most time-consuming and expensive software processes. It is not practically possible to eliminate every defect, but reducing the number of defects and their adverse effect on projects is achievable. Software Defect Prediction (SDP) is the process of locating defective modules in software. To produce high-quality software, the final product should have as few defects as possible. A software metric is a measure of some property of a piece of software or its specifications; software metrics are often used to assess the ability of software to achieve a predefined goal. SDP is a learning problem that attracts interest from both academia and industry. Models built from the static code attributes and defect logs of prior software releases can predict defective modules in the next release. Such prediction locates likely defective parts of the software, which is valuable when testing budgets are limited or when the system is too large for comprehensive testing. Data mining techniques and machine learning algorithms are useful for predicting software defects: they can be applied to software repositories to extract the defects of a software product. For effective defect prediction models, the choice of data and features plays a major role. In this work, the proposed system classifies various defects using classifiers such as naïve Bayes, K Nearest Neighbor (KNN), and Radial Basis Function (RBF). KNN is an important non-parametric, supervised learning algorithm; classification rules are generated from the training samples without additional data. Naïve Bayes classification is based on Bayes' theorem, and here the optimal rules are given as input to the naïve Bayes classifier.
This type of classifier has the advantage that it is easy to implement and generates good results. The RBF network is a popular network type that is useful for pattern classification. |
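To illustrate the KNN classification idea the abstract describes, here is a minimal sketch (not the thesis's implementation): each software module is represented by a vector of static-code metrics and labelled defective or clean, and a query module is classified by majority vote among its k nearest training modules. The metric names and toy values are purely hypothetical.

```python
# Minimal KNN sketch for software defect prediction (illustrative only).
import math
from collections import Counter

def knn_predict(train, labels, query, k=3):
    """Classify `query` by majority vote among its k nearest training points."""
    # Indices of training points sorted by Euclidean distance to the query.
    nearest = sorted(range(len(train)), key=lambda i: math.dist(train[i], query))
    votes = Counter(labels[i] for i in nearest[:k])
    return votes.most_common(1)[0][0]

# Toy metric vectors (e.g. normalised LOC, cyclomatic complexity);
# label 1 = defect-prone module, 0 = clean module. Values are made up.
train = [(0.1, 0.2), (0.2, 0.1), (0.8, 0.9), (0.9, 0.7), (0.7, 0.8), (0.15, 0.3)]
labels = [0, 0, 1, 1, 1, 0]

print(knn_predict(train, labels, (0.85, 0.8)))  # high metrics -> 1 (defect-prone)
print(knn_predict(train, labels, (0.1, 0.15)))  # low metrics  -> 0 (clean)
```

The classifier is non-parametric in the sense the abstract notes: no model is fitted, and predictions come directly from the stored training samples.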
Pagination: | xxii,155 p. |
URI: | http://hdl.handle.net/10603/341458 |
Appears in Departments: | Faculty of Information and Communication Engineering |
Files in This Item:
File | Size | Format
---|---|---
01_title.pdf | 25.83 kB | Adobe PDF
02_certificates.pdf | 938.13 kB | Adobe PDF
03_vivaproceedings.pdf | 1.66 MB | Adobe PDF
04_bonafidecertificate.pdf | 1.1 MB | Adobe PDF
05_abstracts.pdf | 113.04 kB | Adobe PDF
06_acknowledgements.pdf | 17 kB | Adobe PDF
07_contents.pdf | 120.04 kB | Adobe PDF
08_listoftables.pdf | 10.03 kB | Adobe PDF
09_listoffigures.pdf | 33.51 kB | Adobe PDF
10_listofabbreviations.pdf | 159.28 kB | Adobe PDF
11_chapter1.pdf | 238.4 kB | Adobe PDF
12_chapter2.pdf | 186.43 kB | Adobe PDF
13_chapter3.pdf | 608.14 kB | Adobe PDF
14_chapter4.pdf | 440.43 kB | Adobe PDF
15_chapter5.pdf | 352.45 kB | Adobe PDF
16_conclusion.pdf | 15.6 kB | Adobe PDF
17_references.pdf | 231.14 kB | Adobe PDF
18_listofpublications.pdf | 111.3 kB | Adobe PDF
80_recommendation.pdf | 49.12 kB | Adobe PDF
Items in Shodhganga are licensed under Creative Commons Licence Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0).