Please use this identifier to cite or link to this item:
http://hdl.handle.net/10603/527575
Title: | Deep Learning and Multi Attention based Transformer Language Models for Generative Knowledge Inference |
Researcher: | Shobhan Kumar |
Guide(s): | C, Pavan Kumar and Chauhan, Arun |
Keywords: | Computer Science; Computer Science Artificial Intelligence; Engineering and Technology |
University: | Indian Institute of Information Technology Dharwad |
Completed Date: | 2023 |
Abstract: | Textbooks play a vital role in any educational system, as they serve as the primary source of information. Often, the prescribed textbooks are not adequate to satisfy students' curiosity; hence, students use community Question-Answering (cQA) systems alongside textbooks to gain adequate knowledge of a concept. A large number of Question-Answer (QA) pairs are available in cQA forums. Owing to this high volume, the quality of questions and answers varies widely, making it hard for students to go through all possible QAs for a better understanding of the concepts. To address this issue, this work presents a technological solution: "Text Enrichment with cQA-QAs". The proposed model augments eBooks at the sentence level by recommending relevant QA pairs from cQA forums. In the next phase, the enrichment model selects the key topics of the eBook using topic modeling; these topics suffice to convey a concise summary of the eBook's contents. For each selected topic, it then recommends relevant QA pairs from the Quora cQA forum. The probability of redundancy among questions is high due to the increasing influx of users on different cQA forums; because of this redundancy, responses are scattered across variations of the same question, which results in unsatisfactory search results for a specific question. To address the redundancy issue, this work proposes a transformer-based Siamese network architecture to generate semantically meaningful sentence embeddings; Siamese Bidirectional Encoder Representations from Transformers (BERT) is then applied to assess the similarity between questions. However, reading only the recommended e-content (cQA QA pairs) does not by itself make students' learning effective; posing appropriate questions during the reading process can also aid the learner's comprehension. |
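The abstract's redundancy-detection step scores question pairs by the similarity of their sentence embeddings from a Siamese (shared-weight) encoder. A minimal sketch of that similarity step, in which a toy bag-of-words vector stands in for the thesis's Siamese-BERT encoder (the function names and the 0.8 threshold are illustrative assumptions, not taken from the thesis):

```python
from collections import Counter
import math

def embed(question: str) -> Counter:
    # Toy stand-in for a Siamese-BERT sentence encoder: a bag-of-words
    # count vector. In a Siamese setup, the SAME encoder (shared weights)
    # maps both questions into one embedding space.
    return Counter(question.lower().split())

def cosine(u: Counter, v: Counter) -> float:
    # Cosine similarity between two sparse count vectors.
    dot = sum(u[w] * v[w] for w in u)
    norm_u = math.sqrt(sum(c * c for c in u.values()))
    norm_v = math.sqrt(sum(c * c for c in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

def is_duplicate(q1: str, q2: str, threshold: float = 0.8) -> bool:
    # Hypothetical decision rule: flag a pair as redundant when its
    # embedding similarity clears a threshold tuned on labelled pairs.
    return cosine(embed(q1), embed(q2)) >= threshold
```

With a real Siamese-BERT encoder, `embed` would return a dense transformer embedding, but the duplicate-detection logic — encode both questions with shared weights, compare with cosine similarity, threshold — stays the same.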
Pagination: | xxi, 305 p. |
URI: | http://hdl.handle.net/10603/527575 |
Appears in Departments: | Computer Science and Engineering |
Files in This Item:
File | Description | Size | Format
---|---|---|---
01_title.pdf | Attached File | 29.9 kB | Adobe PDF
02_prelim pages.pdf | | 335.09 kB | Adobe PDF
03_content.pdf | | 27.1 kB | Adobe PDF
04_abstract.pdf | | 26.44 kB | Adobe PDF
05_chapter 1.pdf | | 54.61 kB | Adobe PDF
06_chapter 2.pdf | | 357.17 kB | Adobe PDF
07_chapter 3.pdf | | 592.14 kB | Adobe PDF
08_chapter 4.pdf | | 1.35 MB | Adobe PDF
09_chapter 5.pdf | | 237.55 kB | Adobe PDF
10_annexures.pdf | | 3.19 MB | Adobe PDF
80_recommendation.pdf | | 45.23 kB | Adobe PDF
Items in Shodhganga are licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International Licence (CC BY-NC-SA 4.0).