General Information
    • ISSN: 1793-8201 (Print), 2972-4511 (Online)
    • Abbreviated Title: Int. J. Comput. Theory Eng.
    • Frequency: Quarterly
    • DOI: 10.7763/IJCTE
    • Editor-in-Chief: Prof. Mehmet Sahinoglu
    • Associate Editor-in-Chief: Assoc. Prof. Alberto Arteta, Assoc. Prof. Engin Maşazade
    • Managing Editor: Ms. Mia Hu
    • Abstracting/Indexing: Scopus (Since 2022), INSPEC (IET), CNKI, Google Scholar, EBSCO, etc.
    • Average Days from Submission to Acceptance: 192 days
    • APC: 800 USD
    • E-mail:
    • Journal Metrics: SCImago Journal & Country Rank
Prof. Mehmet Sahinoglu
Computer Science Department, Troy University, USA
I am happy to take on the position of Editor-in-Chief of IJCTE. We encourage authors to submit papers concerning any branch of computer theory and engineering.

IJCTE 2018 Vol.10(2): 38-45 ISSN: 1793-8201
DOI: 10.7763/IJCTE.2018.V10.1196

Alleviating Catastrophic Forgetting with Modularity for Continuously Learning Linked Open Data

Lu Chen and Masayuki Murata

Abstract—Linked Open Data continues to grow year by year, and its further utilization is expected. Because of its large size, neural networks have been applied to learn Linked Open Data. Since the data is still expanding across many domains, a neural network that can continuously learn a wide range of knowledge is needed, unlike existing neural networks specialized to a single domain. However, neural networks are known to suffer from catastrophic forgetting: previously acquired skills are lost when a new skill is learned. Although prior studies have argued that enhancing modularity can overcome this problem by reducing interference between tasks, those studies assume that the number of tasks to be learned is given in advance, which does not hold for continuous learning. In this paper, we propose a design approach for neural networks that reduces modularity, expecting that this unspecialization can mitigate catastrophic forgetting in continuous learning. Our results show that, although a network with high modularity mitigates forgetting of recently learned tasks thanks to its low interference, as one would expect, a network with low modularity performs better in the worst case when evaluated over all tasks learned in the past.
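The catastrophic forgetting phenomenon described in the abstract can be reproduced at a toy scale. The sketch below is not the authors' experimental setup (which involves Linked Open Data and modular network topologies); it is a minimal illustration using a single perceptron unit trained sequentially on two hypothetical, conflicting linearly separable tasks, where accuracy on the first task is measured before and after learning the second.

```python
import random

def perceptron_step(w, x, y, lr=0.1):
    # Single linear unit: predict sign(w . x), update weights only on error.
    pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else -1
    if pred != y:
        w = [wi + lr * y * xi for wi, xi in zip(w, x)]
    return w

def accuracy(w, data):
    correct = sum(
        1 for x, y in data
        if (1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else -1) == y
    )
    return correct / len(data)

def make_task(sep_dim, sign, n=200):
    # 2-D points with a 0.1 margin along the separating dimension,
    # labelled by the sign of that dimension (optionally flipped).
    data = []
    for _ in range(n):
        x = [random.choice([-1, 1]) * random.uniform(0.1, 1.0),
             random.choice([-1, 1]) * random.uniform(0.1, 1.0)]
        y = sign * (1 if x[sep_dim] > 0 else -1)
        data.append((x + [1.0], y))  # constant 1.0 acts as a bias feature
    return data

random.seed(0)
# Task A separates on the first dimension; Task B on the second, with the
# opposite sign, so learning B pulls the shared weights away from A's solution.
task_a = make_task(sep_dim=0, sign=+1)
task_b = make_task(sep_dim=1, sign=-1)

w = [0.0, 0.0, 0.0]
for _ in range(20):
    for x, y in task_a:
        w = perceptron_step(w, x, y)
acc_a_before = accuracy(w, task_a)

for _ in range(20):
    for x, y in task_b:
        w = perceptron_step(w, x, y)
acc_a_after = accuracy(w, task_a)  # re-evaluate the OLD task

print(f"Task A accuracy before/after learning Task B: "
      f"{acc_a_before:.2f} / {acc_a_after:.2f}")
```

Because both tasks share every weight (the extreme non-modular case), updates for Task B overwrite the solution for Task A; the drop in Task A accuracy is the forgetting that the paper studies at network scale.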

Index Terms—Neural network topology design, catastrophic forgetting, linked open data, modularity.
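The modularity referred to in the index terms is the standard graph-theoretic measure: how strongly a network's edges concentrate within communities rather than between them. As a point of reference, a minimal pure-Python sketch of Newman's modularity Q (not code from the paper; the graph and partition below are invented for illustration):

```python
def modularity(edges, communities):
    """Newman modularity Q of an undirected, unweighted graph.

    edges: list of (u, v) pairs; communities: dict mapping node -> community id.
    Q = (1/2m) * sum_ij [A_ij - k_i * k_j / 2m] * delta(c_i, c_j)
    """
    m = len(edges)
    degree = {}
    for u, v in edges:
        degree[u] = degree.get(u, 0) + 1
        degree[v] = degree.get(v, 0) + 1
    q = 0.0
    nodes = list(degree)
    for i in nodes:          # sum over all ordered node pairs (i, j)
        for j in nodes:
            if communities[i] != communities[j]:
                continue
            a_ij = sum(1 for u, v in edges if {u, v} == {i, j})
            q += a_ij - degree[i] * degree[j] / (2 * m)
    return q / (2 * m)

# Two triangles joined by a single bridge edge; splitting along the bridge
# gives a strongly modular partition.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
communities = {0: 0, 1: 0, 2: 0, 3: 1, 4: 1, 5: 1}
q = modularity(edges, communities)
print(f"Q = {q:.3f}")  # → Q = 0.357
```

A highly modular topology (high Q) confines each task's activity to one community, which is why prior work found it reduces inter-task interference; the paper's proposal moves in the opposite direction, toward low Q.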

Lu Chen and Masayuki Murata are with Osaka University, Suita, Osaka, Japan.


Cite: Lu Chen and Masayuki Murata, "Alleviating Catastrophic Forgetting with Modularity for Continuously Learning Linked Open Data," International Journal of Computer Theory and Engineering, vol. 10, no. 2, pp. 38-45, 2018.

Copyright © 2008-2024. International Association of Computer Science and Information Technology. All rights reserved.