Knowledge Resource Center for Ecological Environment in Arid Area
DOI | 10.1016/j.neunet.2020.05.007
Title | Sparsity through evolutionary pruning prevents neuronal networks from overfitting
Authors | Gerum, Richard C.; Erpenbeck, Andre; Krauss, Patrick; Schilling, Achim
Corresponding Author | Schilling, A
Source Journal | NEURAL NETWORKS
ISSN | 0893-6080
EISSN | 1879-2782
Publication Year | 2020
Volume | 128
Pages | 305-312
Abstract | Modern machine learning techniques take advantage of the exponentially rising computing power of new-generation processor units. Accordingly, the number of parameters trained to solve complex tasks has grown enormously over the last decades. However, in contrast to our brain, these networks still fail to develop general intelligence in the sense of solving several complex tasks with a single network architecture. This may be because the brain is not a randomly initialized neural network that has to be trained from scratch by simply investing large amounts of computing power, but instead possesses a fixed hierarchical structure from birth. To make progress in decoding the structural basis of biological neural networks, we chose a bottom-up approach in which we evolutionarily trained small neural networks to perform a maze task. This simple maze task requires dynamic decision making with delayed rewards. We show that random severance of connections during evolutionary optimization leads to better generalization performance compared to fully connected networks. We conclude that sparsity is a central property of neural networks and should be considered in modern machine learning approaches. (C) 2020 The Author(s). Published by Elsevier Ltd. (An illustrative sketch of the evolutionary pruning scheme appears after this record.)
Keywords | Evolution; Artificial neural networks; Maze task; Evolutionary algorithm; Overfitting; Biological plausibility
Document Type | Article
Language | English
Open Access Type | Other Gold
Indexed By | SCI-E
WOS Accession Number | WOS:000567770800007
WOS Keywords | PATH-INTEGRATION; DESERT ANTS; CATAGLYPHIS; CONNECTOME; MICROGLIA; FEATURES; SYSTEM; EDGE; MAP
WOS Categories | Computer Science, Artificial Intelligence; Neurosciences
WOS Research Areas | Computer Science; Neurosciences & Neurology
Resource Type | Journal Article
Item Identifier | http://119.78.100.177/qdio/handle/2XILL650/326168
Author Affiliations | [Gerum, Richard C.] Friedrich Alexander Univ Erlangen Nurnberg FAU, Dept Phys, Biophys Grp, Nurnberg, Germany; [Erpenbeck, Andre] Tel Aviv Univ TAU, Sch Chem, Raymond & Beverley Sackler Ctr Computat Mol & Mat, Tel Aviv, Israel; [Krauss, Patrick; Schilling, Achim] Univ Hosp Erlangen, Neurosci Lab, Expt Otolaryngol, Erlangen, Germany; [Krauss, Patrick; Schilling, Achim] Friedrich Alexander Univ Erlangen Nurnberg FAU, Chair English Philol & Linguist, Cognit Computat Neurosci Grp, Erlangen, Germany; [Krauss, Patrick] Univ Groningen, Univ Med Ctr Groningen UMCG, Dept Otorhinolaryngol Head & Neck Surg, Groningen, Netherlands
Recommended Citation (GB/T 7714) | Gerum, Richard C., Erpenbeck, Andre, Krauss, Patrick, et al. Sparsity through evolutionary pruning prevents neuronal networks from overfitting[J]. NEURAL NETWORKS, 2020, 128: 305-312.
APA | Gerum, Richard C., Erpenbeck, Andre, Krauss, Patrick, & Schilling, Achim. (2020). Sparsity through evolutionary pruning prevents neuronal networks from overfitting. NEURAL NETWORKS, 128, 305-312.
MLA | Gerum, Richard C., et al. "Sparsity through evolutionary pruning prevents neuronal networks from overfitting". NEURAL NETWORKS 128 (2020): 305-312.
Files in This Item | No files associated with this item.
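
The abstract describes evolutionary training of small networks in which connections are randomly severed (pruned) during optimization and never regrow, yielding sparser networks that generalize better than fully connected ones. Below is a minimal sketch of that idea, not the authors' implementation: the network dimensions, population size, mutation and pruning rates, and the toy fitness function are all assumptions chosen for brevity, and the paper's maze task is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_individual(n_in=8, n_hidden=16, n_out=4):
    """A tiny feed-forward net: weight matrices plus binary connectivity masks."""
    return {
        "w1": rng.normal(0.0, 0.5, (n_in, n_hidden)),
        "m1": np.ones((n_in, n_hidden)),   # 1 = connection present, 0 = severed
        "w2": rng.normal(0.0, 0.5, (n_hidden, n_out)),
        "m2": np.ones((n_hidden, n_out)),
    }

def forward(ind, x):
    """Forward pass; severed connections are zeroed out by the masks."""
    h = np.tanh(x @ (ind["w1"] * ind["m1"]))
    return h @ (ind["w2"] * ind["m2"])

def mutate(ind, weight_sigma=0.1, prune_prob=0.01):
    """Gaussian weight mutation plus random severance of connections."""
    child = {k: v.copy() for k, v in ind.items()}
    for w, m in (("w1", "m1"), ("w2", "m2")):
        child[w] += rng.normal(0.0, weight_sigma, child[w].shape)
        # Evolutionary pruning: each surviving connection may be cut;
        # once severed, a connection never regrows.
        child[m] *= rng.random(child[m].shape) > prune_prob
    return child

def evolve(fitness, generations=200, pop_size=50, n_parents=10):
    """Simple truncation-selection evolutionary loop."""
    pop = [make_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)        # best individuals first
        parents = pop[:n_parents]
        children = [mutate(parents[rng.integers(n_parents)])
                    for _ in range(pop_size - n_parents)]
        pop = parents + children
    return max(pop, key=fitness)

# Toy stand-in fitness (a regression target instead of the maze task):
X = rng.normal(size=(64, 8))
Y = np.tanh(X[:, :4])                              # hypothetical targets
best = evolve(lambda ind: -np.mean((forward(ind, X) - Y) ** 2))
print("remaining connections:", int(best["m1"].sum() + best["m2"].sum()))
```

With prune_prob > 0 the surviving individuals become increasingly sparse over the generations; setting prune_prob = 0 recovers the fully connected baseline that the paper compares against.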
Unless otherwise stated, all content in this repository is protected by copyright, with all rights reserved.