DOI: 10.1016/j.neunet.2020.05.007
Sparsity through evolutionary pruning prevents neuronal networks from overfitting
Gerum, Richard C.; Erpenbeck, Andre; Krauss, Patrick; Schilling, Achim
Corresponding Author: Schilling, A.
Journal: NEURAL NETWORKS
ISSN: 0893-6080
EISSN: 1879-2782
Year: 2020
Volume: 128, Pages: 305-312
Abstract: Modern machine learning techniques take advantage of the exponentially rising computational power of new-generation processing units. Accordingly, the number of parameters trained to solve complex tasks has grown enormously over the last decades. However, in contrast to our brain, these networks still fail to develop general intelligence in the sense of solving several complex tasks with a single network architecture. One reason could be that the brain is not a randomly initialized neural network that must be trained from scratch simply by investing large amounts of computational power, but instead possesses a fixed hierarchical structure from birth. To make progress in decoding the structural basis of biological neural networks, we here chose a bottom-up approach and evolutionarily trained small neural networks to perform a maze task. This simple maze task requires dynamic decision making with delayed rewards. We show that random severance of connections during evolutionary optimization leads to better generalization performance compared with fully connected networks. We conclude that sparsity is a central property of neural networks and should be considered in modern machine learning approaches. (C) 2020 The Author(s). Published by Elsevier Ltd.
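The mechanism described in the abstract — mutating weights while randomly and permanently severing connections during evolutionary selection — can be sketched as follows. This is not the authors' code: the XOR task (standing in for their maze task), network size, and all hyperparameters are illustrative assumptions.

```python
# Minimal sketch (illustrative, not the paper's implementation): evolve a
# population of tiny networks whose mutations both perturb weights and
# randomly sever connections via a binary mask, then select on fitness.
import numpy as np

rng = np.random.default_rng(0)

# Toy task: XOR, a stand-in for the paper's maze task.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

N_HIDDEN = 8  # assumed size; the paper uses small networks

def forward(w1, w2, m1, m2, x):
    # Masks zero out severed connections.
    h = np.tanh(x @ (w1 * m1))
    return (h @ (w2 * m2)).ravel()

def fitness(ind):
    w1, w2, m1, m2 = ind
    pred = forward(w1, w2, m1, m2, X)
    return -np.mean((pred - y) ** 2)  # higher is better

def random_individual():
    w1 = rng.normal(0, 1, (2, N_HIDDEN))
    w2 = rng.normal(0, 1, (N_HIDDEN, 1))
    return [w1, w2, np.ones_like(w1), np.ones_like(w2)]

def mutate(ind, sever_prob=0.02, sigma=0.3):
    w1, w2, m1, m2 = [a.copy() for a in ind]
    w1 += rng.normal(0, sigma, w1.shape)
    w2 += rng.normal(0, sigma, w2.shape)
    # Random severance: each surviving connection may be pruned for good.
    m1 *= rng.random(m1.shape) >= sever_prob
    m2 *= rng.random(m2.shape) >= sever_prob
    return [w1, w2, m1, m2]

def evolve(generations=300, pop_size=30, elite=5):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:elite]  # elitism keeps the best networks intact
        pop = parents + [mutate(parents[rng.integers(elite)])
                         for _ in range(pop_size - elite)]
    return max(pop, key=fitness)

best = evolve()
best_fit = fitness(best)
sparsity = 1 - (best[2].sum() + best[3].sum()) / (best[2].size + best[3].size)
```

Because severance is irreversible within a lineage, selection pressure alone decides how much sparsity the surviving networks retain, which is the property the paper links to reduced overfitting.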
Keywords: Evolution; Artificial neural networks; Maze task; Evolutionary algorithm; Overfitting; Biological plausibility
Document Type: Article
Language: English
Open Access Type: Other Gold
Indexed In: SCI-E
WOS Accession Number: WOS:000567770800007
WOS Keywords: PATH-INTEGRATION; DESERT ANTS; CATAGLYPHIS; CONNECTOME; MICROGLIA; FEATURES; SYSTEM; EDGE; MAP
WOS Categories: Computer Science, Artificial Intelligence; Neurosciences
WOS Research Areas: Computer Science; Neurosciences & Neurology
Resource Type: Journal article
Identifier: http://119.78.100.177/qdio/handle/2XILL650/326168
Affiliations: [Gerum, Richard C.] Friedrich Alexander Univ Erlangen Nurnberg FAU, Dept Phys, Biophys Grp, Nurnberg, Germany; [Erpenbeck, Andre] Tel Aviv Univ TAU, Sch Chem, Raymond & Beverley Sackler Ctr Computat Mol & Mat, Tel Aviv, Israel; [Krauss, Patrick; Schilling, Achim] Univ Hosp Erlangen, Neurosci Lab, Expt Otolaryngol, Erlangen, Germany; [Krauss, Patrick; Schilling, Achim] Friedrich Alexander Univ Erlangen Nurnberg FAU, Chair English Philol & Linguist, Cognit Computat Neurosci Grp, Erlangen, Germany; [Krauss, Patrick] Univ Groningen, Univ Med Ctr Groningen UMCG, Dept Otorhinolaryngol Head & Neck Surg, Groningen, Netherlands
Recommended Citation
GB/T 7714: Gerum, Richard C., Erpenbeck, Andre, Krauss, Patrick, et al. Sparsity through evolutionary pruning prevents neuronal networks from overfitting[J]. Neural Networks, 2020, 128: 305-312.
APA: Gerum, Richard C., Erpenbeck, Andre, Krauss, Patrick, & Schilling, Achim. (2020). Sparsity through evolutionary pruning prevents neuronal networks from overfitting. Neural Networks, 128, 305-312.
MLA: Gerum, Richard C., et al. "Sparsity through evolutionary pruning prevents neuronal networks from overfitting." Neural Networks 128 (2020): 305-312.
Files in This Item:
No files associated with this item.

Unless otherwise stated, all content in this system is protected by copyright, and all rights are reserved.