
The Architecture of Neural Networks

April 24, 2018
Artificial Neural Networks


On the architecture of neural networks: this post mainly covers some of the terminology and concepts used in neural networks. First, a figure:

[Figure: the architecture of a neural network, an MLP diagram]

A few concepts from this figure are worth introducing:

(1) input layer;

(2) output layer, which can contain multiple neurons; the figure above shows only one;

(3) hidden layer: any layer that is neither the input nor the output is a hidden layer;

(4) MLP, multilayer perceptron: a multi-layer neural network; a network made up of sigmoid neurons can also be called an MLP.

When designing a neural network's architecture, the input and output layers are fairly straightforward; designing the hidden layers in between is something of an art.

(5) A network like the one in the figure above is called a feedforward neural network: it contains no loops. (Neural networks with loops also exist; they are called RNNs, Recurrent Neural Networks. The idea in these RNN models is to have neurons which fire for some limited duration of time, before becoming quiescent. That firing can stimulate other neurons, which may fire a little while later, also for a limited duration. That causes still more neurons to fire, and so over time we get a cascade of neurons firing. Loops don't cause problems in such a model, since a neuron's output only affects its input at some later time, not instantaneously.)
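To make the terms above concrete, here is a minimal sketch of one feedforward pass through an MLP with one hidden layer and a single output neuron, as in the figure. It uses Python with NumPy; the layer sizes and random weights are illustrative choices of mine, not anything specified in this post.

```python
import numpy as np

def sigmoid(z):
    """Sigmoid activation, as used by the sigmoid neurons mentioned above."""
    return 1.0 / (1.0 + np.exp(-z))

# Layer sizes are purely illustrative: 3 inputs, 4 hidden neurons, 1 output neuron.
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((4, 3)), rng.standard_normal(4)  # input layer  -> hidden layer
W2, b2 = rng.standard_normal((1, 4)), rng.standard_normal(1)  # hidden layer -> output layer

def feedforward(x):
    """One forward pass: activations flow from input to output, with no loops."""
    hidden = sigmoid(W1 @ x + b1)       # hidden layer activations
    return sigmoid(W2 @ hidden + b2)    # output layer activation

feedforward(np.array([0.5, -1.0, 2.0]))  # returns a single activation in (0, 1)
```

Each layer's activation depends only on the layer before it, which is exactly what "no loops" means here.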

 

Recurrent neural nets have been less influential than feedforward networks, in part because the learning algorithms for recurrent nets are (at least to date) less powerful. But recurrent networks are still extremely interesting. They're much closer in spirit to how our brains work than feedforward networks. And it's possible that recurrent networks can solve important problems which can only be solved with great difficulty by feedforward networks.

In short, recurrent networks are not yet especially powerful... but they are a closer match to how our brains actually work.
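As a rough sketch of the recurrence described in the quoted passage (again with illustrative sizes and random weights, not anything from this post), the loop is unrolled over time, so a hidden neuron's output at one step only becomes an input at the next step:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative sizes: 3 inputs, 4 recurrent hidden neurons, 1 output neuron.
rng = np.random.default_rng(1)
W_in  = rng.standard_normal((4, 3))   # input(t)    -> hidden(t)
W_rec = rng.standard_normal((4, 4))   # hidden(t-1) -> hidden(t): the "loop"
W_out = rng.standard_normal((1, 4))   # hidden(t)   -> output(t)
b_h, b_o = rng.standard_normal(4), rng.standard_normal(1)

def run_rnn(inputs):
    """Unroll the recurrence over time: a neuron's output only reaches other
    neurons (and itself) at the next time step, never instantaneously."""
    h = np.zeros(4)                   # hidden state starts quiescent
    outputs = []
    for x in inputs:
        h = sigmoid(W_in @ x + W_rec @ h + b_h)
        outputs.append(sigmoid(W_out @ h + b_o))
    return outputs

run_rnn([rng.standard_normal(3) for _ in range(5)])  # 5 time steps -> 5 outputs
```

The only structural difference from the feedforward sketch is `W_rec`, which feeds the previous hidden state back in at the next step.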
