Neuron (neural network)

Definition

Block diagram of a generic neuron with inputs {{M|I_1,\ldots,I_n}}
A neuron in a neural network has:
  • an output domain, {{M|O}}, typically {{M|[-1,1]\subseteq\mathbb{R} }} or {{M|[0,1]\subseteq\mathbb{R} }}
    • Usually {{M|\{0,1\} }} for input and output neurons
  • some inputs, {{M|I_i}}, typically {{M|I_i\in\mathbb{R} }}
  • some weights, one for each input, {{M|w_i}}, again {{M|w_i\in\mathbb{R} }}
  • a way to combine each input with a weight (typically multiplication), {{M|I_iw_i}}, creating an "input activation", {{M|A_i\in\mathbb{R} }}
  • a bias, {{M|\theta}} (of the same type as the result of combining an input with a weight; typically this can be simulated by having a fixed "on" input and treating the bias as another weight) - another input activation, {{M|A_0}}
  • a way to combine the input values, typically: {{M|1=\sum_{j=0}^n A_j = \sum_{j=1}^n I_jw_j + \theta}}
  • an activation function, {{M|\mathcal{A}(\cdot):\mathbb{R}\rightarrow O\subseteq\mathbb{R} }}, which maps the combined input activations to an output value

In the example in the block diagram above, the output of the neuron would be:

  • {{M|1=\mathcal{A}\left(\sum_{i=1}^n(I_iw_i)+\theta\right)}}
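The combine-then-activate computation above is easy to express in code. The following is a minimal Python sketch, not part of the original page: the names neuron_output and sigmoid are made up for the example, and the sigmoid is just one possible choice of activation function {{M|\mathcal{A} }}.

<syntaxhighlight lang="python">
import math

def sigmoid(x):
    # One possible activation function A, mapping R into the output domain (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def neuron_output(inputs, weights, bias, activation=sigmoid):
    # Input activations A_i = I_i * w_i, combined by summation together with the bias theta
    combined = sum(I * w for I, w in zip(inputs, weights)) + bias
    # The activation function maps the combined activation to an output value
    return activation(combined)

# Example: a neuron with three inputs, illustrative weights and bias
print(neuron_output([0.5, 1.0, 0.0], [0.2, -0.4, 0.7], bias=0.1))
</syntaxhighlight>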

Specific models

For an exhaustive list see Category:Types of neuron in a neural network

McCulloch-Pitts neuron

Diagram of a McCulloch-Pitts neuron
The McCulloch-Pitts neuron has[1]:
  • Inputs: {{M|(I_1,\ldots,I_n)\in\mathbb{R}^n}}
    • Usually each {{M|I_i}} is confined to {{M|[0,1]\subseteq\mathbb{R} }} or {{M|[-1,1]\subseteq\mathbb{R} }}
  • A set of weights, one for each input: {{M|(w_1,\ldots,w_n)\in\mathbb{R}^n}}
  • A bias: {{M|\theta\in\mathbb{R} }}
  • An activation function, {{M|\mathcal{A}:\mathbb{R}\rightarrow\mathbb{R} }}
    • It is more common to see {{M|\mathcal{A}:\mathbb{R}\rightarrow[-1,1]\subseteq\mathbb{R} }} or sometimes {{M|\mathcal{A}:\mathbb{R}\rightarrow[0,1]\subseteq\mathbb{R} }} than the whole of {{M|\mathbb{R} }}

The output of the neuron is given by:

{{M|1=\text{Output}:=\mathcal{A}\left(\sum_{i=1}^n(I_iw_i)+\theta\right)}}
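As a concrete illustration of this formula (a sketch, not taken from the cited reference), the same computation can be written in Python with a step activation. The step/threshold activation and the AND-gate weights below are assumptions made for the example; the definition above only requires some activation function {{M|\mathcal{A} }}.

<syntaxhighlight lang="python">
def mcculloch_pitts(inputs, weights, bias):
    # Weighted sum of the inputs plus the bias theta
    combined = sum(I * w for I, w in zip(inputs, weights)) + bias
    # Step (threshold) activation: fire (1) if the combined activation is non-negative, else 0
    return 1 if combined >= 0 else 0

# Example: weights and bias chosen by hand so the neuron behaves like a logical AND gate
for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, "->", mcculloch_pitts([a, b], [1, 1], bias=-1.5))
</syntaxhighlight>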

References

  1. Neural Networks and Statistical Learning - Ke-Lin Du and M. N. S. Swamy
