Machine Learning - Neural Networks Examples and Intuitions


This series of articles is my study notes for "Machine Learning" by Prof. Andrew Ng, Stanford University. This article covers the week 4 material on neural network examples and intuitions.


Neural Networks Examples and Intuitions


1. Neural Networks Examples and Intuitions I

Non-linear classification example: XOR/XNOR 

Consider the following problem, where the features x1 and x2 are binary, so each can take on only the values 0 or 1. In this example I've drawn only two positive examples and two negative examples, but you can think of this as a simplified version of a more complex learning problem with a cluster of positive examples in the upper right and lower left, and a cluster of negative examples, denoted by the circles, elsewhere. What we'd like to do is learn a non-linear decision boundary that separates the positive and negative examples.

x1, x2 are binary (0 or 1).

y = x1 XOR x2

x1 XNOR x2 = NOT(x1 XOR x2)

For XNOR we're going to have y = 1 when x1 and x2 are both 0 or both 1, and y = 0 when exactly one of them is 1, and we're going to figure out whether we can get a neural network to fit this sort of training set.
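For reference, here is a quick sketch (plain Python, not part of the course materials) enumerating the four training examples and their XOR/XNOR labels:

```python
# Enumerate the four binary inputs with their XOR and XNOR labels.
for x1 in (0, 1):
    for x2 in (0, 1):
        xor = x1 ^ x2    # y = 1 iff exactly one input is 1
        xnor = 1 - xor   # y = 1 iff the inputs agree
        print(x1, x2, xor, xnor)
```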

Example: AND 


In order to build up to a network that fits the XNOR example, we're going to start with a slightly simpler one and show a network that fits the AND function.
To do so, I'm going to draw in the bias unit, the +1 unit, as well. Now let me assign some values to the weights, or parameters, of this network: -30 on the bias unit, and +20 and +20 on the two inputs.

x1, x2 ∈{0,1}

y = x1 AND x2


Let's look at what this little single-neuron network will compute. Just to remind you, the sigmoid activation function g(z) starts near 0, rises smoothly, crosses 0.5, and then asymptotes to 1. To give you some landmarks: if the horizontal-axis value z equals 4.6, the sigmoid function equals 0.99, which is very close to 1; symmetrically, if z equals -4.6, the sigmoid function equals 0.01, which is very close to 0.
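As a quick numeric check of those landmarks (a Python sketch, not from the course materials), with g(z) = 1/(1 + e^(-z)):

```python
import numpy as np

def sigmoid(z):
    """Sigmoid activation g(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

print(sigmoid(4.6))   # ≈ 0.99, very close to 1
print(sigmoid(-4.6))  # ≈ 0.01, very close to 0
```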


Let's look at the four possible input values for x1 and x2 and see what the hypothesis outputs in each case.
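The truth-table figure isn't reproduced in these notes, but we can recompute it. A minimal sketch of the single AND neuron using the weights assigned above (-30, +20, +20):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def and_neuron(x1, x2):
    """h(x) = g(-30 + 20*x1 + 20*x2), using the weights assigned above."""
    return sigmoid(-30 + 20 * x1 + 20 * x2)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, round(and_neuron(x1, x2), 2))
# 0 0 0.0   g(-30) ≈ 0
# 0 1 0.0   g(-10) ≈ 0
# 1 0 0.0   g(-10) ≈ 0
# 1 1 1.0   g(+10) ≈ 1
```

The output is close to 1 only when both inputs are 1, which is exactly the logical AND.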



Example: OR 


The network shown here computes the OR function.
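The figure with the OR weights isn't reproduced in these notes; in the lecture this network uses the weights -10, +20, +20, so a minimal sketch looks like this:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def or_neuron(x1, x2):
    """h(x) = g(-10 + 20*x1 + 20*x2); weights as in the lecture's OR network."""
    return sigmoid(-10 + 20 * x1 + 20 * x2)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, round(or_neuron(x1, x2), 2))
# only (0, 0) gives ≈ 0; the other three inputs give ≈ 1
```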



So, hopefully you now understand how single neurons in a neural network can be used to compute logical functions like AND and OR. In the next section we'll continue building on these examples and work through a more complex one, showing how a neural network with multiple layers of units can be used to compute more complex functions like XOR or XNOR.

2. Neural Networks Examples and Intuitions II


In this section I'd like to keep working through our example to show how a neural network can compute complex non-linear hypotheses.

Negation: 


In the last section we saw how a neural network can be used to compute the functions x1 AND x2 and x1 OR x2 when x1 and x2 are binary, that is, when they take on the values 0 or 1. We can also build a network to compute negation, that is, the function NOT x1: the output should be 1 when x1 = 0, and 0 when x1 = 1.

y = NOT x1
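The truth-table figure is again missing from these notes; in the lecture the negation neuron uses a bias weight of +10 and a weight of -20 on x1, which flips the input:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def not_neuron(x1):
    """h(x) = g(10 - 20*x1): outputs ≈ 1 for x1 = 0 and ≈ 0 for x1 = 1."""
    return sigmoid(10 - 20 * x1)

print(round(not_neuron(0), 2))  # 1.0 -> NOT 0 = 1
print(round(not_neuron(1), 2))  # 0.0 -> NOT 1 = 0
```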


Putting it together:  x1 XNOR x2


Now, taking the three pieces we have built, the network computing x1 AND x2, the network computing (NOT x1) AND (NOT x2), and one last network computing x1 OR x2, we should be able to put these three together to compute the x1 XNOR x2 function.

I'm going to take my inputs +1, x1, and x2 and create my first hidden unit, which I'll call a1(2) since it's the first hidden unit of layer two. I'll copy over the weights from the red network, the x1 AND x2 network: -30, 20, 20. Next, let me create a second hidden unit, a2(2), the second hidden unit of layer two, and copy over the cyan network in the middle, so its weights are 10, -20, -20. With these we can fill in the truth-table values for the hidden units.
Finally, I'm going to create my output unit, a1(3), which produces the output h(x), and copy over the OR network for it. I'll need a +1 bias unit here, so we draw that in, and then copy over the weights from the green network.


We end up with a non-linear decision boundary that computes this XNOR function.
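To make the assembly concrete, here is the whole network in code: a minimal sketch that assumes the OR weights (-10, +20, +20) for the green output unit, which the notes mention but don't list:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def xnor_network(x1, x2):
    """Two-layer network assembled from the three pieces above."""
    a2_1 = sigmoid(-30 + 20 * x1 + 20 * x2)   # red network: x1 AND x2
    a2_2 = sigmoid(10 - 20 * x1 - 20 * x2)    # cyan network: (NOT x1) AND (NOT x2)
    h = sigmoid(-10 + 20 * a2_1 + 20 * a2_2)  # green network: a2_1 OR a2_2
    return h

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, round(xnor_network(x1, x2), 2))
# ≈ 1 for (0, 0) and (1, 1), ≈ 0 for (0, 1) and (1, 0): exactly x1 XNOR x2
```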


Neural Network Intuition


The more general intuition about why neural networks can compute pretty complicated functions is this: in the input layer we just have our inputs; the hidden layer computes some slightly more complex functions of those inputs; and by adding yet another layer we end up with even more complex non-linear functions. With multiple layers, the second layer computes relatively simple functions of the inputs, the third layer builds on those to compute even more complex functions, and each layer after that can compute functions that are more complex still.
