
Implementing an XOR Gate with TensorFlow

I recently had a chance to look into TensorFlow, Google’s “open source software library for numerical computation using data flow graphs,” specifically looking to implement artificial neural networks (ANNs). Most of the introductory tutorials on ANNs with TensorFlow involve building and training networks to classify handwritten digits. In order to better learn the building blocks of TensorFlow—and to refresh my memory of both Python and neural networks—I wanted to start much much smaller, beginning with the simplest possible ANNs and working my way up.

 

So, we are going to implement an XOR gate with a multilayer neural network in TensorFlow.

 

First, import the required packages. We require two packages:

TensorFlow

NumPy
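
The imports, as they appear in the complete code at the end of the post (TensorFlow 1.x):

import tensorflow as tf   # TensorFlow 1.x graph/session API
import numpy as np        # NumPy, used for the training arrays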

 

Now it's time for the training data. Since we are implementing an XOR gate, our data consists of the four possible input pairs and their corresponding outputs:

 

Input 1   Input 2   Output
   0         0        0
   0         1        1
   1         0        1
   1         1        0
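
In code, this truth table becomes two NumPy arrays, the same x_data and y_data used in the complete code below:

x_data = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])   # the four input pairs
y_data = np.array([[0], [1], [1], [0]])               # XOR output for each pair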

TensorFlow works by first building a model out of empty placeholder tensors, then plugging in known values and evaluating the model inside a session.
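
For example, a placeholder is just an empty tensor that only receives a value when the graph is evaluated inside a session. A minimal illustrative sketch (the node name doubled is made up for this example and is not part of the XOR model):

X = tf.placeholder(tf.float32)   # empty tensor, no value yet
doubled = X * 2                  # a node in the graph that depends on X

with tf.Session() as session:
    # plug in a known value and evaluate the node
    print(session.run(doubled, feed_dict={X: [1.0, 2.0]}))   # prints [2. 4.]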

 

Our model has three layers:

Input layer

Hidden layer

Output layer

 

The Input Layer has two nodes.

The Hidden Layer has ten nodes.

The Output Layer has only one node.
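
These sizes appear in the complete code below as hyperparameters, and the training inputs and target outputs are fed into the graph through two placeholders:

#layer sizes
n_input = 2      # two nodes in the input layer
n_hidden = 10    # ten nodes in the hidden layer
n_output = 1     # one node in the output layer

#placeholders for the training inputs and target outputs
X = tf.placeholder(tf.float32)
Y = tf.placeholder(tf.float32)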

 

We have two sets of weights:

The first between the Input and Hidden layers.

The second between the Hidden and Output layers.

 

The weights act like synapses between the layers: each connection between two layers has a weight associated with it.
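
In the complete code below, these two weight matrices are TensorFlow variables initialised with random values between -1 and 1, together with one bias vector per layer:

#weights: input -> hidden, and hidden -> output
W1 = tf.Variable(tf.random_uniform([n_input, n_hidden], -1.0, 1.0))
W2 = tf.Variable(tf.random_uniform([n_hidden, n_output], -1.0, 1.0))

#one bias per layer, initialised to zero
b1 = tf.Variable(tf.zeros([n_hidden]), name="Bias1")
b2 = tf.Variable(tf.zeros([n_output]), name="Bias2")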

Now that we have both the training data and the weights, we are going to use the Gradient Descent Optimizer to train our model. The cost we minimize is the cross-entropy between the network's prediction and the target output.
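
Concretely, the forward pass, the cost and the training step look like this in the complete code below (with learning_rate = 0.1):

# forward pass: hidden layer activations, then the network's prediction
L2 = tf.sigmoid(tf.matmul(X, W1) + b1)
hy = tf.sigmoid(tf.matmul(L2, W2) + b2)

# cross-entropy cost, minimised by plain gradient descent
cost = tf.reduce_mean(-Y*tf.log(hy) - (1-Y)*tf.log(1-hy))
optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)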

 


The Complete Code:

 


#!/usr/bin/env python3

import tensorflow as tf
import numpy as np

# XOR truth table: inputs and expected outputs
x_data = np.array([
[0,0], [0,1], [1,0], [1,1]
])
y_data = np.array([
[0], [1], [1], [0]
])

#hyperparameters
n_input = 2
n_hidden = 10
n_output = 1
learning_rate = 0.1
epochs = 10000

#placeholders for the training inputs and target outputs
X = tf.placeholder(tf.float32)
Y = tf.placeholder(tf.float32)

#weights
W1 = tf.Variable(tf.random_uniform([n_input, n_hidden], -1.0, 1.0))
W2 = tf.Variable(tf.random_uniform([n_hidden, n_output], -1.0, 1.0))

#bias
b1 = tf.Variable(tf.zeros([n_hidden]), name="Bias1")
b2 = tf.Variable(tf.zeros([n_output]), name="Bias2")

# forward pass: hidden layer activations, then the network's prediction
L2 = tf.sigmoid(tf.matmul(X, W1) + b1)
hy = tf.sigmoid(tf.matmul(L2, W2) + b2)

# cross-entropy cost, minimised by plain gradient descent
cost = tf.reduce_mean(-Y*tf.log(hy) - (1-Y)*tf.log(1-hy))
optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)

# initialise all variables (tf.initialize_all_variables() is deprecated)
init = tf.global_variables_initializer()

with tf.Session() as session:
    session.run(init)

    for step in range(epochs):
        session.run(optimizer, feed_dict={X: x_data, Y: y_data})

        # report the cost every 1000 epochs
        if step % 1000 == 0:
            print(session.run(cost, feed_dict={X: x_data, Y: y_data}))

    # round the predictions to 0/1 and compare them with the targets
    answer = tf.equal(tf.floor(hy + 0.5), Y)
    accuracy = tf.reduce_mean(tf.cast(answer, "float"))

    print(session.run(hy, feed_dict={X: x_data, Y: y_data}))
    print("Accuracy: ", accuracy.eval({X: x_data, Y: y_data}) * 100, "%")
