
TensorFlow variables, constants, and operations

Ke Gui
5 min read · Jun 6, 2020


TL;DR:

variables: can be updated; used for trainable parameters such as weights and biases.

constants: cannot be updated, but can be reused many times with only one copy in memory.

tensor: the input and output of each operation. It flows :D.
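The mutable-vs-immutable distinction in the TL;DR can be seen directly. A minimal sketch, using the eager-style API of TensorFlow 2 (no Session needed), rather than the graph-mode code shown later in this post:

```python
import tensorflow as tf

# A variable can be updated in place, like a weight during training.
w = tf.Variable(1.0)
w.assign_add(0.5)   # w now holds 1.5

# A constant is immutable: it has no assign() method at all,
# so there is no way to overwrite its single in-memory copy.
c = tf.constant(1.0)

print(w.numpy())            # 1.5
print(hasattr(c, "assign")) # False
```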

I’m unsure about the practical differences between the 4 variations below (they all evaluate to the same value). My understanding is that if I call tf.add, it will create an operation on the graph, and otherwise it might not. If I don't create the tf.constant() at the beginning, I believe that the constants will be created implicitly when doing the addition; but for tf.add(a, b) vs a + b where a and b are both Tensors (#1 and #3), I can see no difference besides the default naming (the former is Add and the latter is add). Can anyone shed some light on the differences between those, and when should one use each?

import tensorflow as tf

## 1
a = tf.constant(1)
b = tf.constant(1)
x = tf.add(a, b)
with tf.Session() as sess:
    x.eval()

## 2
a = 1
b = 1
x = tf.add(a, b)
with tf.Session() as sess:
    x.eval()

## 3
a = tf.constant(1)
b = tf.constant(1)
x = a + b
with tf.Session() as sess:
    x.eval()

## 4
a = 1
b = tf.constant(1)
x = a + b
with tf.Session() as sess:
    x.eval()

The four examples you gave will all give the same result, and generate the same graph (if you ignore that some of the operation names in the graph are different). TensorFlow will convert many different Python objects into tf.Tensor objects when they…
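The implicit conversion the answer refers to can be observed directly with tf.convert_to_tensor, which is what TensorFlow applies to each operand before running the op. A sketch using TensorFlow 2's eager mode, where the same conversion rules apply:

```python
import tensorflow as tf

# A plain Python int is converted to a tf.Tensor on demand;
# this is why tf.add(1, 1) works without any explicit tf.constant().
t = tf.convert_to_tensor(1)

x = tf.add(1, 1)        # both Python ints converted implicitly
y = tf.constant(1) + 1  # mixed: the Python int 1 is converted, then added

print(isinstance(t, tf.Tensor))  # True
print(int(x), int(y))            # 2 2
```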



Written by Ke Gui

An ordinary guy who wants to be the reason someone believes in the goodness of people. He lives in Brisbane, Australia, and has a lovely backyard.
