Components of TensorFlow | Python
TensorFlow is an open-source machine learning library developed and maintained by Google, and one of the most widely used machine learning libraries in the world. It is popular because it is flexible and supports a wide range of features.
Some of these are:
- It provides a library of tools for performing large numerical computations expressed as a dataflow graph. Its key abstraction is the "computational graph".
- TensorFlow helps in visualizing, iterating on, and debugging the graph using TensorBoard.
- It trains models efficiently and scales well to large workloads.
- It supports computation on both CPUs and GPUs.
How TensorFlow works – under the hood
TensorFlow has two main components:
- Graph
- Sessions
TensorFlow works by building a graph of defined computations; nothing is computed or stored in the graph itself. To illustrate with an example:
Let's say you create a variable sum that is equal to variable a plus variable b (sum = a + b).
This sum variable gets added to the graph. The graph now records that there is a variable sum which is the addition of variable a and variable b.
The key point is that TensorFlow does not evaluate the sum; it simply records that this is the computation we defined. In simple terms, it is like writing down an equation without actually doing the math.
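The idea of writing down an equation without evaluating it can be sketched in plain Python. The Constant and Add classes below are hypothetical illustrations of the concept, not TensorFlow's actual API:

```python
class Constant:
    """A graph node that holds a value but performs no math when created."""
    def __init__(self, value):
        self.value = value

    def evaluate(self):
        return self.value


class Add:
    """A graph node that records 'left + right' without computing it."""
    def __init__(self, left, right):
        self.left, self.right = left, right

    def evaluate(self):
        # The addition only happens when evaluation is explicitly requested
        return self.left.evaluate() + self.right.evaluate()


a = Constant(2)
b = Constant(3)
total = Add(a, b)        # nothing is computed yet; the graph merely records the equation

print(total.evaluate())  # 5 -- computed only when we ask for it
```

Building `total` costs nothing; the actual arithmetic is deferred until `evaluate()` is called, which mirrors how a TensorFlow graph defers work until a session runs it.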
A Session is essentially a way to execute part or all of the graph. It allows smaller chunks of the graph to be executed, allocates the necessary memory and resources, and handles the execution of the operations and computations that you define.
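The graph-then-session flow can be sketched with the TF1-style API, which TensorFlow 2.x still exposes under tf.compat.v1 (this assumes TensorFlow 2.x is installed):

```python
import tensorflow as tf

# Use graph mode (TF1-style) instead of TF2's default eager execution
tf.compat.v1.disable_eager_execution()

a = tf.constant(2.0)
b = tf.constant(3.0)
total = a + b  # only adds a node to the graph; nothing is computed here

# The Session allocates resources and executes the requested part of the graph
with tf.compat.v1.Session() as sess:
    result = sess.run(total)

print(result)  # 5.0
```

Until `sess.run(total)` is called, `total` is just a symbolic tensor in the graph; the session is what actually performs the addition.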
Below is a code snippet of a TensorFlow computation represented as a dataflow graph. As shown, a graph can be registered as the default with the tf.Graph.as_default context manager; operations are then added to the graph instead of being executed immediately.
import tensorflow as tf

g = tf.Graph()
with g.as_default():
    c = tf.constant(30.0)
    assert c.graph is g
    print(tf.compat.v1.get_default_graph())
This code creates a graph g in which operations and tensors can be defined. Here a constant tensor with the value 30.0 is defined and added to the graph as a node. The tf.compat.v1.get_default_graph() function returns the graph currently registered as the default; its printed representation shows the memory location of the stored graph object.
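To actually obtain the value 30.0 from such a graph, a session can be pointed at it explicitly. This is a minimal sketch assuming TensorFlow 2.x with the tf.compat.v1 API:

```python
import tensorflow as tf

g = tf.Graph()
with g.as_default():
    c = tf.constant(30.0)  # added to g as a node, not evaluated yet

# Pass the graph explicitly so the session executes operations from g
with tf.compat.v1.Session(graph=g) as sess:
    value = sess.run(c)

print(value)  # 30.0
```

Passing `graph=g` ties the session to that specific graph rather than the global default, which is useful when a program builds more than one graph.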