These are my lecture notes for week 1 of Intro to TensorFlow, the third course in the Coursera specialization “Machine Learning with TensorFlow on Google Cloud Platform”.

- TensorFlow is an open-source, high-performance library for `numerical computation` that uses `directed graphs`.
  - The `nodes` represent mathematical operations (e.g. add).
  - The `edges` represent the inputs and outputs of mathematical operations.
- A directed acyclic graph (DAG) is a `language-independent` representation of the code in your model.
  - This makes graphs `portable` between different devices.
- TensorFlow can `insert send and receive nodes` to distribute the graph across machines.
- TensorFlow can optimize the graph by `merging successive nodes` where necessary.
- TensorFlow Lite provides on-device inference of ML models on mobile devices and is available for a variety of hardware.
- TensorFlow supports `federated` learning.

- The lowest level is a layer that is implemented to target different hardware platforms.
- The next level is the TensorFlow C++ API.
- The core Python API contains much of the `numeric processing code`.
- Above that sits a set of Python modules that provide a `high-level representation` of useful NN components (good for building custom models).
- The `Estimator` API knows how to train, evaluate, create checkpoints, and save and serve models.

- The Python API lets you `build and run` directed graphs:
  - Create the graph (build)
  - Run the graph (run)
- The graph definition is separate from the training loop because TensorFlow uses a lazy evaluation model: you need to run the graph to get results.
- `tf.eager`, however, allows you to execute operations imperatively.
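A minimal sketch of the build-then-run pattern, written with `tf.compat.v1` so it also runs under TensorFlow 2.x (an assumption on my part; the course itself uses the TF 1.x API directly):

```python
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()  # restore TF 1.x lazy-evaluation behavior

# Build: this only constructs graph nodes; nothing is computed yet.
a = tf1.constant([5, 3, 8])
b = tf1.constant([3, -1, 2])
c = tf.add(a, b)
print(c)  # a symbolic Tensor, not the values

# Run: a Session executes the graph and materializes the result.
with tf1.Session() as sess:
    result = sess.run(c)
    print(result)  # [ 8  2 10]
```

Printing `c` before the session runs shows only a symbolic tensor, which is the lazy-evaluation point the notes make above.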

- `Graphs` can be processed, compiled, remotely executed, and assigned to devices.
  - The `edges` represent data as `tensors`, which are n-dimensional arrays.
  - The `nodes` represent TensorFlow `operations` on those tensors.
- A `Session` allows TensorFlow to `cache and distribute` computation.

- Execute TensorFlow graphs by calling `run()` on a `tf.Session`.
- It is possible to `evaluate` a list of tensors in a single `run()` call.
- TensorFlow in `Eager mode` makes it easier to try things out, but it is not recommended for production code.
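A small sketch of evaluating a list of tensors in one `run()` call (again using `tf.compat.v1` for TF 2.x compatibility, which is my assumption rather than the course's exact code):

```python
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()

x = tf1.constant(3.0)
y = tf1.constant(4.0)
s = tf.add(x, y)
p = tf.multiply(x, y)

with tf1.Session() as sess:
    # Pass a list of tensors; run() returns their values in the same order.
    total, product = sess.run([s, p])
```

Fetching both tensors in one call lets the session evaluate the shared subgraph once instead of twice.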

- You can write the graph out using `tf.summary.FileWriter`; the written file is not human-readable.
- The graph can be visualized in `TensorBoard`.
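A hedged sketch of writing a graph out for TensorBoard (the `tf.compat.v1` spelling and the temporary log directory are my assumptions):

```python
import os
import tempfile

import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()

a = tf1.constant(1.0, name="a")
b = tf1.constant(2.0, name="b")
c = tf.add(a, b, name="sum")

logdir = tempfile.mkdtemp()
with tf1.Session() as sess:
    # Writes the graph as a binary event file -- not human-readable,
    # but viewable with: tensorboard --logdir <logdir>
    writer = tf1.summary.FileWriter(logdir, sess.graph)
    writer.close()

event_files = os.listdir(logdir)
```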

- A tensor is an N-dimensional array of data.

- Tensors can be `sliced`.
- Tensors can be `reshaped`.
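A quick sketch of both operations (eager TF 2.x syntax, which is an assumption; the values are made up for illustration):

```python
import tensorflow as tf

x = tf.constant([[3, 5, 7],
                 [4, 6, 8]])          # shape (2, 3)

sliced = x[:, 1]                      # column 1 of every row
reshaped = tf.reshape(x, [3, 2])      # same six values, new 3x2 shape
```

Reshaping keeps the elements in row-major order, so the six values are simply re-grouped into three rows of two.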

- A variable is a tensor whose value is `initialized` and then typically `changed` as the program runs.
- `tf.get_variable` is helpful when you want to reuse variables or create them afresh depending on the situation.

- To summarize:
  - Create a variable by calling `get_variable`.
  - Decide how to `initialize` the variable.
  - Use the `variable` just like any other tensor when building the graph.
  - In the session, `initialize` the variable.
  - Evaluate any tensor that you want to evaluate.
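The steps above can be sketched as follows (the `tf.compat.v1` spelling, variable name, and initializer choice are my assumptions):

```python
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()

# 1. Create the variable and decide how to initialize it.
w = tf1.get_variable("w", shape=[2], initializer=tf1.zeros_initializer())

# 2. Use it like any other tensor when building the graph.
out = w + tf1.constant([1.0, 2.0])

with tf1.Session() as sess:
    # 3. In the session, initialize the variable...
    sess.run(tf1.global_variables_initializer())
    # 4. ...then evaluate any tensor you want.
    value = sess.run(out)
```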

- `Placeholders` allow you to feed in values at run time, such as values read from a text file.
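A minimal placeholder sketch (the hard-coded feed values stand in for numbers that would normally come from a file; `tf.compat.v1` spelling is my assumption):

```python
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()

# A placeholder has no value until one is fed in at run time.
x = tf1.placeholder(tf.float32, shape=[None])
doubled = x * 2

with tf1.Session() as sess:
    out = sess.run(doubled, feed_dict={x: [1.0, 2.0, 3.0]})
```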

- Debugging TensorFlow programs is `similar` to debugging any piece of software:
  - Read error messages to `understand the problem`.
  - `Isolate` the failing method and send made-up data into it.
  - Know how to solve common problems.

- The most common problem tends to be `tensor shape`; frequent issues are:
  - Tensor shape mismatch
  - Scalar-vector mismatch
  - Data type mismatch

- Shape problems also happen because of `batch size`, or because **you have a scalar when a vector is needed** (or vice versa).
- Shape problems can often be fixed using:
  - `tf.reshape()`
  - `tf.expand_dims()`
  - `tf.slice()`
  - `tf.squeeze()`

- `tf.expand_dims` inserts a dimension of 1 into a tensor’s shape.
- `tf.slice` extracts a slice from a tensor.
- `tf.squeeze` removes dimensions of size 1 from the shape of a tensor.
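The three shape-fixing ops in one sketch (eager TF 2.x syntax and the sample values are my assumptions):

```python
import tensorflow as tf

x = tf.constant([[1, 2, 3],
                 [4, 5, 6]])                     # shape (2, 3)

expanded = tf.expand_dims(x, axis=1)             # shape (2, 1, 3)
sliced = tf.slice(x, begin=[0, 1], size=[2, 2])  # rows 0-1, columns 1-2
squeezed = tf.squeeze(expanded)                  # size-1 dim removed: (2, 3)
```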

- Another common problem is `data type` mismatch.
  - The cause is `mixing types` (e.g. adding a tensor of floats to a tensor of ints won’t work).
  - One solution is to do a cast with `tf.cast()`.
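A short sketch of the type-mismatch fix (eager TF 2.x syntax is my assumption):

```python
import tensorflow as tf

floats = tf.constant([1.5, 2.5])  # dtype float32
ints = tf.constant([1, 2])        # dtype int32

# floats + ints  # would raise an error: float32 and int32 don't mix
fixed = floats + tf.cast(ints, tf.float32)
```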

- To debug full-blown programs, there are three methods:
  - `tf.Print()`
  - `tfdbg`
  - `TensorBoard`
- Change the logging level from `WARN` (e.g. to `INFO`) to see more log output.
- `tf.Print()` can be used to log specific tensor values.
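A sketch of both debugging aids (`tf.Print` is deprecated in TF 2.x in favor of `tf.print`; the `tf.compat.v1` spelling and the message string are my assumptions):

```python
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()

# Lower the logging level from the default WARN to INFO for more output.
tf1.logging.set_verbosity(tf1.logging.INFO)

a = tf1.constant([2.0, 3.0])
# tf.Print passes its first argument through unchanged and logs the
# tensors in the list (to stderr) each time the node is executed.
a = tf1.Print(a, [a], message="value of a: ")

with tf1.Session() as sess:
    out = sess.run(a)
```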

- TensorFlow has a dynamic, interactive debugger (`tfdbg`).