Hi there!
TensorFlow is no longer what it used to be.
Let’s have a quick history of development overview.
TensorFlow 1 is one of the most widely used deep learning packages.
It is very versatile and that is why many practitioners like it.
However, it has a major disadvantage – it is very hard to learn and use.
Many people become disheartened after seeing even a couple of lines of TensorFlow code.
Not only are its methods strange, but the whole logic of coding is unlike that of most libraries
out there.
This led to the development and popularization of higher-level packages such as PyTorch and
Keras.
Keras is especially interesting, as in 2017 it was integrated into the core of TensorFlow, a
move that may sound a bit strange.
In reality though, both TensorFlow and Keras are open source, so such things do happen
in the programming world.
In fact, Keras’ author says that Keras was conceived as “an interface for TensorFlow
rather than a different library”, which makes this integration easier to digest and
implement.
So far so good.
However, even with Keras as a part of TF, TensorFlow was still losing popularity.
This was addressed in 2019, when TensorFlow 2.0 appeared on the horizon, or at least its alpha
version did.
It is TensorFlow’s effort to catch up with the current demand for higher-level programming.
Interestingly, instead of creating their own high-level syntax, the TF developers chose
to borrow that of Keras.
This decision made sense as Keras was widely adopted already and people generally love
it.
On that note, you may hear people say that “TensorFlow 2 is basically Keras”.
In fact, TF 2 has the best of both worlds – most of the versatility of TF 1 and the
high-level simplicity of Keras.
And that’s not all.
There are also other major advantages of TF 2 over TF 1: the developers simplified the API, removed
duplicate and deprecated functions, and added some new ones to the core of TensorFlow.
Most importantly for us, TensorFlow 2 boasts ‘eager execution’. In other words, operations
are evaluated immediately, so the standard “rules of physics” of Python apply, rather than
everything being compiled into the complex computational graphs of TF 1 that you don’t really
want to know about.
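To make that concrete, here is a minimal sketch of what eager execution looks like in practice, assuming TensorFlow 2 is installed and imported as tf; the tensor values are just made-up examples.

import tensorflow as tf  # TensorFlow 2.x

# With eager execution, operations run immediately, like ordinary Python code.
a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
b = tf.constant([[1.0, 1.0], [0.0, 1.0]])
c = tf.matmul(a, b)
print(c.numpy())  # the result is available right away, no session needed

# In TF 1, the same lines would only build a computational graph;
# you would then have to open a session to actually run it:
# with tf.Session() as sess:
#     print(sess.run(c))

As the commented-out TF 1 snippet suggests, the old workflow separated defining a computation from running it, which is exactly the extra layer of complexity that eager execution removes.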