TensorFlow fans have been rewarded with a first release candidate for v1.13.0 of the machine learning framework, which should further whet their appetites for the soon-to-arrive v2.0.
This week’s drop flagged a few major feature changes, including TensorFlow Lite’s move from contrib to core, which means that Python modules are under tf.lite and source code is now under tensorflow/lite rather than tensorflow/contrib/lite. NCCL has also moved to core.
A pair of behavioural changes has also been made. The first disallows converting Python floating-point types to uint32/64 in tf.constant. The second makes the gain argument of the convolutional orthogonal initializers behave consistently with the tf.initializers.orthogonal initializer. The notes add that the functions in question are currently in tf.contrib and therefore not guaranteed to be backward compatible.
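To see why silently casting a Python float to uint32 is a problem, here is a minimal pure-Python sketch (no TensorFlow required; the helper name is purely illustrative) of what a naive cast would do: the fractional part is discarded, and negative values wrap around modulo 2**32.

```python
# Plain-Python illustration of why implicit float -> uint32
# conversion is lossy, motivating the new tf.constant behaviour.
def naive_float_to_uint32(x: float) -> int:
    # Truncate toward zero, then wrap into the uint32 range.
    return int(x) % 2**32

print(naive_float_to_uint32(2.7))   # fraction silently dropped
print(naive_float_to_uint32(-1.0))  # wraps to 4294967295
```

Rejecting the conversion outright, as 1.13 now does, forces the caller to make that truncation or wrap-around explicit.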
In addition, there are a whole rake of bug fixes and other changes.
TensorFlow Lite, in addition to the shift mentioned earlier, also gets an experimental Java API for injecting TensorFlow Lite delegates, and support for strings in the Java API.
The release also drops in some TensorFlow 2.0-related details. These include a command line tool to convert code to TF 2.0, and the merging of tf.spectral into tf.signal for TensorFlow 2.0. Lastly, the team will change the default recurrent activation function for LSTM from ‘hard_sigmoid’ to ‘sigmoid’ in 2.0.
According to the notes, “This will enable user [sic] with GPU to use CuDNN kernel by default and get a 10x performance boost in training”, but with the proviso that this is “checkpoint breaking change”.
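The two activations agree at zero but diverge at the tails, which is why swapping the default is checkpoint-breaking. A minimal pure-Python comparison, assuming Keras's usual piecewise-linear definition of hard_sigmoid, clip(0.2x + 0.5, 0, 1):

```python
import math

def sigmoid(x: float) -> float:
    # Smooth logistic curve: the new default in TF 2.0.
    return 1.0 / (1.0 + math.exp(-x))

def hard_sigmoid(x: float) -> float:
    # Piecewise-linear approximation: the old Keras default.
    return max(0.0, min(1.0, 0.2 * x + 0.5))

for x in (-3.0, 0.0, 3.0):
    print(x, round(sigmoid(x), 4), round(hard_sigmoid(x), 4))
```

hard_sigmoid saturates exactly to 0 and 1 beyond |x| = 2.5, while sigmoid only approaches them asymptotically, so weights trained against one do not transfer cleanly to the other.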
TensorFlow 2.0 is “coming soon”, the project declared just last month, and promises to focus on “simplicity and ease of use”. A public preview is promised “early this year” according to this post by the team, which added: “You can already develop the TensorFlow 2.0 way by using tf.keras and eager execution, pre-packaged models and the deployment libraries. The Distribution Strategy API is also already partly available today.”