This is the second public release of Swift for TensorFlow, available across Google Colaboratory, Linux, and macOS. The focus is improving overall stability and refining APIs.
- Install SwiftPM packages using `%install` directives. See documentation in the README. (swift-jupyter#45, swift-jupyter#48, swift-jupyter#52)
- swift-jupyter can now be installed in a Conda environment. See documentation in the README.
- `AnyDerivative` has been added, representing a type-erased derivative. (apple/swift#23521)
- `Tensor` now supports advanced indexing and striding APIs. (apple/swift#24684)
- `Tensor`s are now pretty-printed, based on the format of NumPy. (apple/swift#23837)
- TensorFlow APIs involving shape dimensions, indices, and sizes now use `Int` instead of `Int32`. (apple/swift#24012, apple/swift#24110)
- Additional raw TensorFlow operators are now supported. (apple/swift#23777, apple/swift#24096, apple/swift#24120)
  - `SaveV2` (`Raw.saveV2(prefix:tensorNames:shapeAndSlices:tensors:)`)
  - `RestoreV2` (`Raw.restoreV2(prefix:tensorNames:shapeAndSlices:dtypes:)`)
  - `Split` (`Raw.split(splitDim:value:numSplit:)`)
  - `SplitV` (`Raw.splitV(value:sizeSplits:splitDim:numSplit:)`)
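As a sketch of how these raw operators are called (the `Raw` signatures are as listed above; the tensor values and the range/reshape initializers used here are illustrative):

```swift
import TensorFlow

// Split a [2, 6] tensor into three [2, 2] tensors along axis 1.
// `splitDim` is passed as a scalar Int32 tensor, matching the Raw op signature.
let x = Tensor<Float>(rangeFrom: 0, to: 12, stride: 1).reshaped(to: [2, 6])
let parts = Raw.split(splitDim: Tensor<Int32>(1), value: x, numSplit: 3)
// `parts` is an array of three Tensor<Float> values.
```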
- Experimental APIs have been added to group tensor ops into specialized tensor functions for further optimization, optionally using XLA compilation. (apple/swift#23868)
- The `Layer` protocol's `applied(to:in:)` method has been renamed to `call(_:)`. `Layer`s are now "callable" like functions, e.g. `layer(input)`.
  - Note: this is experimental functionality that is currently being proposed through Swift Evolution. Expect potential changes.
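A minimal sketch of the new callable style (the layer configuration values here are arbitrary):

```swift
import TensorFlow

let dense = Dense<Float>(inputSize: 4, outputSize: 2, activation: relu)
let input = Tensor<Float>(randomNormal: [1, 4])

// Previously written as dense.applied(to: input, in: context).
// With the call(_:) rename, layers are applied like functions:
let output = dense(input)
```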
- The `context` argument has been removed from `Layer`'s `applied(to:)` method. Instead, contexts are now thread-local. (swift-apis#87)
  - Use `Context.local` to access the current thread-local context.
  - Note: layers like `BatchNorm` and `Dropout` check `Context.local` to determine whether the current learning phase is training or inference. Be sure to set the context learning phase to `.training` before running a training loop.
  - Use `withContext(_:_:)` and `withLearningPhase(_:_:)` to call a closure under a temporary context or learning phase, respectively.
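Putting the thread-local context pieces together, a sketch (the model and batch are stand-ins):

```swift
import TensorFlow

let model = Dense<Float>(inputSize: 4, outputSize: 2, activation: relu)
let batch = Tensor<Float>(randomNormal: [8, 4])

// Set the global learning phase before running a training loop.
Context.local.learningPhase = .training
// ... training steps ...

// Temporarily evaluate in inference mode, so layers such as
// Dropout and BatchNorm behave accordingly:
let predictions = withLearningPhase(.inference) {
    model(batch)
}
```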
- An `RNNCell` protocol has been added, generalizing simple RNNs, LSTMs, and GRUs. (swift-apis#80, swift-apis#86)
- New layers have been added.
  - `Conv1D`, `MaxPool1D`, `AvgPool1D`. (swift-apis#57)
  - `UpSampling1D`. (swift-apis#61)
  - `TransposedConv2D`. (swift-apis#64)
  - `GlobalAveragePooling1D`, `GlobalAveragePooling2D`, `GlobalAveragePooling3D`. (swift-apis#66, swift-apis#65, swift-apis#72)
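A sketch combining several of the new layers into one model; the `(width, input channels, output channels)` filter-shape convention and the configuration values are assumptions for illustration:

```swift
import TensorFlow

// A small 1-D model built from layers introduced in this release.
struct AudioClassifier: Layer {
    var conv = Conv1D<Float>(filterShape: (3, 8, 16), activation: relu)
    var pool = GlobalAveragePooling1D<Float>()
    var dense = Dense<Float>(inputSize: 16, outputSize: 4)

    @differentiable
    func call(_ input: Tensor<Float>) -> Tensor<Float> {
        // Convolve, pool over the temporal dimension, then classify.
        return dense(pool(conv(input)))
    }
}
```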
- Optimizer stored properties (e.g. `learningRate`) are now mutable. (swift-apis#81)
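For example, mutability enables manual learning-rate schedules; a sketch (the model and the `SGD(for:learningRate:)` initializer shape are assumptions):

```swift
import TensorFlow

var model = Dense<Float>(inputSize: 4, outputSize: 1)
var optimizer = SGD(for: model, learningRate: 0.1)

for epoch in 1...10 {
    // ... compute gradients and update the model here ...
    // learningRate is now a mutable stored property, so simple
    // exponential decay can be applied between epochs:
    optimizer.learningRate *= 0.9
}
```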
- `Array` now conforms to `Differentiable`. (apple/swift#23183)
- The `@differentiating` attribute now works when the derivative function has a generic context that is more constrained than the original function's generic context. (apple/swift#23384)
- The `@differentiating` attribute now accepts a `wrt:` differentiation parameter list, just like the `@differentiable` attribute. (apple/swift#23370)
- The error "function is differentiable only with respect to a smaller subset of arguments" is now obsolete. (apple/swift#23887)
- A differentiation-related memory leak has been fixed. (apple/swift#24165)
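A sketch of the `wrt:` parameter list on `@differentiating` (requires the Swift for TensorFlow toolchain; the function names and exact derivative spelling are illustrative):

```swift
import TensorFlow

func scaledSquare(_ x: Float, _ scale: Float) -> Float {
    return scale * x * x
}

// Register a custom derivative with respect to `x` only, via the wrt: list.
@differentiating(scaledSquare, wrt: x)
func scaledSquareVJP(_ x: Float, _ scale: Float)
    -> (value: Float, pullback: (Float) -> Float) {
    return (scaledSquare(x, scale), { v in v * 2 * scale * x })
}
```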
This release contains contributions from many people at Google, as well as:
Anthony Platanios, Bart Chrzaszcz, Bastian Müller, Brett Koonce, Dante Broggi, Dave Fernandes, Doug Friedman, Ken Wigginton Jr, Jeremy Howard, John Pope, Leo Zhao, Nanjiang Jiang, Pawan Sasanka Ammanamanchi, Pedro Cuenca, Pedro José Pereira Vieito, Sendil Kumar N, Sylvain Gugger, Tanmay Bakshi, Valeriy Van, Victor Guerra, Volodymyr Pavliukevych, Vova Manannikov, Wayne Nixalo.
This is the first public release of Swift for TensorFlow, available across Google Colaboratory, Linux, and macOS. The focus is building the basic technology platform and fundamental deep learning APIs.
This release includes the core Swift for TensorFlow compiler, the standard libraries, and the Swift for TensorFlow Deep Learning Library. Core functionality includes: the ability to define, train and evaluate models, a notebook environment, and natural Python interoperability.
- Hit "Tab" to trigger basic semantic autocomplete.
- Use matplotlib to produce inline graphs.
- Interrupt cell execution by clicking the "stop" button next to the cell.
- Declare a `KeyPathIterable` protocol conformance to make your custom type provide a collection of key paths to its stored properties. Read Dynamic Property Iteration using Key Paths for a deep dive into the design.
- Declare an `AdditiveArithmetic` protocol conformance to make values of your custom type behave like an additive group. If the declaration is in the same file as the type definition and all stored properties conform to `AdditiveArithmetic`, the compiler will synthesize the conformance automatically.
- Declare a `VectorNumeric` protocol conformance to make values of your custom type behave like a vector space. If the declaration is in the same file as the type definition and all stored properties conform to `VectorNumeric` with the same `Scalar` associated type, the compiler will synthesize the conformance automatically.
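A sketch of the synthesized conformance (assuming the Swift for TensorFlow toolchain, where the requirements are derived automatically because every stored property conforms):

```swift
// Both stored properties conform to AdditiveArithmetic and the declaration
// is in the same file as the type, so the conformance is synthesized.
struct Point: AdditiveArithmetic {
    var x: Float
    var y: Float
}

let p = Point(x: 1, y: 2) + Point(x: 3, y: 4)
// Componentwise addition: p == Point(x: 4, y: 6)
```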
- The `Layer` protocol and layers built on top of it.
- The `Optimizer` protocol and optimizers built on top of it.
- Philox and Threefry random number generators and generic random distributions.
- Sequential layer application utilities: `sequenced(in:through:_:)` and its n-ary overloads.
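A sketch of `sequenced(in:through:_:)` in this release's API style, where layers take a `Context` (the layer sizes are arbitrary):

```swift
import TensorFlow

struct MLP: Layer {
    var dense1 = Dense<Float>(inputSize: 784, outputSize: 128, activation: relu)
    var dense2 = Dense<Float>(inputSize: 128, outputSize: 10)

    @differentiable
    func applied(to input: Tensor<Float>, in context: Context) -> Tensor<Float> {
        // Threads the input through each listed layer in order,
        // equivalent to dense2.applied(to: dense1.applied(to: input, in: context), in: context).
        return input.sequenced(in: context, through: dense1, dense2)
    }
}
```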
- Declare a conformance to the `Differentiable` protocol to make a custom type work with automatic differentiation. For a technical deep dive, read Differentiable Types.
- Use `differentiableFunction(from:)` to form a `@differentiable` function from a custom derivative function.
- Custom differentiation APIs are available in the standard library. Follow the custom differentiation tutorial to learn how to use them.
  - Gradient checkpointing API: `withRecomputationInPullbacks(_:)`.
  - Gradient surgery API: `withGradient(_:)`.
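A sketch of gradient surgery with `withGradient(_:)`, assuming the tutorial's `inout`-closure style (the scaling applied here is purely illustrative):

```swift
import TensorFlow

let x: Tensor<Float> = [1, 2, 3]
let dx = gradient(at: x) { x -> Tensor<Float> in
    // withGradient(_:) lets the closure inspect or modify the gradient
    // flowing back through this point during the pullback; here we halve it.
    let y = x.withGradient { (grad: inout Tensor<Float>) in grad *= 0.5 }
    return (y * y).sum()
}
```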
- Switch between Python versions using `PythonLibrary.useVersion(_:_:)`.
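For example, a sketch of selecting a Python runtime (the version number is illustrative; the call must come before any other Python interop API is used):

```swift
import Python

// Pin the interop layer to a specific Python installation.
PythonLibrary.useVersion(3, 6)

let np = Python.import("numpy")
print(np.arange(4))
```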
This release contains contributions from many people at Google, as well as:
Anthony Platanios, Edward Connell, Tanmay Bakshi.