README.md (+6, −5)
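For orientation only: the `Model` referenced in the hunk header below is defined earlier in the README and is not part of this diff. A minimal sketch of what such a `Layer`-conforming model plausibly looks like, with the 4-feature input matching the dummy data added below; the layer sizes and class count are assumptions, not taken from this change:

```swift
import TensorFlow

// Hypothetical stand-in for the README's `Model`: 4 input features,
// one hidden layer, and 3 output classes (sizes are assumptions).
struct Model: Layer {
    var layer1 = Dense<Float>(inputSize: 4, outputSize: 10, activation: relu)
    var layer2 = Dense<Float>(inputSize: 10, outputSize: 3)

    @differentiable
    func callAsFunction(_ input: Tensor<Float>) -> Tensor<Float> {
        return input.sequenced(through: layer1, layer2)
    }
}
```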
@@ -41,17 +41,18 @@ struct Model: Layer {
 var classifier = Model()
 let optimizer = SGD(for: classifier, learningRate: 0.02)
 Context.local.learningPhase = .training
-let x: Tensor<Float> = ...
-let y: Tensor<Int32> = ...
+// Dummy data.
+let x: Tensor<Float> = Tensor(randomNormal: [100, 4])
+let y: Tensor<Int32> = Tensor(randomUniform: [100])
 ```

 #### Run a training loop

-One way to define a training epoch is to use the [`Differentiable.gradient(in:)`](https://github.com/apple/swift/blob/652523f49581a42986ef2b6b04a593ed47496122/stdlib/public/core/AutoDiff.swift#L214) method.
+One way to define a training epoch is to use the [`gradient(at:in:)`](https://www.tensorflow.org/swift/api_docs/Functions#/s:10TensorFlow8gradient2at2in13TangentVectorQzx_AA0A0Vyq_GxXEtAA14DifferentiableRzAA0aB13FloatingPointR_r0_lF) function.

 ```swift
 for _ in 0..<1000 {
-    let 𝛁model = classifier.gradient { classifier -> Tensor<Float> in
+    let 𝛁model = gradient(at: classifier) { classifier -> Tensor<Float> in
         let ŷ = classifier(x)
         let loss = softmaxCrossEntropy(logits: ŷ, labels: y)
         print("Loss: \(loss)")
@@ -66,7 +67,7 @@ Another way is to make use of methods on `Differentiable` or `Layer` that produc
 ```swift
 for _ in 0..<1000 {
     let (ŷ, backprop) = classifier.appliedForBackpropagation(to: x)
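The second hunk is likewise cut off after the forward pass. As a hedged sketch (not part of this diff): `appliedForBackpropagation(to:)` returns the model output together with a pullback closure, the loss gradient is taken with respect to that output, and the pullback maps it back to parameter gradients. The helpers `valueWithGradient(at:in:)` and `optimizer.update(_:along:)` are assumed from the Swift for TensorFlow API of the time:

```swift
for _ in 0..<1000 {
    // Forward pass that also returns a backpropagation (pullback) closure.
    let (ŷ, backprop) = classifier.appliedForBackpropagation(to: x)
    // Loss and its gradient with respect to the model output.
    let (loss, 𝛁ŷ) = valueWithGradient(at: ŷ) { ŷ in
        softmaxCrossEntropy(logits: ŷ, labels: y)
    }
    print("Loss: \(loss)")
    // Map the output gradient back to parameter gradients, then update.
    let (𝛁model, _) = backprop(𝛁ŷ)
    optimizer.update(&classifier, along: 𝛁model)
}
```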