Express a softmax over a dynamic axis
A softmax whose normalization runs over a dynamic (sequence) axis can be expressed via a recurrence: accumulate the running log-sum-exp of the input along the sequence with `PastValue`, read off the completed sum at the last step, propagate it back to every step with `FutureValue`, and normalize as `Exp (z - logSumExp)`:
BrainScript:
```
SoftMaxOverSequence (z) = {
    # define a recursive expression for \log(\sum_{i=1}^t \exp(z_i))
    runningLogSumExp = LogPlus (z,
                       PastValue (0, runningLogSumExp, defaultHiddenActivation=-1e30))
    logSumExp = If (BS.Loop.IsLast (runningLogSumExp),  # if last entry
                /*then*/ runningLogSumExp,              # then copy that
                /*else*/ FutureValue (0, logSumExp))    # else just propagate to the front
    result = Exp (z - logSumExp)
}.result
```
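Here, `defaultHiddenActivation=-1e30` acts as log(0), so the `PastValue` at the first step contributes nothing to the running sum. At the last step, `runningLogSumExp` therefore equals the full log-sum-exp over the sequence, and the `FutureValue` recurrence copies that value back to every earlier step.

For reference, the following is a minimal NumPy sketch of the same computation on a single sequence; the function name and the explicit loop are illustrative only and not part of CNTK.

```python
import numpy as np

def softmax_over_sequence(z):
    # Forward recurrence: running[t] = log(sum_{i<=t} exp(z[i])),
    # mirroring LogPlus(z, PastValue(...)) with an initial value of -1e30.
    acc = -1e30
    running = np.empty_like(z, dtype=float)
    for t in range(len(z)):
        acc = np.logaddexp(acc, z[t])   # LogPlus
        running[t] = acc
    # The last entry holds the full normalizer; the FutureValue recurrence
    # in the BrainScript version copies it back to every time step.
    log_sum_exp = running[-1]
    return np.exp(z - log_sum_exp)

# Sanity check against an ordinary softmax:
z = np.array([1.0, 2.0, 3.0])
print(softmax_over_sequence(z))     # ~[0.0900, 0.2447, 0.6652]
print(np.exp(z) / np.exp(z).sum())  # same values
```

The result matches an ordinary softmax over the whole sequence, but is computed with operations that work on a variable-length (dynamic) axis.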