Issue
I'm trying to implement the softmax function on Android.
Here is the original softmax function from Stack Overflow for reference:
Softmax Activation Implementation
// Requires: import java.util.Arrays;
private double softmax(double input, double[] neuronValues) {
    // Sum exp over all neuron outputs, then divide exp(input) by that sum
    double total = Arrays.stream(neuronValues).map(Math::exp).sum();
    return Math.exp(input) / total;
}
Then I tested the softmax function in a plain Java main method:
public static void main(String[] args) {
    try {
        double input = Double.valueOf("123456789");
        double[] doubleValues = new double[2];
        doubleValues[0] = Double.valueOf("123456789");
        doubleValues[1] = Double.valueOf("234567890");
        double total = 0;
        for (int i = 0; i < doubleValues.length; i++) {
            double value = Math.exp(doubleValues[i]);
            total += value;
        }
        double result = Math.exp(input) / total;
        System.out.println(String.format("total: %s, result: %s", total, result));
    } catch (Throwable ex) {
        ex.printStackTrace();
    }
}
The output is:
total: Infinity, result: NaN
It seems the softmax function returned NaN, which is not in the range [0,1].
As I understand it, the softmax function should map any input numbers into the range [0,1].
What's the problem?
Solution
Your input is too large, so its exponential exceeds the range that a double can handle (overflow). exp(100) is already on the order of 10^43, and the largest finite double is about 1.8 × 10^308 (roughly exp(709.78)), so exp(123456789) evaluates to infinity.
As a result, total is Double.POSITIVE_INFINITY, and result is Infinity / Infinity, which is NaN.
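You can see the same behaviour in isolation with a minimal sketch (the specific values are just for illustration):

public class OverflowDemo {
    public static void main(String[] args) {
        // The largest finite double is about 1.8e308, so Math.exp overflows a bit above 709
        System.out.println(Math.exp(100));        // ~2.69e43, still finite
        System.out.println(Math.exp(710));        // Infinity (overflow)
        System.out.println(Math.exp(123456789));  // Infinity

        double total = Math.exp(123456789) + Math.exp(234567890); // Infinity
        double result = Math.exp(123456789) / total;              // Infinity / Infinity
        System.out.println(total);   // Infinity
        System.out.println(result);  // NaN
    }
}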
Try normalizing your input to a smaller range first, for example with min-max normalization to map the inputs into [-1,1] or [0,1]. These ranges are commonly used in machine learning because the exponentials of such values stay small and bounded.
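A rough sketch of that idea (the minMaxNormalize helper is hypothetical, added here only to illustrate scaling the inputs into [-1,1] before calling softmax):

import java.util.Arrays;

public class SoftmaxDemo {

    // Min-max scale values into [-1, 1] (hypothetical helper for illustration)
    static double[] minMaxNormalize(double[] values) {
        double min = Arrays.stream(values).min().getAsDouble();
        double max = Arrays.stream(values).max().getAsDouble();
        double range = (max - min == 0) ? 1 : (max - min);
        return Arrays.stream(values)
                     .map(v -> 2 * (v - min) / range - 1)
                     .toArray();
    }

    static double softmax(double input, double[] neuronValues) {
        double total = Arrays.stream(neuronValues).map(Math::exp).sum();
        return Math.exp(input) / total;
    }

    public static void main(String[] args) {
        double[] raw = {123456789, 234567890};
        double[] scaled = minMaxNormalize(raw);          // [-1.0, 1.0]
        System.out.println(softmax(scaled[0], scaled));  // ~0.1192, finite and in [0,1]
    }
}

A common alternative, if you want to keep the relative differences of the original values, is to subtract the maximum input from every input before exponentiating; this leaves the softmax result mathematically unchanged while keeping the exponentials from overflowing.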
Answered By - Jim LK Yu
Answer Checked By - David Goodson (JavaFixing Volunteer)