Issue
I am trying to find the averages of the columns of a jagged array, but when I divide the sum by arr[j].length I get a "cannot find symbol" error on j. What can I do to fix this problem?
int maxC = arr[0].length;
for (int a = 1; a < arr.length; a++){
    if (arr[a].length > maxC){
        maxC = arr[a].length;
    }
}
for (int i = 0; i < maxC; i++){
    double sum = 0.0;
    for (int j = 0; j < arr.length; j++){
        if (i < arr[j].length){
            sum += arr[j][i];
        }
    }
    double avg = sum / arr[j].length;
    System.out.println("Average of col " + (i + 1) + "is: " + avg);
}
Solution
Your variable j is declared in the inner for loop; outside of that loop it does not exist. But the way I see it, you don't want to divide by the length of arr[j] anyway, but by the number of values you actually added. That's arr.length minus the number of nested arrays that are too short.
The easiest way to fix this is to introduce a new variable that you increment inside your if statement, then divide by that variable instead of by arr[j].length.
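For illustration, a minimal sketch of that fix, assuming arr is an int[][] (the sample data and the names count and ColumnAverages are my own, not from the original question):

public class ColumnAverages {
    public static void main(String[] args) {
        // Example jagged array (assumed data, for illustration only)
        int[][] arr = {
            {1, 2, 3, 4},
            {5, 6},
            {7, 8, 9}
        };

        // Find the widest row so every column index is visited
        int maxC = arr[0].length;
        for (int a = 1; a < arr.length; a++) {
            if (arr[a].length > maxC) {
                maxC = arr[a].length;
            }
        }

        for (int i = 0; i < maxC; i++) {
            double sum = 0.0;
            int count = 0; // how many rows actually have a value in column i
            for (int j = 0; j < arr.length; j++) {
                if (i < arr[j].length) {
                    sum += arr[j][i];
                    count++;
                }
            }
            // count is at least 1 here, because i < maxC guarantees
            // that at least one row reaches column i
            double avg = sum / count;
            System.out.println("Average of col " + (i + 1) + " is: " + avg);
        }
    }
}

For the sample array above this prints the per-column averages (e.g. column 1 averages 1, 5 and 7), using only the rows that are long enough to contribute to each column.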
Answered By - Rob Spoor
Answer Checked By - David Goodson (JavaFixing Volunteer)