Precision and Recall over validation step #5809
-
When Precision and Recall are computed directly, I get the following result:

```python
import torch
from pytorch_lightning.metrics import Precision
from pytorch_lightning.metrics import Recall

y = torch.tensor([0, 0, 2, 2, 1, 1, 1, 2, 0, 0])
y_hat = torch.tensor([1, 1, 2, 1, 1, 1, 1, 1, 2, 1])

precision = Precision(num_classes=3)
recall = Recall(num_classes=3)

precision(y_hat, y)
# >>> tensor(0.2917)
recall(y_hat, y)
# >>> tensor(0.4444)
```

However, when the same metrics are computed inside `validation_step`, I get different values:

```python
def validation_step(self, batch, batch_idx):
    x, y = batch["x"], batch["y"]  # y = tensor([0, 0, 2, 2, 1, 1, 1, 2, 0, 0], device='cuda:0')
    y_hat = self(x)                # y_hat = tensor([1, 1, 2, 1, 1, 1, 1, 1, 2, 1], device='cuda:0')
    precision = self.precision_score(y_hat, y)  # precision = tensor(0.4000, device='cuda:0')
    recall = self.recall_score(y_hat, y)        # recall = tensor(0.4000, device='cuda:0')
```

What am I missing?
Replies: 4 comments
-
@ceceu can you provide code to reproduce? Seems weird to me.
-
@ceceu after trying myself, I assume you have set the `average` argument in the first case to `macro` and in the second case to `micro` (the default):

```python
y = torch.tensor([0, 0, 2, 2, 1, 1, 1, 2, 0, 0])
y_hat = torch.tensor([1, 1, 2, 1, 1, 1, 1, 1, 2, 1])

precision = Precision(num_classes=3, average='macro')
recall = Recall(num_classes=3, average='macro')
print(precision(y_hat, y), recall(y_hat, y))  # tensor(0.2917), tensor(0.4444)

precision = Precision(num_classes=3, average='micro')
recall = Recall(num_classes=3, average='micro')
print(precision(y_hat, y), recall(y_hat, y))  # tensor(0.4000), tensor(0.4000)
```
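For reference, the two averaging modes can be reproduced by hand. The sketch below uses plain Python (no pytorch_lightning dependency) to show where the numbers come from: macro averaging takes the unweighted mean of the per-class scores, while micro averaging pools all true positives, false positives, and false negatives, which for single-label multiclass data reduces to plain accuracy.

```python
# Hand-compute macro vs micro precision/recall for the example above.
y     = [0, 0, 2, 2, 1, 1, 1, 2, 0, 0]
y_hat = [1, 1, 2, 1, 1, 1, 1, 1, 2, 1]
num_classes = 3

precisions, recalls = [], []
for c in range(num_classes):
    tp = sum(1 for t, p in zip(y, y_hat) if t == c and p == c)
    fp = sum(1 for t, p in zip(y, y_hat) if t != c and p == c)
    fn = sum(1 for t, p in zip(y, y_hat) if t == c and p != c)
    # Classes with no predictions (here class 0) get precision 0 by convention.
    precisions.append(tp / (tp + fp) if tp + fp else 0.0)
    recalls.append(tp / (tp + fn) if tp + fn else 0.0)

# macro: unweighted mean of the per-class scores
macro_p = sum(precisions) / num_classes  # (0 + 3/8 + 1/2) / 3 ≈ 0.2917
macro_r = sum(recalls) / num_classes     # (0 + 1 + 1/3) / 3   ≈ 0.4444

# micro: pool all counts; for single-label multiclass this equals accuracy
total_tp = sum(1 for t, p in zip(y, y_hat) if t == p)
micro_p = micro_r = total_tp / len(y)    # 4 / 10 = 0.4

print(round(macro_p, 4), round(macro_r, 4), micro_p)  # 0.2917 0.4444 0.4
```

This matches both sets of numbers in the thread, which confirms the only difference between the two results is the `average` setting.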
-
Hello @SkafteNicki, thanks!
-
Nice, closing this issue.