There is no error in this part. The length of `temp_hid = torch.sum(temp_hid, dim=2, keepdim=True)` is not equal to that of `diff_time`; maybe you could check it again.
Thanks for your reply! I still have doubts about the integral function T_T. In my understanding, when we minimize $\Lambda = \int \sum_k \lambda_k(t)\,dt$, the type mask should be removed. The type mask is only needed in the log-likelihood term that is maximized, i.e. maximizing $\lambda_{k_i}(t_i)$ for the observed type $k_i$ at each event time $t_i$.
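For reference, here is the standard decomposition of the log-likelihood of a multi-type point process that this reasoning is based on (a sketch, with $k_i$ denoting the type of the $i$-th event and $K$ the number of types):

```latex
\ell = \underbrace{\sum_{i} \log \lambda_{k_i}(t_i)}_{\text{event term: type mask selects } k_i}
     \;-\; \underbrace{\int_0^T \sum_{k=1}^{K} \lambda_k(t)\,dt}_{\text{compensator: sums over all types}}
```

The mask appears only in the first term, which picks out the intensity of the observed type; the compensator integrates the total intensity over all types.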
```python
def compute_integral_unbiased(model, data, time, non_pad_mask, type_mask):
    """ Log-likelihood of non-events, using Monte Carlo integration. """
```
In my understanding, the line

```python
temp_hid = torch.sum(temp_hid * type_mask[:, 1:, :], dim=2, keepdim=True)
```

should be changed to

```python
temp_hid = torch.sum(temp_hid, dim=2, keepdim=True)
```
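To make the proposed change concrete, here is a minimal, self-contained sketch of the Monte Carlo compensator estimate that sums the intensity over all event types without a type mask. The function name, tensor shapes, and the assumption that the per-type intensities have already been evaluated at sampled times are all my own framing, not code from the repository:

```python
import torch

def mc_total_intensity_integral(all_lambda, diff_time):
    """Monte Carlo estimate of \\int \\sum_k lambda_k(t) dt per interval.

    all_lambda: (batch, len-1, num_samples, num_types) intensities evaluated
                at times sampled uniformly inside each inter-event interval.
    diff_time:  (batch, len-1) interval lengths t_{i+1} - t_i.
    """
    # Sum over event types WITHOUT a type mask: the compensator integrates
    # the total intensity, not just the observed type's intensity.
    total = all_lambda.sum(dim=-1)      # (batch, len-1, num_samples)
    mc_mean = total.mean(dim=-1)        # average over the Monte Carlo samples
    return mc_mean * diff_time          # per-interval integral estimate
```

With a constant intensity the estimate is exact, which makes the shape and scaling easy to sanity-check: two types at intensity 0.5 give a total intensity of 1.0, so each interval's integral equals its length.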