
Time complexity of the expanding network training #7

Open
JingweiZuo opened this issue May 8, 2023 · 0 comments


Dear authors,

Firstly, I would like to express my gratitude for sharing the code for your wonderful work, which I find quite inspiring.

After thoroughly reading your paper, I have a question regarding the time complexity of the Expanding Network Training.

In section 4.2 - Expanding Network Training, you mentioned that the complexity is reduced to $\mathcal{O}((\Delta_{\tau}d^{2})^{2})$ by considering only newly added nodes for updating the model.

However, upon checking your code in main.py, specifically lines 170-180 (shown below), I noticed that you use only the data from the newly added nodes when computing the loss. From my understanding, the number of model parameters stays unchanged; only the loss used to update the model changes. That should not affect the time complexity of updating the model.

In case I missed something, I would be grateful if you could provide clarification on this matter.

Thank you in advance for your time and assistance.

Best regards,
Jingwei

if args.strategy == "incremental" and args.year > args.begin_year:
  pred, _ = to_dense_batch(pred, batch=data.batch)
  data.y, _ = to_dense_batch(data.y, batch=data.batch)
  pred = pred[:, args.mapping, :]
  data.y = data.y[:, args.mapping, :]
loss = lossfunc(data.y, pred, reduction="mean")
if args.ewc and args.year > args.begin_year:
    loss += model.compute_consolidation_loss()
training_loss += float(loss)
loss.backward()
optimizer.step()
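To illustrate my point, here is a minimal sketch (a toy model, not the actual TrafficStream network; `mapping` and the shapes are hypothetical): slicing the predictions and targets to a subset of nodes changes only the loss, while the backward pass still computes gradients for every parameter of the full model.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

num_nodes, feat_dim = 10, 4
model = nn.Linear(feat_dim, feat_dim)    # parameters are shared across ALL nodes
x = torch.randn(2, num_nodes, feat_dim)  # batch of 2 graphs
y = torch.randn(2, num_nodes, feat_dim)

mapping = [7, 8, 9]                      # hypothetical indices of newly added nodes
pred = model(x)                          # forward pass still runs on all nodes

# Loss restricted to the new nodes, mirroring pred[:, args.mapping, :] above
loss = nn.functional.mse_loss(pred[:, mapping, :], y[:, mapping, :])
loss.backward()

# Every parameter still receives a gradient: restricting the loss target does
# not shrink the parameter set, so the per-step update cost is unchanged.
print(all(p.grad is not None for p in model.parameters()))  # True
```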