Hi.
I tried using DLG to recover data from a model with a non-twice-differentiable activation (ReLU), and the algorithm recovered the data successfully. Here is the code:
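(Condensed here to a minimal sketch: the fully-connected toy model, the sizes, and the iteration count are placeholders, not my exact setup.)

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Toy model with a ReLU, i.e. a non-twice-differentiable activation.
model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))

# Ground-truth sample; the attacker only observes its gradient.
gt_data = torch.randn(1, 32)
gt_label = torch.tensor([3])

gt_loss = nn.CrossEntropyLoss()(model(gt_data), gt_label)
original_dy_dx = [g.detach() for g in torch.autograd.grad(gt_loss, model.parameters())]

# DLG optimizes dummy_data and dummy_label; the weights are never updated.
dummy_data = torch.randn(gt_data.size(), requires_grad=True)
dummy_label = torch.randn(1, 10, requires_grad=True)
optimizer = torch.optim.LBFGS([dummy_data, dummy_label])

for _ in range(50):
    def closure():
        optimizer.zero_grad()
        pred = model(dummy_data)
        # Soft-label cross-entropy so the label can be optimized as well.
        dummy_loss = torch.mean(
            torch.sum(-F.softmax(dummy_label, dim=-1) * F.log_softmax(pred, dim=-1), dim=-1)
        )
        dummy_dy_dx = torch.autograd.grad(dummy_loss, model.parameters(), create_graph=True)
        # Gradient-matching loss between the dummy gradient and the leaked gradient.
        grad_diff = sum(((gx - gy) ** 2).sum() for gx, gy in zip(dummy_dy_dx, original_dy_dx))
        grad_diff.backward()
        return grad_diff
    optimizer.step(closure)

print("reconstruction error:", (dummy_data - gt_data).abs().mean().item())
```

The point is that `create_graph=True` keeps the graph of `dummy_dy_dx`, so `grad_diff.backward()` differentiates the gradient-matching loss with respect to `dummy_data` and `dummy_label` while the model weights stay fixed.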
The result is as follows:
The data was almost fully recovered.
The model code is:
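(Again a sketch: the conv layout and the CIFAR-style 3x32x32 input are stand-ins for my actual setup.)

```python
import torch.nn as nn

# LeNet-style body with a single ReLU, the non-twice-differentiable part.
class LeNetReLU(nn.Module):
    def __init__(self, num_classes=100):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 12, kernel_size=5, padding=2, stride=2),
            nn.ReLU(),
        )
        self.fc = nn.Linear(12 * 16 * 16, num_classes)

    def forward(self, x):
        out = self.body(x)
        return self.fc(out.view(out.size(0), -1))
```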
I also tested the model with 2 ReLU layers:
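That variant simply stacks a second Conv/ReLU block on the same body (sizes again placeholders):

```python
import torch.nn as nn

# Same body with two ReLU layers instead of one.
body_two_relu = nn.Sequential(
    nn.Conv2d(3, 12, kernel_size=5, padding=2, stride=2),
    nn.ReLU(),
    nn.Conv2d(12, 12, kernel_size=5, padding=2, stride=1),
    nn.ReLU(),
)
```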
Here is the result:
The result was only slightly worse.
The paper replaces the ReLU activation with a sigmoid and gets a good result, so I tried using a sigmoid to improve my result. Here is the code:
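(It only swaps the activation; sketched below with the same placeholder layout as above.)

```python
import torch.nn as nn

# Identical architecture, with ReLU replaced by Sigmoid as in the paper's model.
body_sigmoid = nn.Sequential(
    nn.Conv2d(3, 12, kernel_size=5, padding=2, stride=2),
    nn.Sigmoid(),
)
```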
And here is the result:
The result got worse.
So I don't think the non-twice-differentiable function is what leads to a worse result. When DLG is optimizing, it is not optimizing the weights; it is optimizing dummy_data and dummy_label. So the second-order derivative involved is d(dL/dW)/d(dummy_data) and d(dL/dW)/d(dummy_label), not d(dL/dW)/dW.
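To spell out what I mean, writing the gradient-matching loss as $\mathbb{D}$ (roughly the paper's notation, with $F$ the model, $W$ the fixed weights, $\nabla W$ the leaked gradient, and $x', y'$ the dummy data and label):

$$
\mathbb{D}(x', y') = \left\lVert \frac{\partial \mathcal{L}\big(F(x', W),\, y'\big)}{\partial W} - \nabla W \right\rVert^2,
\qquad
x' \leftarrow x' - \eta\,\frac{\partial \mathbb{D}}{\partial x'},
\qquad
y' \leftarrow y' - \eta\,\frac{\partial \mathbb{D}}{\partial y'}
$$

Backpropagating $\mathbb{D}$ keeps $W$ fixed and differentiates the gradient term only with respect to $x'$ and $y'$.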
Looking forward to your reply. :-)