support multi treatment in meta learners #141
Conversation
heimengqi
commented
Nov 7, 2019
- extend the meta learners to support multiple treatments
- remove DRLearner
- update tests and the notebook accordingly
In general I don't think the marginal effects have the right shape when there are multiple treatments. I've added a few other comments as well.
Still need to write a more comprehensive test that covers multi-output Y, flat-array Y, and column-vector Y, like the test for DML.
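A sketch of the kind of shape-parametrized test described above. `FakeLearner` is a hypothetical stand-in for a real metalearner, used only to show looping over the three Y layouts and checking the effect shape; it is not EconML API.

```python
import numpy as np

class FakeLearner:
    """Hypothetical estimator that records Y's trailing dims and
    returns an effect array of the documented shape."""

    def fit(self, Y, T, X):
        self._d_y = Y.shape[1:]  # () for a flat array, (k,) otherwise
        return self

    def effect(self, X):
        # One effect value per row, matching Y's trailing dimensions
        return np.zeros((X.shape[0],) + self._d_y)

n = 20
X = np.random.normal(size=(n, 3))
T = np.random.randint(0, 2, size=n)

for Y in (np.random.normal(size=n),        # flat-array Y
          np.random.normal(size=(n, 1)),   # column-vector Y
          np.random.normal(size=(n, 2))):  # multi-output Y
    est = FakeLearner().fit(Y, T, X)
    eff = est.effect(X)
    assert eff.shape == (n,) + Y.shape[1:]
```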
I've added a few more comments based on your latest revision, mostly pointing out minor things.
econml/metalearners.py
Outdated
for model in self.final_models:
    taus.append(model.predict(X))
taus = np.column_stack(taus).reshape((-1, self._d_t - 1,) + self._d_y)  # shape: m x (d_t - 1) x d_y
if self._d_y:
    taus = transpose(taus, (0, 2, 1))  # shape: m x d_y x (d_t - 1)
return taus
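For reference, a self-contained toy run of the stack-and-transpose step above (plain NumPy, with made-up dimensions; not the EconML code itself):

```python
import numpy as np

# Two non-baseline treatments (d_t - 1 = 2), a 3-dimensional outcome
# (d_y = 3), and m = 4 rows. Each "model" predicts a constant so the
# treatment index is visible in the result.
m, n_models, d_y = 4, 2, 3
taus = [np.full((m, d_y), t) for t in range(n_models)]  # per-model predictions

stacked = np.column_stack(taus).reshape(m, n_models, d_y)  # m x (d_t - 1) x d_y
result = np.transpose(stacked, (0, 2, 1))                  # m x d_y x (d_t - 1)

assert result.shape == (4, 3, 2)
# The last axis now indexes the treatment: row 0, outcome 0 -> [0, 1]
assert result[0, 0].tolist() == [0, 1]
```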
Looks like very similar logic to this shows up in a few places. Would it be worthwhile to create a common base class so that the logic doesn't need to be repeated?
I've added several minor suggestions, but feel free to merge without another round of review after you've addressed them to your satisfaction.
econml/metalearners.py
Outdated
@@ -150,39 +149,42 @@ def fit(self, Y, T, X, inference=None):
    self : an instance of self.
    """
    # Check inputs
    if X is None:
        X = np.ones((Y.shape[0], 1))
[minor]
I think that in this case, the default could be a 0-column array rather than a column of ones (the columns from T will still be there):
-        X = np.ones((Y.shape[0], 1))
+        X = np.empty((Y.shape[0], 0))
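To illustrate the suggestion: stacking a zero-column array with other features is a no-op, so the treatment columns survive unchanged downstream. A minimal sketch (not the PR's actual code path):

```python
import numpy as np

n = 5
T = np.random.randint(0, 2, size=(n, 1))

X_ones = np.ones((n, 1))    # current default: adds a redundant constant column
X_empty = np.empty((n, 0))  # suggested default: contributes no columns at all

# With the 0-column default, only the treatment columns remain:
assert np.hstack([X_empty, T]).shape == (n, 1)
# With the column of ones, an extra constant feature is carried along:
assert np.hstack([X_ones, T]).shape == (n, 2)
```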
On the other hand, maybe it's silly to even allow X=None because there is no W (unlike DML)
Yes. In the SLearner setting, X = None is the same as learning the difference of mean(Y) across treatment classes. I will keep it for now.
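A quick numerical check of this claim, as a standalone sketch: an S-learner whose overall model is an ordinary linear regression, fit on a constant X plus the binary treatment, recovers exactly the difference of group means. (This uses scikit-learn's LinearRegression as the stand-in model, not the EconML implementation.)

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.RandomState(0)
n = 1000
T = rng.randint(0, 2, size=n)
Y = 2.0 * T + rng.normal(size=n)

# S-learner with X = a column of ones: the only informative feature is T
features = np.column_stack([np.ones(n), T])
model = LinearRegression().fit(features, Y)

# Effect = prediction at T=1 minus prediction at T=0 (constant X in both)
effect = (model.predict(np.array([[1.0, 1.0]]))
          - model.predict(np.array([[1.0, 0.0]])))[0]

# Matches the plain difference of group means
assert np.isclose(effect, Y[T == 1].mean() - Y[T == 0].mean())
```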