support multi treatment in meta learners #141

Merged
heimengqi merged 12 commits into master from mehei/metalearnermultitreatment on Nov 13, 2019

Conversation

heimengqi (Contributor) commented:

  1. extend the meta learners to support multiple treatments
  2. remove DRLearner
  3. change tests and notebook accordingly

@heimengqi heimengqi self-assigned this Nov 7, 2019
@heimengqi heimengqi added the enhancement New feature or request label Nov 7, 2019

@kbattocchi kbattocchi (Collaborator) left a comment:

In general I don't think the marginal effects have the right shape when there are multiple treatments. I've added a few other comments as well.

[10 inline review comments on econml/metalearners.py, since resolved]
@heimengqi heimengqi marked this pull request as ready for review November 11, 2019 03:19
@heimengqi heimengqi requested a review from moprescu November 11, 2019 03:20

@heimengqi (Contributor, Author) commented:

Still need to write a more comprehensive test that covers multi-dimensional Y and array vs. column Y, like the test for DML.
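
A rough sketch of what such a shape-focused test could look like (a hypothetical helper, not the test added in this PR; it assumes the estimator exposes fit(Y, T, X) as in this diff and a const_marginal_effect method):

    import numpy as np

    def check_effect_shapes(make_estimator):
        # Hypothetical shape check: fit an estimator built by make_estimator()
        # on a 3-treatment problem and report the effect shape for column Y
        # versus 2-dimensional Y.
        rng = np.random.default_rng(0)
        n = 200
        X = rng.normal(size=(n, 4))
        T = rng.choice(3, size=n)              # three treatment levels
        for Y in (rng.normal(size=n),          # column Y
                  rng.normal(size=(n, 2))):    # multi-dimensional Y
            est = make_estimator()
            est.fit(Y, T, X)                   # fit(Y, T, X) as in this PR
            print(Y.shape, est.const_marginal_effect(X).shape)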

@kbattocchi kbattocchi (Collaborator) left a comment:

I've added a few more comments based on your latest revision, mostly pointing out minor things.

[9 inline review comments on econml/metalearners.py]
Comment on lines 423 to 428
        for model in self.final_models:
            taus.append(model.predict(X))
        taus = np.column_stack(taus).reshape((-1, self._d_t - 1,) + self._d_y)  # shape as of m*d_t*d_y
        if self._d_y:
            taus = transpose(taus, (0, 2, 1))  # shape as of m*d_y*d_t
        return taus

Collaborator:

Looks like very similar logic to this shows up in a few places. Would it be worthwhile to create a common base class so that the logic doesn't need to be repeated?
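
A minimal sketch of what such a shared base class could look like (the mixin and method names here are hypothetical, not code from this PR; it assumes self._d_y is stored as a shape tuple, empty for vector Y, and self._d_t is the number of treatment levels, as in the snippet above):

    import numpy as np

    class _StackedFinalModelsMixin:
        # Hypothetical shared helper for meta learners that keep one final
        # model per non-baseline treatment and need the same reshape logic.

        def _stack_treatment_effects(self, X):
            # Predict the CATE of each non-baseline treatment vs. the baseline
            # and arrange the results as (m, d_t - 1) or (m, d_y, d_t - 1).
            taus = [model.predict(X) for model in self.final_models]
            taus = np.column_stack(taus).reshape((-1, self._d_t - 1) + self._d_y)
            if self._d_y:
                taus = np.transpose(taus, (0, 2, 1))  # move d_y ahead of d_t
            return taus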

@kbattocchi kbattocchi (Collaborator) left a comment:

I've added several minor suggestions, but feel free to merge without another round of review after you've addressed them to your satisfaction.

[1 inline review comment on econml/metalearners.py, since resolved]
@@ -150,39 +149,42 @@ def fit(self, Y, T, X, inference=None):
        self : an instance of self.
        """
        # Check inputs
        if X is None:
            X = np.ones((Y.shape[0], 1))

Collaborator:

[minor]
I think that in this case, the default could be a 0-column array rather than a column of ones (the columns from T will still be there):

Suggested change:
-            X = np.ones((Y.shape[0], 1))
+            X = np.empty((Y.shape[0], 0))
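
A small illustration of the reviewer's point (the hstack of X with a one-hot encoding of T is an assumption about how the S-learner builds its features, used here only for demonstration):

    import numpy as np

    n = 5
    rng = np.random.default_rng(0)
    T = np.eye(3)[rng.choice(3, size=n)][:, 1:]   # one-hot treatment, baseline dropped
    X_ones = np.ones((n, 1))                      # current default
    X_empty = np.empty((n, 0))                    # suggested default

    # Stacking a 0-column X with T leaves exactly the treatment columns,
    # while a column of ones adds a redundant constant feature.
    print(np.hstack([X_empty, T]).shape)          # (5, 2)
    print(np.hstack([X_ones, T]).shape)           # (5, 3)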

Collaborator:

On the other hand, maybe it's silly to even allow X=None because there is no W (unlike DML)

heimengqi (Contributor, Author):

Yes. In the setting of the SLearner, X = None is the same as learning the difference of mean(Y) in each treatment class. I will keep it for now.
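
A quick illustration of that equivalence (plain numpy/scikit-learn rather than the econml SLearner, with made-up data): with no X, regressing Y on the encoded treatment recovers exactly the differences of the class means of Y.

    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    n = 1000
    T = rng.choice(3, size=n)                                  # three treatment classes
    Y = np.array([0.0, 1.5, -0.7])[T] + rng.normal(size=n)     # class-dependent means

    # Regress Y on the one-hot treatment (baseline dropped, intercept on);
    # the coefficients equal the per-class mean differences vs. the baseline.
    model = LinearRegression().fit(np.eye(3)[T][:, 1:], Y)
    print(model.coef_)
    print([Y[T == t].mean() - Y[T == 0].mean() for t in (1, 2)])  # same values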

[4 inline review comments on econml/metalearners.py, since resolved]
@heimengqi heimengqi merged commit b7e826e into master Nov 13, 2019
@heimengqi heimengqi deleted the mehei/metalearnermultitreatment branch November 13, 2019 16:59