
[Bug] Ray Mac OSX security popups every time I run script locally #18730

Closed
2 tasks done
worldveil opened this issue Sep 17, 2021 · 17 comments · Fixed by #18904
Labels
bug (Something that is supposed to be working; but isn't), P1 (Issue that should be fixed within a few weeks), size-medium, usability
Milestone
Core Backlog

Comments

@worldveil
Contributor

Search before asking

  • I searched the issues and found no similar issues.

Ray Component

Ray Core, Monitoring & Debugging, Others

What happened + What you expected to happen

I always see these popups.

[Screenshot: macOS security popup (Screen Shot 2021-09-17 at 3 23 13 PM)]

They are ridiculously annoying and often keep appearing even after the script is done running.

It doesn't give me confidence in Ray.

Reproduction script

Literally any ray example will work.

from ray import tune


def objective(step, alpha, beta):
    return (0.1 + alpha * step / 100)**(-1) + beta * 0.1


def training_function(config):
    # Hyperparameters
    alpha, beta = config["alpha"], config["beta"]
    for step in range(10):
        # Iterative training function - can be any arbitrary training procedure.
        intermediate_score = objective(step, alpha, beta)
        # Feed the score back to Tune.
        tune.report(mean_loss=intermediate_score)


analysis = tune.run(
    training_function,
    config={
        "alpha": tune.grid_search([0.001, 0.01, 0.1]),
        "beta": tune.choice([1, 2, 3])
    })

print("Best config: ", analysis.get_best_config(
    metric="mean_loss", mode="min"))

# Get a dataframe for analyzing trial results.
df = analysis.results_df

should trigger this.

Anything else

Let's figure out how to address this and add it to the docs.

Are you willing to submit a PR?

  • Yes I am willing to submit a PR!
worldveil added the bug and triage labels on Sep 17, 2021
@ericl
Contributor

ericl commented Sep 17, 2021

@ijrsvt @thomasdesr or other mac users, do you know if this is because the worker processes are binding to something like 0.0.0.0 (instead of 127.0.0.1)?

Or is OSX just complaining about any port being opened, even if it listens only internally?
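(Background on the question above, not Ray code: the macOS Application Firewall prompts when a process starts accepting incoming connections on a non-loopback interface, while a listener bound only to 127.0.0.1 is unreachable from other machines and generally does not prompt. A minimal sketch of that distinction:)

import socket

def listen(host: str, port: int = 0) -> socket.socket:
    """Open a listening TCP socket bound to the given interface."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.bind((host, port))        # port=0 lets the OS pick a free port
    s.listen()
    return s

external = listen("0.0.0.0")    # reachable from other machines; may trigger the firewall popup
loopback = listen("127.0.0.1")  # loopback only; no external exposure
print("external:", external.getsockname())
print("loopback:", loopback.getsockname())
external.close()
loopback.close()

(Whether the prompt actually appears also depends on code signing and on whether the Application Firewall is enabled, so this is only indicative.)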

ericl added the P1, size-medium, and usability labels and removed the triage label on Sep 17, 2021
ericl added this to the Core Backlog milestone on Sep 17, 2021
ericl assigned jjyao and unassigned ericl on Sep 17, 2021
jjyao removed their assignment on Sep 25, 2021
@pcmoritz
Contributor

pcmoritz commented Sep 25, 2021

Did we try whether listening on localhost on macOS fixes this issue, as @ericl suggests above? That should fix it, right? Is it hard to do?

@jjyao
Collaborator

jjyao commented Sep 26, 2021

I did a quick check: for gRPC servers we are listening on 0.0.0.0, and the GCS server is using the 192.168.x address. Do we currently have a way to know in C++ that we are in single-node mode, so we can safely use 127.0.0.1? Once we know that, the fix should be trivial. @ericl @pcmoritz?
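(If anyone wants to double-check which addresses the Ray servers actually bind to on their machine, here is a rough sketch. It is not part of Ray and assumes psutil is installed; on macOS it can only see your own processes without elevated privileges:)

import psutil  # assumption: installed separately, e.g. pip install psutil

def ray_listeners():
    """Print the listening TCP sockets of local Ray-related processes."""
    for proc in psutil.process_iter(["pid", "name"]):
        name = (proc.info["name"] or "").lower()
        if not any(key in name for key in ("ray", "raylet", "gcs_server")):
            continue
        try:
            for conn in proc.connections(kind="tcp"):
                if conn.status == psutil.CONN_LISTEN:
                    print(f"{proc.info['name']} (pid {proc.info['pid']}) "
                          f"listens on {conn.laddr.ip}:{conn.laddr.port}")
        except (psutil.AccessDenied, psutil.NoSuchProcess):
            pass

ray_listeners()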

@pcmoritz
Contributor

I don't think there is a flag/setting for single-node mode currently, but I think it would be OK to add a simple helper function in C++ that checks whether we are on macOS and, if so, returns 127.0.0.1 and otherwise 0.0.0.0. Later, if needed, we could migrate it to a flag, but I don't really see the need. I doubt we need to support macOS clusters any time soon (or ever).
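(For illustration only: the helper described above, sketched in Python. The real fix would live in Ray's C++ code, and default_bind_address is a hypothetical name, not an existing Ray API:)

import sys

def default_bind_address() -> str:
    """Hypothetical helper: loopback on macOS, all interfaces elsewhere."""
    return "127.0.0.1" if sys.platform == "darwin" else "0.0.0.0"

print(default_bind_address())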

@worldveil
Contributor Author

I can still repro this:

# requirements.txt
ray[default,data,tune] @ https://s3-us-west-2.amazonaws.com/ray-wheels/latest/ray-2.0.0.dev0-cp37-cp37m-macosx_10_15_intel.whl
jupyter
torch
awscli
pyarrow  # for loading parquet
scikit-learn
tensorboard
boto3
fsspec
pickle5

# dask ecosystem
dask
dask_ml
s3fs

# mlflow
mlflow
torchvision

then run:

import ray
import time

ray.init() 

@ray.remote
def test():
    time.sleep(10)
    print("hello!")
    return 10

results = ray.get([test.remote() for i in range(10)])

worldveil reopened this on Oct 25, 2021
@simon-mo
Contributor

Can you confirm the commit by running import ray; print(ray.__commit__)?

@jjyao
Collaborator

jjyao commented Oct 25, 2021

I can reproduce it with a new conda environment, but I'm not able to reproduce it with a locally built Ray at the same commit.

@jjyao
Collaborator

jjyao commented Oct 26, 2021

Also, if I just do pip install -U https://s3-us-west-2.amazonaws.com/ray-wheels/latest/ray-2.0.0.dev0-cp37-cp37m-macosx_10_15_intel.whl and run the script, there are no popups either.

@worldveil
Contributor Author

@simon-mo do you still need me to check the commit?

Thank you all for digging in here!

@jjyao
Collaborator

jjyao commented Oct 27, 2021

@worldveil I don't need the commit; I have a PR out with the fix. I'll let you verify once it's merged.

@jjyao
Collaborator

jjyao commented Nov 4, 2021

@worldveil it should have been fixed. Could you verify and close the issue?

@worldveil
Contributor Author

worldveil commented Nov 9, 2021

@jjyao I still see this on ray start --head

[Screenshot: macOS security popup (Screen Shot 2021-11-08 at 8 55 13 PM)]

@jjyao
Collaborator

jjyao commented Nov 10, 2021

@worldveil Can you try it now with the latest wheel? Sorry for so many iterations.

@jjyao
Collaborator

jjyao commented Nov 19, 2021

Confirmed with @worldveil that popups are gone.

jjyao closed this as completed on Nov 19, 2021
@worldveil
Contributor Author

@jjyao I have seen it again :/ Is there any reason I should be seeing this on Ray 1.9.x? Any regressions?

I can try to provide a repro if needed, but it is inconsistent.

@jjyao
Collaborator

jjyao commented Feb 3, 2022

No, it shouldn't happen. If you can give a repro, I'm happy to debug it.

@worldveil
Contributor Author

I have not been able to repro! Leaving this closed for now.
