
[local mode] Actors are not handled correctly #5715

Closed · opened by richardliaw on Sep 16, 2019 · 10 comments · Fixed by #5863

Comments

@richardliaw (Contributor)

The following script fails with this traceback:

Traceback (most recent call last):
  File "/Users/rliaw/Research/riselab/ray/doc/examples/parameter_server/failure.py", line 35, in <module>
    accuracies = run_sync_parameter_server()
  File "/Users/rliaw/Research/riselab/ray/doc/examples/parameter_server/failure.py", line 32, in run_sync_parameter_server
    current_weights = ps.get_weights.remote()
  File "/Users/rliaw/miniconda3/lib/python3.7/site-packages/ray/actor.py", line 148, in remote
    return self._remote(args, kwargs)
  File "/Users/rliaw/miniconda3/lib/python3.7/site-packages/ray/actor.py", line 169, in _remote
    return invocation(args, kwargs)
  File "/Users/rliaw/miniconda3/lib/python3.7/site-packages/ray/actor.py", line 163, in invocation
    num_return_vals=num_return_vals)
  File "/Users/rliaw/miniconda3/lib/python3.7/site-packages/ray/actor.py", line 588, in _actor_method_call
    function = getattr(worker.actors[self._ray_actor_id], method_name)
AttributeError: 'DataWorker' object has no attribute 'get_weights'

Reproduction script:
import ray

@ray.remote
class ParameterServer(object):
    def __init__(self, learning_rate):
        pass

    def apply_gradients(self, *gradients):
        pass

    def get_weights(self):
        pass

@ray.remote
class DataWorker(object):
    def __init__(self):
        pass

    def compute_gradient_on_batch(self, data, target):
        pass

    def compute_gradients(self, weights):
        pass


def run_sync_parameter_server():
    iterations = 50
    num_workers = 2
    ps = ParameterServer.remote(1e-4 * num_workers)
    # Create workers.
    workers = [DataWorker.remote() for i in range(num_workers)]
    current_weights = ps.get_weights.remote()

ray.init(ignore_reinit_error=True, local_mode=True)
accuracies = run_sync_parameter_server()
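The `getattr(worker.actors[self._ray_actor_id], method_name)` line in the traceback hints that local mode keeps all actor instances in one `worker.actors` dict keyed by actor id, so a colliding key would make the `ParameterServer` handle resolve to the `DataWorker` instance. A minimal sketch of that suspected failure mode, using a hypothetical registry and key (not Ray's actual implementation):

```python
# Hypothetical sketch of the suspected failure mode, NOT Ray's actual code:
# two actor handles sharing one registry key make the later actor shadow
# the earlier one, so method lookup hits the wrong class.

class ParameterServer:
    def get_weights(self):
        return "weights"

class DataWorker:
    def compute_gradients(self, weights):
        return "gradients"

actors = {}                       # stand-in for worker.actors
shared_id = "actor-0"             # assumption: a non-unique actor id

actors[shared_id] = ParameterServer()
actors[shared_id] = DataWorker()  # overwrites the ParameterServer entry

try:
    getattr(actors[shared_id], "get_weights")
except AttributeError as err:
    print(err)  # 'DataWorker' object has no attribute 'get_weights'
```

This reproduces the same AttributeError text as the traceback above, which is consistent with (but does not prove) an id collision in the local-mode actor registry.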
@richardliaw (Contributor, Author)

cc @edoakes

@justinwyang (Contributor)

justinwyang commented Sep 17, 2019

I'll take a look into this.

@mawright (Contributor)

I get a similar error with this test case:

import ray
from ray import tune
config = {"env": "CartPole-v1"}
ray.init(local_mode=True)
tune.run("PPO", config=config)
Traceback (most recent call last):
  File "/home/matt/Code/ray/python/ray/tune/trial_runner.py", line 506, in _process_trial
    result = self.trial_executor.fetch_result(trial)
  File "/home/matt/Code/ray/python/ray/tune/ray_trial_executor.py", line 347, in fetch_result
    result = ray.get(trial_future[0])
  File "/home/matt/Code/ray/python/ray/worker.py", line 2349, in get
    raise value
ray.exceptions.RayTaskError: python test.py (pid=32468, host=Rocko2)
  File "/home/matt/Code/ray/python/ray/local_mode_manager.py", line 55, in execute
    results = function(*copy.deepcopy(args))
  File "/home/matt/Code/ray/python/ray/rllib/agents/trainer.py", line 395, in train
    w.set_global_vars.remote(self.global_vars)
  File "/home/matt/Code/ray/python/ray/actor.py", line 148, in remote
    return self._remote(args, kwargs)
  File "/home/matt/Code/ray/python/ray/actor.py", line 169, in _remote
    return invocation(args, kwargs)
  File "/home/matt/Code/ray/python/ray/actor.py", line 163, in invocation
    num_return_vals=num_return_vals)
  File "/home/matt/Code/ray/python/ray/actor.py", line 588, in _actor_method_call
    function = getattr(worker.actors[self._ray_actor_id], method_name)
AttributeError: 'PPO' object has no attribute 'set_global_vars'

@nicofirst1
I'm getting the same error with:

import ray
from ray.tune import run_experiments

ray.init(num_cpus=N_CPUS, local_mode=True)  # N_CPUS defined elsewhere

# Defining the dictionary for the experiment.
experiment_params = dict(
    run="PPO",  # must be the same as the default config
    env=gym_name,  # gym_name and ppo_config defined elsewhere
    config={**ppo_config},
    checkpoint_freq=20,
    checkpoint_at_end=True,
    max_failures=999,
    stop={"training_iteration": 200},  # stop conditions
)

experiment_params = {params["exp_tag"]: experiment_params}

# Running the experiment.
trials = run_experiments(experiment_params)

with the following traceback:

ray.exceptions.RayTaskError: /anaconda3/envs/dmas/bin/python /Applications/PyCharm.app/Contents/helpers/pydev/pydevconsole.py --mode=client --port=49411 (pid=1002, host=client-145-120-37-77.surfnet.eduroam.rug.nl)
  File "/anaconda3/envs/dmas/lib/python3.6/site-packages/ray/local_mode_manager.py", line 55, in execute
    results = function(*copy.deepcopy(args))
  File "/anaconda3/envs/dmas/lib/python3.6/site-packages/ray/rllib/agents/trainer.py", line 395, in train
    w.set_global_vars.remote(self.global_vars)
  File "/anaconda3/envs/dmas/lib/python3.6/site-packages/ray/actor.py", line 148, in remote
    return self._remote(args, kwargs)
  File "/anaconda3/envs/dmas/lib/python3.6/site-packages/ray/actor.py", line 169, in _remote
    return invocation(args, kwargs)
  File "/anaconda3/envs/dmas/lib/python3.6/site-packages/ray/actor.py", line 163, in invocation
    num_return_vals=num_return_vals)
  File "/anaconda3/envs/dmas/lib/python3.6/site-packages/ray/actor.py", line 548, in _actor_method_call
    function = getattr(worker.actors[self._ray_actor_id], method_name)
AttributeError: 'PPO' object has no attribute 'set_global_vars'

@davidcotton
I'm a bit late since I can see a PR in progress, but I was able to work around this local-mode issue by downgrading to Ray 0.7.3; the regression seems to have been introduced in 0.7.4.

@nicofirst1
Still present in 0.7.5.
Debugging is getting hard.

@edoakes (Contributor)

edoakes commented Oct 8, 2019

Sorry this took so long, but it will be fixed in the next release (see the PR above). For now, if you rely heavily on local mode, I would suggest using 0.7.3.
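
Until that release ships, scripts that depend on local mode can guard on the installed version. A minimal sketch, where `is_affected` is a hypothetical helper and the affected range (0.7.4 through 0.7.5) is taken from this thread:

```python
def is_affected(version: str) -> bool:
    """Return True for Ray versions with the local-mode actor bug
    (0.7.4 and 0.7.5 per this thread; fixed in the release after 0.7.5)."""
    # Compare only the first three numeric components (major, minor, patch).
    parts = tuple(int(p) for p in version.split(".")[:3])
    return (0, 7, 4) <= parts < (0, 7, 6)

print(is_affected("0.7.5"))  # True
print(is_affected("0.7.3"))  # False
```

For example, one could check `is_affected(ray.__version__)` before calling `ray.init(local_mode=True)` and fall back to 0.7.3 if it returns True.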

@nicofirst1
My problem is that I'm using MADDPG from the contrib section of RLlib, which is not available in Ray 0.7.3.
Is there a way to update Ray while keeping RLlib as it is?

@edoakes (Contributor)

edoakes commented Oct 9, 2019

@nicofirst1 one option is to build Ray from source off of the latest master. You can follow these instructions to do so:
https://ray.readthedocs.io/en/latest/installation.html#building-ray-from-source

0.7.6 will contain the fix and should be released in the coming weeks.

@robertnishihara (Collaborator)

If you want the latest version of Ray, you can also pip install a nightly snapshot by following the instructions at https://ray.readthedocs.io/en/latest/installation.html#latest-snapshots-nightlies.
