Skip to content
New issue

Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.

By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.

Already on GitHub? Sign in to your account

Misleading error message when redispatching action is not performed correctly #39

Closed
jhmenke opened this issue Mar 4, 2020 · 9 comments

Comments

@jhmenke
Copy link

jhmenke commented Mar 4, 2020

So i have likely found another bug.

My redispatch action looks like this:

action_space["redispatch"]
Out[2]: [(0, 4.9999784936326535), (1, 4.78524395611872), (4, -9.999591852954794)]

This is translated into the following print action:

print(action)
This action will:
	 - NOT change anything to the injections
	 - perform the following redispatching action: [ 4.99997849  4.78524396  0.          0.         -9.99959185]
	 - NOT force any line status
	 - NOT switch any line status
	 - NOT switch anything in the topology
	 - NOT force any particular bus configuration

Not that indices 2,3 are not redispatched (0.) - this is correct as they are not controllable.

However, in BasicEnv.py in line 435:

if np.any(redisp_act_orig[new_p == 0.]):

this will evaluate to True because redisp_act_orig has two zeros as per the action print (nothing should be changed on the generators). This is the bug, because now a InvalidRedispatching("Impossible to dispatched a turned off generator") is thrown and the reward is -5.

Even if this was fixed, afterwards there would be

redisp_act_orig[new_p == 0.] = 0.

which would reset all redispatch. For some reason, new_p is completely 0, which would mean that all generators are shut off. I don't think this is correct as the first observation has most generators active.

Thanks for looking into it and best regards

@jhmenke
Copy link
Author

jhmenke commented Mar 4, 2020

self.gen_activeprod_t in BasicEnv.py line 379 is completely 0. despite observation.prod_p being

[81.3 80.1 12.6 0. 80.49025102]

I would not expect all gens to be shut off in the forecast/next timestep.

@BDonnot
Copy link
Collaborator

BDonnot commented Mar 5, 2020

You are correct, this is a bug that I will work on in the next version.

For now, better not use "redispatch" action in simulate.

Benjamin

@BDonnot
Copy link
Collaborator

BDonnot commented Mar 5, 2020

This bug only appear when you simulate correct ?

@BDonnot BDonnot changed the title Bug in redispatch action Bug when you "simulate" redispatch action Mar 5, 2020
@jhmenke
Copy link
Author

jhmenke commented Mar 5, 2020

No, also in the step

obs, reward, done, info = env.step(act)
print(reward)
-5.0

@BDonnot BDonnot changed the title Bug when you "simulate" redispatch action Misleading error message when redispatching action is not performed correctly Mar 5, 2020
@BDonnot
Copy link
Collaborator

BDonnot commented Mar 5, 2020

After a quick experiment, here is what is happening:

You want to modify the setpoint (by redispatching) of the 3 controlable generators. This is totally fine.

To "fix" your action, because the redispatching action must sum to 0, an "automaton" [imagine something that has the same role as the frequency control automaton in real life] takes care of that. It can act only on dispatchable generators. But you modify all the three.

As your action does not sum to 0:

import numpy as np
import grid2op
env = grid2op.make()
act = env.action_space({"redispatch": [(0, 4.9999784936326535), (1, 4.78524395611872), (4, -9.999591852954794)]})
np.sum(act._redispatch)
# -0.21436940320342046

then the action is not implemented because the constraints "I must have a redispatching summing at 0." can't be met.

You can see that with:

obs, reward, done, info = env.step(act)
print(info["is_dipatching_illegal"])
# it prints True

The error message is completely misleading i agree some work need to be done in this case.

This error is easily "fixed" you need to provide a vector that sums at 0, like this for example:

act2 = env.action_space({"redispatch": np.array([ 4.99997849,  4.78524396,  0.        ,  0.        , -9.78522245])})
obs, reward, done, info = env.step(act2)
print(reward)
# 221.22035009361286

@jhmenke
Copy link
Author

jhmenke commented Mar 5, 2020

Hmm that's a bit weird, because the -0.21 in this case are the decrease of system losses. But i understand your idea, so i can just scale the values correctly and it should work.

@BDonnot
Copy link
Collaborator

BDonnot commented Mar 5, 2020

This constraint of "redispatch should sum to 0." is to impose the system to always meet "sum Prod = sum Load + losses" and not interfer too much with the slack bus.
This constraints is met (more or less, up to +/- 1MW say) when the chronics have been generated.

If your vector does not sum at 0, then some agent could only decrease the values of one generator, and the slack bus would absorb all. This is only "a reasonable approximation" with a distributed slack bus (where this value is split on different generators), but some powerflows don't support this feature. That's why I added, directly in the environment, this "automaton" that enforce the redispatching vector to sum at 0.

Scaling up your vector would work in this case, but make sure that the ramps and pmin / pmax are met when you do that :-)

@jhmenke
Copy link
Author

jhmenke commented Mar 5, 2020

Sure, i have to translate the OPF results correctly, which should be doable. Now i only have the issue in simulate like you already wrote. Thanks!

@BDonnot
Copy link
Collaborator

BDonnot commented Mar 20, 2020

Hello,

This issue should be fixed in the latest (0.5.8) grid2op issue, where i check specifically for this case and throw the appropriate error (that should then help debugging much more easily in this case).

Thanks for noticing and suggesting this improvement. If you think the issue has been solved after upgrading, feel free to close it :-)

Benjamin

@jhmenke jhmenke closed this as completed Mar 23, 2020
BDonnot added a commit that referenced this issue May 19, 2020
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment
Labels
None yet
Projects
None yet
Development

No branches or pull requests

2 participants