Simulation error at some timesteps #372
Comments
Is it possible for you to share your input file?
import math
l0 = 2.0*math.pi # laser wavelength [in code units]
t0 = l0 # optical cycle
Lsimx = 300.*l0 # length of the simulation box in x (= 240 mum)
Lsimy = 80.*l0 # length of the simulation box in y (= 64 mum)
Tsim = 1500.*t0 # duration of the simulation (= 4 ps)
resx = 16. # nb of cells in one laser wavelength along x
resy = 2. # nb of cells in one laser wavelength along y
rest = resx/0.8 # nb of timesteps in one optical cycle
dt = t0/rest
nx = 300*16 # nb of cells along x-axis
npatchx = 32 # nb of patches along x-axis
# -----------
# n0max = 5.734e-3 # electron density (code units =>1=plasma critical density)
# initial density profile of electrons
# def n0_electron(x,y):
# return 1./(1.+math.exp((abs(x-x0)-widthx)/Lx))*n0max
#------------
# DEFINING SMILEI's VARIABLES
# All in "blocks"
Main(
geometry = "2Dcartesian",
interpolation_order = 2,
timestep = dt,
simulation_time = Tsim,
cell_length = [l0/resx,l0/resy],
grid_length = [Lsimx,Lsimy],
number_of_patches = [32,16],
clrw = nx//npatchx, # integer division: the cluster width is a whole number of cells
reference_angular_frequency_SI = 2.0*math.pi*3e8/0.8e-6,
EM_boundary_conditions = [["silver-muller","silver-muller"],["silver-muller","silver-muller"],],
solve_poisson = False,
print_every = 100,
random_seed = smilei_mpi_rank
)
MovingWindow(
time_start = 0.9*Main.grid_length[0], # in Smilei's normalized units c = 1, so a length also reads as a time
velocity_x = 0.9997
)
LoadBalancing(
initial_balance = False,
every = 20,
cell_load = 1.,
)
Species(
name = "electron",
position_initialization = "regular",
momentum_initialization = "cold",
particles_per_cell = 4,
mass = 1.0,
charge = -1.0,
number_density = 0.005734,# electron density (code units =>1=plasma critical density)
mean_velocity = [0.0,0.0,0.0],
pusher = "boris",
time_frozen = 0.0,
boundary_conditions = [["remove","remove"],["remove","remove"],],
)
FWHMinI = 12.5*l0 #(= 10 mum)
waistinI = FWHMinI/(2.0*math.sqrt(math.log(2.0)))
waistinE = FWHMinI/math.sqrt(2.0*math.log(2.0))
FWHMtinI = 8.0*t0 #(= 21.3 fs)
FWHMtinE = FWHMtinI*math.sqrt(2.0) #(= 30 fs)
diagEvery = int(37.5*t0/dt) # frequency of outputs(= 100 fs)
LaserGaussian2D(
box_side = "xmin",
a0 = 2.6, # intensity 1.45 e+19
omega = 1.,
focus = [0.,Main.grid_length[1]/2.],
waist = waistinE,
time_envelope = tgaussian(start=0.,duration=2.*FWHMtinE,fwhm=FWHMtinE,center=FWHMtinE)
)
LaserGaussian2D(
box_side = "xmin",
a0 = 2.6, # intensity 1.45 e+19
omega = 1.,
focus = [0.,Main.grid_length[1]/2.],
waist = waistinE,
time_envelope = tgaussian(start=2.0*math.pi,duration=2.*FWHMtinE,fwhm=FWHMtinE,center=FWHMtinE)
)
LaserGaussian2D(
box_side = "xmin",
a0 = 2.6, # intensity 1.45 e+19
omega = 1.,
focus = [0.,Main.grid_length[1]/2.],
waist = waistinE,
time_envelope = tgaussian(start=4.0*math.pi,duration=2.*FWHMtinE,fwhm=FWHMtinE,center=FWHMtinE)
)
LaserGaussian2D(
box_side = "xmin",
a0 = 2.6, # intensity 1.45 e+19
omega = 1.,
focus = [0.,Main.grid_length[1]/2.],
waist = waistinE,
time_envelope = tgaussian(start=6.0*math.pi,duration=2.*FWHMtinE,fwhm=FWHMtinE,center=FWHMtinE)
)
list_fields = ['Ex','Ey','Bz','Rho','Jx']
DiagFields(
every = diagEvery,
fields = list_fields
)
DiagProbe(
every = diagEvery,
origin = [0.,Main.grid_length[1]/2.],
corners = [[Main.grid_length[0],Main.grid_length[1]/2.],
],
number = [nx],
fields = list_fields
)
DiagScalar(
every = int(diagEvery/10),
vars=['Uelm','Ukin_electron',
'ExMax','ExMaxCell','EyMax','EyMaxCell','RhoMin',
'RhoMinCell','Ukin_bnd','Uelm_bnd','Ukin_out_mvw','Ukin_inj_mvw','Uelm_out_mvw','Uelm_inj_mvw','Utot']
)
DiagParticleBinning(
deposited_quantity = "weight",
every = diagEvery,
species = ["electron"],
axes = [
["moving_x",0., Lsimx, 300],
["ekin",1.,500.,200]
]
)
DiagParticleBinning(
deposited_quantity = "weight",
every = diagEvery,
species = ["electron"],
axes = [
["moving_x",0., Lsimx, 300],
["px",-1.,1.,100]
]
)
DiagParticleBinning(
deposited_quantity = "weight",
every = diagEvery,
species = ["electron"],
axes = [
["y",0., Lsimy, 300],
["py",-1.,1.,100]
]
)
DiagParticleBinning(
deposited_quantity = "weight",
every = diagEvery,
species = ["electron"],
axes = [
["moving_x",0., Lsimx, 300],
["px",-1.,1.,100]
]
)
DiagParticleBinning(
deposited_quantity = "weight",
every = diagEvery,
species = ["electron"],
axes = [
["y",0., Lsimy, 300],
["py",-1.,1.,100]]
)
DiagPerformances(
every = diagEvery
)
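(As an aside: the four LaserGaussian2D blocks above differ only in the start of their time envelope, at 0, l0, 2*l0 and 3*l0. Since the namelist is executed as plain Python, an equivalent minimal sketch would generate them in a loop; this is only an illustration, not part of the submitted file.)
for k in range(4):
    LaserGaussian2D(
        box_side = "xmin",
        a0 = 2.6,                 # intensity 1.45e+19, as noted above
        omega = 1.,
        focus = [0., Main.grid_length[1]/2.],
        waist = waistinE,
        time_envelope = tgaussian(start=k*l0, duration=2.*FWHMtinE,
                                  fwhm=FWHMtinE, center=FWHMtinE)
    )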
Check the line "nx = 30016 # nb of cells along x-axis" in your code. Is it the correct value of nx?
nx = 300*16 is correct.
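For reference, a quick sanity check of the quantities derived from the namelist (plain Python, using only values that appear above; the 30000 time-steps match the n_time reported in the log further down):
import math
l0 = 2.0*math.pi                  # laser wavelength in code units, as in the namelist
t0 = l0
resx, resy = 16., 2.
rest = resx/0.8                   # 20 time-steps per optical cycle
nx = 300*16                       # 4800 cells along x (= Lsimx / (l0/resx))
ny = int(round(80.*resy))         # 160 cells along y (= Lsimy / (l0/resy))
dt = t0/rest
n_time = int(round(1500.*t0/dt))  # 30000 time-steps
print(nx, ny, dt, n_time)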
Respected Sir, I have loaded these modules in my environment before the simulation:
$ module list
Do you know if this error always occurs at the same time?
Yes, if I use this input file.
I am unable to reproduce the segfault unfortunately. Could you remove all diagnostics and try again? Then remove the LoadBalancing and try again?
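Since a Smilei namelist is executed as plain Python, one minimal way to run this test is to guard the optional blocks with a flag and switch it off; the enable_* switches below are only an illustration, not part of the original file:
enable_diags = False              # set True to restore the original behaviour
enable_load_balancing = False

if enable_load_balancing:
    LoadBalancing(
        initial_balance = False,
        every = 20,
        cell_load = 1.,
    )

if enable_diags:
    DiagFields(
        every = diagEvery,
        fields = list_fields
    )
    # ...wrap DiagProbe, DiagScalar, DiagParticleBinning and DiagPerformances the same way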
Okay sir.
There is no error in your script; I tested it and it works fine. Sometimes a simulation terminates before completion when there is a memory issue. Check that you have sufficient memory to run the simulation and to save the output data. This happened to me in the past.
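As a rough back-of-the-envelope check (plain Python; the per-particle and per-cell byte counts are assumptions, not Smilei figures — the "Memory consumption" and "Expected disk usage" reports that Smilei prints at startup, visible in the log further down, are the authoritative numbers):
nx, ny = 300*16, 160                        # cells, from the namelist above
particles_per_cell = 4
n_particles = nx*ny*particles_per_cell      # ~3.1 million electrons
bytes_per_particle = 100                    # assumed rough figure, including overhead
bytes_per_cell = 8*10                       # assumed: ~10 double-precision field/current/density arrays
mem_gb = (n_particles*bytes_per_particle + nx*ny*bytes_per_cell)/1e9
print("rough memory estimate: %.2f GB" % mem_gb)   # of order a few hundred MB for this setup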
Thank you sir for your suggestion. I will check this.
I am facing the same problem; please help me solve it. This is the command for job submission:
qsub -q low -I -P physics -N test -M $[email protected] -l select=4:ncpus=8 -l walltime=01:00:00 -l select= n # request n number of nodes/slots
[phz178389@khas026 ~/LWFA]
Please specify who you are responding to. Same problem as what?
Simulation error at some timesteps.
Please explain better. What did you do to obtain the same problem?
I am attaching the error found during the simulation.
$ mpirun -np 2 $HOME/Smilei-master/smilei LWFA.py
Reading the simulation parameters
HDF5 version 1.8.16
Geometry: 2Dcartesian
Load Balancing:
Vectorization:
Patch arrangement :
Patch arrangement :
Initializing MPI
OpenMP
Initializing the restart environment
Initializing moving window
Initializing species
Initializing laser parameters
Initializing Patches
Creating Diagnostics, antennas, and external fields
finalize MPI
Applying external fields at time t = 0
Applying prescribed fields at time t = 0
Initializing diagnostics
Running diags at time t = 0
Species creation summary
Memory consumption
Expected disk usage (approximate)
Cleaning up python runtime environement
Time-Loop started: number of time-steps n_time = 30000
Stack trace (most recent call last):
===================================================================================
You already sent the error before. I asked what you did to obtain it. Did you modify something? Please make an effort to be clearer.
I have made these changes but the problem is the same.
I have made these changes.
Ok
This is working on my personal laptop but not on the HPC cluster.
This means the cluster has something installed incorrectly. You can try asking them if they have a different version of the compilers.
Thank you for your help, I will ask them.
Respected Sir,
I am facing a problem running my code on the HPC cluster. I am sending you the errors.
Attaching 9320 to 1476299.pbshpc