Add DataFrame.min and DataFrame.max over 'set_ids' axis or MeshIndex #333

Merged: 16 commits, Apr 4, 2023
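For orientation, a minimal usage sketch of the API this pull request adds. It mirrors the new example script and the docstrings in the diff below; the result file comes from the bundled ``examples`` module, so nothing here goes beyond what the PR itself exercises.

```python
from ansys.dpf import post
from ansys.dpf.post import examples

simulation = post.StaticMechanicalSimulation(examples.download_crankshaft())
displacement = simulation.displacement(all_sets=True)

# Default axis is the MeshIndex: extrema across the mesh, one value per component and set
print(displacement.max())

# Axis "set_ids": extrema across sets (time/frequency), one value per node and component
print(displacement.min(axis="set_ids"))
```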
4 changes: 2 additions & 2 deletions .github/workflows/ci.yml
@@ -135,7 +135,7 @@ jobs:
      - uses: actions/checkout@v3

      - name: "Set licensing if necessary"
-       if: env.ANSYS_VERSION > 231
+       if: matrix.ANSYS_VERSION > 231
        shell: bash
        run: |
          echo "ANSYS_DPF_ACCEPT_LA=Y" >> $GITHUB_ENV
@@ -183,7 +183,7 @@ jobs:
      - name: "Upload Test Results"
        uses: actions/upload-artifact@v3
        with:
-         name: ${{ env.PACKAGE_NAME }}_${{ matrix.python-version }}_${{ matrix.os }}_pytest_${{ env.ANSYS_VERSION }}
+         name: ${{ env.PACKAGE_NAME }}_${{ matrix.python-version }}_${{ matrix.os }}_pytest_${{ matrix.ANSYS_VERSION }}
          path: tests/junit/test-results.xml
        if: always()

4 changes: 2 additions & 2 deletions .github/workflows/ci_release.yml
@@ -131,7 +131,7 @@ jobs:
      - uses: actions/checkout@v3

      - name: "Set licensing if necessary"
-       if: env.ANSYS_VERSION > 231
+       if: matrix.ANSYS_VERSION > 231
        shell: bash
        run: |
          echo "ANSYS_DPF_ACCEPT_LA=Y" >> $GITHUB_ENV
@@ -179,7 +179,7 @@ jobs:
      - name: "Upload Test Results"
        uses: actions/upload-artifact@v3
        with:
-         name: ${{ env.PACKAGE_NAME }}_${{ matrix.python-version }}_${{ matrix.os }}_pytest_${{ env.ANSYS_VERSION }}
+         name: ${{ env.PACKAGE_NAME }}_${{ matrix.python-version }}_${{ matrix.os }}_pytest_${{ matrix.ANSYS_VERSION }}
          path: tests/junit/test-results.xml
        if: always()

80 changes: 80 additions & 0 deletions examples/01-Detailed-Examples/06-compute-min-max.py
@@ -0,0 +1,80 @@
"""
.. _ref_compute_statistics_example:

Compute minimum and maximum of a DataFrame
==========================================
In this example, mechanical displacement data over several time steps is used
to show how to compute the min or max of a given DataFrame.
"""

###############################################################################
# Perform required imports
# ------------------------
# This example uses a supplied file that you can
# get using the ``examples`` module.

from ansys.dpf import post
from ansys.dpf.post import examples

###############################################################################
# Get ``Simulation`` object
# -------------------------
# Get the ``Simulation`` object that allows access to the result. The ``Simulation``
# object must be instantiated with the path for the result file. For example,
# ``"C:/Users/user/my_result.rst"`` on Windows or ``"/home/user/my_result.rst"``
# on Linux.

example_path = examples.download_crankshaft()
simulation = post.StaticMechanicalSimulation(example_path)

# print the simulation to get an overview of what's available
print(simulation)

###############################################################################
# Extract displacement data
# -------------------------

displacement = simulation.displacement(all_sets=True)
print(displacement)

###############################################################################
# Compute the maximum displacement for each component at each time-step
# ---------------------------------------------------------------------

# The default axis is the MeshIndex
maximum_over_mesh = displacement.max()
print(maximum_over_mesh)
# is equivalent to
maximum_over_mesh = displacement.max(axis="node_ids")
print(maximum_over_mesh)

# Compute the maximum displacement for each node and component across time
# ------------------------------------------------------------------------
maximum_over_time = displacement.max(axis="set_ids")
print(maximum_over_time)

# Compute the maximum displacement overall
# ----------------------------------------
maximum_overall = maximum_over_time.max()
print(maximum_overall)

###############################################################################
# Compute the minimum displacement for each component at each time-step
# ---------------------------------------------------------------------

# The default axis is the MeshIndex
minimum_over_mesh = displacement.min()
print(minimum_over_mesh)
# is equivalent to
minimum_over_mesh = displacement.min(axis="node_ids")
print(minimum_over_mesh)

# Compute the minimum displacement for each node and component across time
# ------------------------------------------------------------------------
minimum_over_time = displacement.min(axis="set_ids")
print(minimum_over_time)

# Compute the minimum displacement overall
# ----------------------------------------
minimum_overall = minimum_over_time.min()
print(minimum_overall)
202 changes: 201 additions & 1 deletion src/ansys/dpf/post/dataframe.py
Expand Up @@ -77,6 +77,8 @@ def __init__(
        self._last_display_width = display_width
        self._last_display_max_colwidth = display_max_colwidth

        self._last_minmax: dict = {"axis": None, "min": None, "max": None}

    @property
    def columns(self) -> MultiIndex:
        """Returns the MultiIndex for the columns of the DataFrame."""
@@ -513,7 +515,7 @@ def treat_elemental_nodal(treat_lines, pos, n_comp, n_ent, n_lines):
else empty
for i in range(len(combination))
]
-to_append.append(empty)
+to_append.append(empty)  # row where row index headers are
# Get data in the FieldsContainer for those positions
# Create label_space from combination
label_space = {}
@@ -833,3 +835,201 @@ def animate(
        return fc.animate(
            save_as=save_as, deform_by=deform_by, scale_factor=scale_factor, **kwargs
        )

    def min(self, axis: Union[int, str, None] = 0) -> Union[DataFrame, float]:
        """Return the minimum value over the requested axis.

        Parameters
        ----------
        axis:
            Axis to perform the minimum across.
            Defaults to the MeshIndex (0), the row index containing mesh entity IDs.
            This computes the minimum across the mesh for each set.
            Can also be the SetIndex (1), the column index containing set (time/frequency) IDs.
            This computes the minimum across sets (time/frequency) for each mesh entity.

        Returns
        -------
        A scalar if the result of the query is a single number,
        or a DataFrame if the result contains several values along one or more axes.

        Examples
        --------
        >>> from ansys.dpf import post
        >>> from ansys.dpf.post import examples
        >>> simulation = post.StaticMechanicalSimulation(examples.download_crankshaft())
        >>> displacement = simulation.displacement(all_sets=True)
        >>> # Compute the minimum displacement value for each component at each time-step
        >>> minimum_over_mesh = displacement.min(axis="node_ids")
        >>> print(minimum_over_mesh)  # doctest: +NORMALIZE_WHITESPACE
             results       U (m)
             set_ids           1           2           3
          components
                   X -7.4732e-04 -1.5081e-03 -2.2755e-03
                   Y -4.0138e-04 -8.0316e-04 -1.2014e-03
                   Z -2.1555e-04 -4.3299e-04 -6.5101e-04
        >>> # Compute the minimum displacement for each node and component across time
        >>> minimum_over_time = displacement.min(axis="set_ids")
        >>> print(minimum_over_time)  # doctest: +NORMALIZE_WHITESPACE
             results       U (m)
            node_ids  components
                4872           X -3.4137e-05
                               Y  5.1667e-04
                               Z -4.1346e-06
                9005           X -5.5625e-05
                               Y  4.8445e-04
                               Z -4.9795e-07
                 ...
        >>> # Compute the minimum displacement overall
        >>> minimum_overall = minimum_over_time.min()
        >>> print(minimum_overall)  # doctest: +NORMALIZE_WHITESPACE
             results       U (m)
          components
                   X -2.2755e-03
                   Y -1.2014e-03
                   Z -6.5101e-04
        """
        self._query_min_max(axis)
        return self._last_minmax["min"]

    def max(self, axis: Union[int, str, None] = 0) -> Union[DataFrame, float]:
        """Return the maximum value over the requested axis.

        Parameters
        ----------
        axis:
            Axis to perform the maximum across.
            Defaults to the MeshIndex (0), the row index containing mesh entity IDs.
            This computes the maximum across the mesh for each set.
            Can also be the SetIndex (1), the column index containing set (time/frequency) IDs.
            This computes the maximum across sets (time/frequency) for each mesh entity.

        Returns
        -------
        A scalar if the result of the query is a single number,
        or a DataFrame if the result contains several values along one or more axes.

        Examples
        --------
        >>> from ansys.dpf import post
        >>> from ansys.dpf.post import examples
        >>> simulation = post.StaticMechanicalSimulation(examples.download_crankshaft())
        >>> displacement = simulation.displacement(all_sets=True)
        >>> # Compute the maximum displacement value for each component at each time-step
        >>> maximum_over_mesh = displacement.max(axis="node_ids")
        >>> print(maximum_over_mesh)  # doctest: +NORMALIZE_WHITESPACE
             results       U (m)
             set_ids           1          2          3
          components
                   X  7.3303e-04 1.4495e-03 2.1441e-03
                   Y  1.3962e-03 2.7884e-03 4.1656e-03
                   Z  2.1567e-04 4.3321e-04 6.5135e-04
        >>> # Compute the maximum displacement for each node and component across time
        >>> maximum_over_time = displacement.max(axis="set_ids")
        >>> print(maximum_over_time)  # doctest: +NORMALIZE_WHITESPACE
             results       U (m)
            node_ids  components
                4872           X  5.6781e-06
                               Y  1.5417e-03
                               Z -2.6398e-06
                9005           X -2.6323e-06
                               Y  1.4448e-03
                               Z  5.3134e-06
                 ...
        >>> # Compute the maximum displacement overall
        >>> maximum_overall = maximum_over_time.max()
        >>> print(maximum_overall)  # doctest: +NORMALIZE_WHITESPACE
             results       U (m)
          components
                   X 2.1441e-03
                   Y 4.1656e-03
                   Z 6.5135e-04
        """
        self._query_min_max(axis)
        return self._last_minmax["max"]

    def _query_min_max(self, axis: Union[int, str, None]) -> None:
        """Create a DPF workflow based on the query arguments for min/max."""
        # Normalize the axis argument to 0 (MeshIndex) or 1 (SetIndex)
        if axis in [None, 0, self.index.mesh_index.name]:
            axis = 0
        elif axis in [1, ref_labels.set_ids]:
            axis = 1
        else:
            raise ValueError(f"'{axis}' is not an available axis value.")

        # If the same query as the last one was requested, keep the cached result
        if self._last_minmax["axis"] == axis and self._last_minmax["axis"] is not None:
            return
        # If in need of an update, create the appropriate workflow
        wf = dpf.Workflow(server=self._fc._server)
        wf.progress_bar = False

        # If over the mesh
        if axis == 0:
            min_max_op = dpf.operators.min_max.min_max_over_label_fc(
                fields_container=self._fc,
                label="time",
                server=self._fc._server,
            )
            # Here the resulting fields are located on the label ("time"), so they have to be
            # "transposed": extract the data for each time (entity) from the field and build
            # a new fields container with one field per time.
            min_fc = dpf.FieldsContainer(server=self._fc._server)
            min_fc.add_label(label="time")
            min_field = min_max_op.outputs.field_min()
            for i, time in enumerate(min_field.scoping.ids):
                min_fc.add_field(
                    label_space={"time": time},
                    field=dpf.fields_factory.field_from_array(
                        arr=min_field.get_entity_data(i),
                        server=self._fc._server,
                    ),
                )

            max_fc = dpf.FieldsContainer(server=self._fc._server)
            max_fc.add_label(label="time")
            max_field = min_max_op.outputs.field_max()
            for i, time in enumerate(max_field.scoping.ids):
                max_fc.add_field(
                    label_space={"time": time},
                    field=dpf.fields_factory.field_from_array(
                        arr=max_field.get_entity_data(i),
                        server=self._fc._server,
                    ),
                )

            index = MultiIndex(
                indexes=[i for i in self.index if i != self.index.mesh_index]
            )
            columns = self.columns

        # If over time
        else:
            min_max_op = dpf.operators.min_max.min_max_over_time_by_entity(
                fields_container=self._fc,
                server=self._fc._server,
            )
            wf.set_output_name("min", min_max_op.outputs.min)
            wf.set_output_name("max", min_max_op.outputs.max)

            index = self.index
            columns = MultiIndex(
                indexes=[c for c in self.columns if c != self.columns.set_ids]
            )

            min_fc = wf.get_output("min", dpf.types.fields_container)
            max_fc = wf.get_output("max", dpf.types.fields_container)

        self._last_minmax["min"] = DataFrame(
            data=min_fc,
            index=index,
            columns=columns,
        )
        self._last_minmax["max"] = DataFrame(
            data=max_fc,
            index=index,
            columns=columns,
        )
        self._last_minmax["axis"] = axis
36 changes: 36 additions & 0 deletions tests/test_dataframe.py
@@ -4,6 +4,7 @@
from pytest import fixture

from ansys.dpf import post
from ansys.dpf.post import examples
from ansys.dpf.post.index import (
    CompIndex,
    LabelIndex,
@@ -259,3 +260,38 @@ def test_dataframe_array_raise(transient_rst):
        ValueError, match="Can only export to array if the DataFrame contains a single"
    ):
        _ = df.array


def test_dataframe_min_max():
    simulation = post.TransientMechanicalSimulation(examples.download_crankshaft())
    df = simulation.displacement(all_sets=True)
    # Over the mesh entities
    min_over_mesh = [[-0.00074732, -0.00040138, -0.00021555]]
    assert np.all(np.isclose(df.min()._fc[0].data.tolist(), min_over_mesh))
[Inline review thread on the assertion above]
MaxJPRey (Contributor), Mar 16, 2023:
Is it expected to expose and use a protected attribute _fc for the field containers?

Author (Contributor):
@MaxJPRey not to the user. We do have a ._core_object property, though, which we will implement for all classes wrapping a Core entity to give easy access in case it is needed. Here in the tests this allows me to compare the result of the max operation with doing the same thing directly on the data held by the underlying fields container.

Contributor:
OK, got it. Thanks for the info.
[End of review thread]

    assert np.all(np.isclose(df.min(axis=0)._fc[0].data.tolist(), min_over_mesh))
    assert np.all(
        np.isclose(df.min(axis="node_ids")._fc[0].data.tolist(), min_over_mesh)
    )

    max_over_mesh = [[0.00073303, 0.00139618, 0.00021567]]
    assert np.all(np.isclose(df.max()._fc[0].data.tolist(), max_over_mesh))
    assert np.all(np.isclose(df.max(axis=0)._fc[0].data.tolist(), max_over_mesh))
    assert np.all(
        np.isclose(df.max(axis="node_ids")._fc[0].data.tolist(), max_over_mesh)
    )

    # Over the SetIndex
    min_over_time = [-3.41368775e-05, 5.16665595e-04, -4.13456506e-06]
    assert np.all(np.isclose(df.min(axis=1)._fc[0].data[0].tolist(), min_over_time))
    assert np.all(
        np.isclose(df.min(axis="set_ids")._fc[0].data[0].tolist(), min_over_time)
    )
    max_over_time = [5.67807472e-06, 1.54174694e-03, -2.63976203e-06]
    assert np.all(np.isclose(df.max(axis=1)._fc[0].data[0].tolist(), max_over_time))
    assert np.all(
        np.isclose(df.max(axis="set_ids")._fc[0].data[0].tolist(), max_over_time)
    )

    # Raise on an unrecognized axis
    with pytest.raises(ValueError, match="is not an available axis value"):
        df.max(axis="raises")