Release: Refactor Load Types #2241

Merged
merged 82 commits into main from refactor/loadtype on Jan 21, 2025
82 commits
ed9de89
basic outline of return types
mikedh Jun 13, 2024
5df249f
resolver type hints
mikedh Jun 17, 2024
78567b7
Merge branch 'main' into refactor/loadtype
mikedh Jul 11, 2024
3553255
merge main and try lighter touch
mikedh Jul 11, 2024
9aa327d
merge main
mikedh Jul 11, 2024
30ca68b
merge
mikedh Sep 1, 2024
e5732dc
merge main
mikedh Oct 22, 2024
84b5565
wip
mikedh Oct 23, 2024
c73170c
Updated use of weights in procrustes analysis.
Oct 30, 2024
20dc9ed
Fix nearest.ipynb Typo
Tallerpie Dec 1, 2024
145ba83
Merge pull request #2313 from tillns/main
mikedh Dec 3, 2024
1ac2419
Merge pull request #2327 from Tallerpie/patch-1
mikedh Dec 3, 2024
6dedb95
update test_scenegraph to pytest style
mikedh Dec 3, 2024
209d0a5
codespell
mikedh Dec 3, 2024
52737d8
change test
mikedh Dec 3, 2024
4833682
Merge branch 'main' into refactor/loadtype
mikedh Dec 3, 2024
942ac83
pass test_minimal anyway
mikedh Dec 3, 2024
6d4e34e
pass test_loaded
mikedh Dec 3, 2024
a1e6877
move all fetching into WebResolver
mikedh Dec 3, 2024
6e7fd43
fix some metadata passing
mikedh Dec 3, 2024
9a8e969
hmm
mikedh Dec 3, 2024
f2f079a
metadata to every geometry
mikedh Dec 4, 2024
5d71ab1
try matching old behavior
mikedh Dec 4, 2024
9645641
fix more tests
mikedh Dec 4, 2024
75fd8cc
fix test_gltf
mikedh Dec 4, 2024
e76ce2b
off also return mesh
mikedh Dec 4, 2024
751ec61
match more old behavior
mikedh Dec 4, 2024
181dbcb
fix test_scene
mikedh Dec 5, 2024
a25082b
fix for #2330
mikedh Dec 15, 2024
f5b9b55
need a metadata policy
mikedh Dec 16, 2024
d608c3a
apply Jan25 resources.get deprecation
mikedh Dec 16, 2024
9b51b6a
Also add map_Kd to the OBJ/MTL material kwargs
ChuangTseu Dec 17, 2024
ff2a542
fix and test voxels in scenes
mikedh Dec 17, 2024
8d4693c
fix and test #2335
mikedh Dec 26, 2024
5b580b6
fix rounding issue in uv_to_color()
TooSchoolForCool Dec 27, 2024
c46eca1
Merge pull request #2336 from TooSchoolForCool/main
mikedh Jan 2, 2025
693081a
Merge pull request #2328 from mikedh/release/weight
mikedh Jan 2, 2025
a5ca023
fix and test divide-by-zero in visual.interpolate
mikedh Jan 2, 2025
08328c3
try tacking on load info as an attribute
mikedh Jan 2, 2025
6b5373e
deprecate Path3D.to_planar->Path3D.to_2D
mikedh Jan 9, 2025
f8dbf9a
fix test_gltf
mikedh Jan 9, 2025
273c92d
fix deprecation wrapper
mikedh Jan 9, 2025
381abe7
blender booleans
mikedh Jan 10, 2025
37b93e9
wrap _uri_to_bytes
mikedh Jan 12, 2025
94d35f5
fixed not loading point cloud colors from glb format files
henryzhengr Jan 14, 2025
f50e262
include a LoadSource for all geometry
mikedh Jan 14, 2025
eb4a088
run blender tests
mikedh Jan 14, 2025
13d1d65
make source available more easily
mikedh Jan 14, 2025
517fecb
should source really be optional
mikedh Jan 14, 2025
aa7e826
py38 syntax
mikedh Jan 14, 2025
90409b4
clean up and expand corpus and fix surfaced bugs
mikedh Jan 14, 2025
83425db
fix more tests
mikedh Jan 15, 2025
eb60b49
try adding in weights
mikedh Jan 15, 2025
63a92f6
some type fixes
mikedh Jan 15, 2025
24d2db1
add missing import
mikedh Jan 16, 2025
c3c858d
fix deepcopy override
mikedh Jan 16, 2025
fab9580
load_dict shenanigans
mikedh Jan 16, 2025
837270e
fix util type hint
mikedh Jan 16, 2025
780a572
run corpus again
mikedh Jan 16, 2025
eaa6004
remove dae until pycollada/pycollada/147 releases
mikedh Jan 16, 2025
ffdc7eb
add report logic
mikedh Jan 17, 2025
f72cef9
apply march 2024 deprecation of graph.smoothed
mikedh Jan 17, 2025
d0e0ac9
disable dae
mikedh Jan 17, 2025
a94c287
fixed not loading point cloud colors from glb format files (#2339)
mikedh Jan 17, 2025
2a63259
add hsv_to_rgba and roundtrip test for #2339
mikedh Jan 18, 2025
e696ca2
type hints and cleanup on color
mikedh Jan 19, 2025
6b6a08b
clip hsv and add some type hints
mikedh Jan 19, 2025
ce82b3b
fix typo
mikedh Jan 19, 2025
9539bb8
type hint
mikedh Jan 19, 2025
2f30869
fix beartype odd behavior
mikedh Jan 19, 2025
7bb986b
tuples arent arraylike
mikedh Jan 19, 2025
e588abd
remove type hint
mikedh Jan 19, 2025
092e017
make source a property
mikedh Jan 20, 2025
fa076ac
fix typo
mikedh Jan 20, 2025
4eb0cfc
remove deprecated Scene.deduplicated
mikedh Jan 20, 2025
96f0040
make file_obj any
mikedh Jan 20, 2025
195cc6a
remove source kwarg
mikedh Jan 21, 2025
db11fe3
Repairing the convex hull, even in the case of NumPy input
guystoppi Jan 21, 2025
f55646a
`trimesh.bounds.oriented_bounds` is slow for NumPy array input. (#2342)
mikedh Jan 21, 2025
63e06b9
Also add map_Kd to the OBJ/MTL material kwargs (#2332)
mikedh Jan 21, 2025
44d5fc9
add test for #2332
mikedh Jan 21, 2025
b15df31
fix typo
mikedh Jan 21, 2025
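
Taken together, the commits above change what callers get back from the loaders: `trimesh.load` still returns a `Scene` or a single geometry depending on the file contents, but every loaded geometry now carries a `source` property describing where it came from (see the `LoadSource` commits above and the `tests/regression.py` hunk below). A minimal sketch of the new access pattern; the file path is a placeholder, and only `source.file_name` is shown since that is the field exercised in this diff:

import trimesh

# the return type depends on the file: a Scene for multi-geometry formats,
# or a single Trimesh / Path / PointCloud for simpler ones
loaded = trimesh.load("models/example.glb")  # placeholder path
print(type(loaded).__name__)

if isinstance(loaded, trimesh.Scene):
    # per the "include a LoadSource for all geometry" commit, each geometry
    # in the scene records its own load source
    for name, geometry in loaded.geometry.items():
        print(name, geometry.source.file_name)
else:
    print(loaded.source.file_name)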
6 changes: 3 additions & 3 deletions .github/workflows/release.yml
@@ -54,7 +54,7 @@ jobs:

pypi:
name: Release To PyPi
needs: [tests, containers]
needs: [tests, containers, corpus]
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
@@ -118,13 +118,13 @@ jobs:
- name: Install Trimesh
run: pip install .[easy,test]
- name: Run Corpus Check
run: python tests/corpus.py
run: python tests/corpus.py -run

release:
permissions:
contents: write # for actions/create-release
name: Create GitHub Release
needs: [tests, containers]
needs: [tests, containers, corpus]
runs-on: ubuntu-latest
steps:
- name: Checkout code
2 changes: 1 addition & 1 deletion .github/workflows/test.yml
@@ -68,5 +68,5 @@ jobs:
- name: Install Trimesh
run: pip install .[easy,test]
- name: Run Corpus Check
run: python tests/corpus.py
run: python tests/corpus.py -run

18 changes: 9 additions & 9 deletions docs/requirements.txt
@@ -1,13 +1,13 @@
pypandoc==1.13
pypandoc==1.14
recommonmark==0.7.1
jupyter==1.0.0
jupyter==1.1.1

# get sphinx version range from furo install
furo==2024.5.6
myst-parser==3.0.1
pyopenssl==24.1.0
autodocsumm==0.2.12
jinja2==3.1.4
matplotlib==3.8.4
nbconvert==7.16.4
furo==2024.8.6
myst-parser==4.0.0
pyopenssl==24.3.0
autodocsumm==0.2.14
jinja2==3.1.5
matplotlib==3.10.0
nbconvert==7.16.5

2 changes: 1 addition & 1 deletion examples/nearest.ipynb
@@ -75,7 +75,7 @@
"# create a scene containing the mesh and two sets of points\n",
"scene = trimesh.Scene([mesh, cloud_original, cloud_close])\n",
"\n",
"# show the scene wusing\n",
"# show the scene we are using\n",
"scene.show()"
]
}
3 changes: 2 additions & 1 deletion pyproject.toml
@@ -5,7 +5,7 @@ requires = ["setuptools >= 61.0", "wheel"]
[project]
name = "trimesh"
requires-python = ">=3.8"
version = "4.5.3"
version = "4.6.0"
authors = [{name = "Michael Dawson-Haggerty", email = "[email protected]"}]
license = {file = "LICENSE.md"}
description = "Import, export, process, analyze and view triangular meshes."
@@ -120,6 +120,7 @@ test_more = [
"matplotlib",
"pymeshlab",
"triangle",
"ipython",
]

# interfaces.gmsh will be dropped Jan 2025
206 changes: 168 additions & 38 deletions tests/corpus.py
@@ -6,29 +6,85 @@
will download more than a gigabyte to your home directory!
"""

import json
import sys
import time
from dataclasses import asdict, dataclass

import numpy as np
from pyinstrument import Profiler
from pyinstrument.renderers.jsonrenderer import JSONRenderer

import trimesh
from trimesh.typed import List, Optional, Tuple
from trimesh.util import log, wrap_as_stream

# get a set with available extension
available = trimesh.available_formats()

# remove loaders that are thin wrappers
available.difference_update(
[
k
for k, v in trimesh.exchange.load.mesh_loaders.items()
if v in (trimesh.exchange.misc.load_meshio,)
]
)
# remove loaders we don't care about
available.difference_update({"json", "dae", "zae"})
available.update({"dxf", "svg"})
@dataclass
class LoadReport:
# i.e. 'hi.glb'
file_name: str

# i.e 'glb'
file_type: str

# i.e. 'Scene'
type_load: Optional[str] = None

# what type was every geometry
type_geometry: Optional[Tuple[str]] = None

# what is the printed repr of the object, i.e. `<Trimesh ...>`
repr_load: Optional[str] = None

# if there was an exception save it here
exception: Optional[str] = None


@dataclass
class Report:
# what did we load
load: list[LoadReport]

# what version of trimesh was this produced on
version: str

# what was the profiler output for this run
# a pyinstrument.renderers.JSONRenderer output
profile: str

def compare(self, other: "Report"):
"""
Compare this load report to another.
"""
# what files were loaded by both versions
self_type = {o.file_name: o.type_load for o in self.load}
other_type = {n.file_name: n.type_load for n in other.load}

both = set(self_type.keys()).intersection(other_type.keys())
matches = np.array([self_type[k] == other_type[k] for k in both])
percent = matches.sum() / len(matches)

print(f"Comparing `{self.version}` against `{other.version}`")
print(f"Return types matched {percent * 100.0:0.3f}% of the time")
print(f"Loaded {len(self.load)} vs Loaded {len(other.load)}")

def on_repo(repo, commit):

def from_dict(data: dict) -> Report:
"""
Parse a `Report` which has been exported using `dataclasses.asdict`
into a Report object.
"""
return Report(
load=[LoadReport(**r) for r in data.get("load", [])],
version=data.get("version"),
profile=data.get("profile"),
)


def on_repo(
repo: str, commit: str, available: set, root: Optional[str] = None
) -> List[LoadReport]:
"""
Try loading all supported files in a Github repo.

@@ -38,6 +94,10 @@ def on_repo(repo, commit):
Github "slug" i.e. "assimp/assimp"
commit : str
Full hash of the commit to check.
available
Which `file_type` to check
root
If passed only consider files under this root directory.
"""

# get a resolver for the specific commit
Expand All @@ -47,32 +107,43 @@ def on_repo(repo, commit):
# list file names in the repo we can load
paths = [i for i in repo.keys() if i.lower().split(".")[-1] in available]

report = {}
if root is not None:
# clip off any file not under the root path
paths = [p for p in paths if p.startswith(root)]

report = []
for _i, path in enumerate(paths):
namespace, name = path.rsplit("/", 1)
# get a subresolver that has a root at
# the file we are trying to load
resolver = repo.namespaced(namespace)

check = path.lower()
broke = (
"malformed empty outofmemory "
+ "bad incorrect missing "
+ "failures pond.0.ply"
).split()
broke = "malformed outofmemory bad incorrect missing invalid failures".split()
should_raise = any(b in check for b in broke)
raised = False

# clip off the big old name from the archive
saveas = path[path.find(commit) + len(commit) :]
# start collecting data about the current load attempt
current = LoadReport(file_name=name, file_type=trimesh.util.split_extension(name))

print(f"Attempting: {name}")

try:
m = trimesh.load(
file_obj=wrap_as_stream(resolver.get(name)),
file_type=name,
resolver=resolver,
)
report[saveas] = str(m)

# save the load types
current.type_load = m.__class__.__name__
if isinstance(m, trimesh.Scene):
# save geometry types
current.type_geometry = tuple(
[g.__class__.__name__ for g in m.geometry.values()]
)
# save the <Trimesh ...> repr
current.repr_load = str(m)

# if our source was a GLTF we should be able to roundtrip without
# dropping
@@ -104,19 +175,19 @@ def on_repo(repo, commit):
# this is what unsupported formats
# like GLTF 1.0 should raise
log.debug(E)
report[saveas] = str(E)
current.exception = str(E)
except BaseException as E:
raised = True
# we got an error on a file that should have passed
if not should_raise:
log.debug(path, E)
raise E
report[saveas] = str(E)
current.exception = str(E)

# if it worked when it didn't have to add a label
if should_raise and not raised:
# raise ValueError(name)
report[saveas] += " SHOULD HAVE RAISED"
current.exception = "PROBABLY SHOULD HAVE RAISED BUT DIDN'T!"
report.append(current)

return report

@@ -165,33 +236,92 @@ def equal(a, b):
return a == b


if __name__ == "__main__":
trimesh.util.attach_to_log()
def run(save: bool = False):
"""
Try to load and export every mesh we can get our hands on.

Parameters
-----------
save
If passed, save a JSON dump of the load report.
"""
# get a set with available extension
available = trimesh.available_formats()

# remove meshio loaders because we're not testing meshio
available.difference_update(
[
k
for k, v in trimesh.exchange.load.mesh_loaders.items()
if v in (trimesh.exchange.misc.load_meshio,)
]
)

# TODO : waiting on a release containing pycollada/pycollada/147
available.difference_update({"dae"})

with Profiler() as P:
# check against the small trimesh corpus
loads = on_repo(
repo="mikedh/trimesh",
commit="2fcb2b2ea8085d253e692ecd4f71b8f450890d51",
available=available,
root="models",
)

# check the assimp corpus, about 50mb
report = on_repo(
repo="assimp/assimp", commit="c2967cf79acdc4cd48ecb0729e2733bf45b38a6f"
loads.extend(
on_repo(
repo="assimp/assimp",
commit="1e44036c363f64d57e9f799beb9f06d4d3389a87",
available=available,
root="test",
)
)
# check the gltf-sample-models, about 1gb
report.update(
loads.extend(
on_repo(
repo="KhronosGroup/glTF-Sample-Models",
commit="8e9a5a6ad1a2790e2333e3eb48a1ee39f9e0e31b",
available=available,
)
)

# add back collada for this repo
available.update(["dae", "zae"])
report.update(
# try on the universal robot models
loads.extend(
on_repo(
repo="ros-industrial/universal_robot",
commit="8f01aa1934079e5a2c859ccaa9dd6623d4cfa2fe",
available=available,
)
)

# show all profiler lines
log.info(P.output_text(show_all=True))

# print a formatted report of what we loaded
log.debug("\n".join(f"# {k}\n{v}\n" for k, v in report.items()))
# save the profile for comparison loader
profile = P.output(JSONRenderer())

# compose the overall report
report = Report(load=loads, version=trimesh.__version__, profile=profile)

if save:
with open(f"trimesh.{trimesh.__version__}.{int(time.time())}.json", "w") as F:
json.dump(asdict(report), F)

return report


if __name__ == "__main__":
trimesh.util.attach_to_log()

if "-run" in " ".join(sys.argv):
run()

if "-compare" in " ".join(sys.argv):
with open("trimesh.4.5.3.1737061410.json") as f:
old = from_dict(json.load(f))

with open("trimesh.4.6.0.1737060030.json") as f:
new = from_dict(json.load(f))

new.compare(old)
1 change: 0 additions & 1 deletion tests/generic.py
@@ -366,7 +366,6 @@ def check(item):
batched.append(loaded)

for mesh in batched:
mesh.metadata["file_name"] = file_name
# only return our limit
if returned[0] >= count:
return
Expand Down
2 changes: 1 addition & 1 deletion tests/regression.py
@@ -12,7 +12,7 @@ def typical_application():
meshes = g.get_meshes(raise_error=True)

for mesh in meshes:
g.log.info("Testing %s", mesh.metadata["file_name"])
g.log.info("Testing %s", mesh.source.file_name)
assert len(mesh.faces) > 0
assert len(mesh.vertices) > 0

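
The `tests/regression.py` hunk above shows the migration pattern for downstream code: the file name that used to be stored in `mesh.metadata["file_name"]` is now read from the geometry's `source`. A before/after sketch, with `mesh` standing in for any loaded geometry:

# before this release: the origin was written into metadata during loading
file_name = mesh.metadata["file_name"]

# after this release: it is exposed on the source property instead
file_name = mesh.source.file_name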