
Replace np.float with float since in numpy >= 1.20 np.float is deprecated #58
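The rename at the heart of this PR can be sketched in isolation: since NumPy 1.20, `np.float` is a deprecated alias for the builtin `float` (it was removed entirely in NumPy 1.24), so the builtin is a drop-in replacement wherever the alias was used as a dtype:

```python
import numpy as np

# np.float was only ever an alias for the builtin float; since NumPy 1.20 it
# emits a DeprecationWarning, and NumPy 1.24 removed it. The builtin float
# (or an explicit np.float64) works identically as a dtype argument:
a = np.zeros(3, dtype=float)        # preferred: builtin float
b = np.zeros(3, dtype=np.float64)   # or be explicit about the width
assert a.dtype == b.dtype == np.float64
```

The same pattern applies to the `np.int` → `int` replacement made in a later commit of this PR.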

Status: Open. Wants to merge 28 commits into base: master.

Changes from all commits (28 commits)
- b0fa718 replace np.float to float since in numpy >= 1.20 np.float is depicted (skoulouzis, Jan 6, 2023)
- 50603be Merge pull request #2 from QCDIS/1-attributeerror-module-numpy-has-no… (skoulouzis, Jan 6, 2023)
- bbb34ba use laserchicken without np.float (skoulouzis, Jan 10, 2023)
- ac96d47 use laserchicken without np.init (skoulouzis, Jan 10, 2023)
- 54afd0f replace np.int with int (skoulouzis, Jan 10, 2023)
- 3352baa replace laserchicken with laserchicken @ git+https://github.com/QCDIS… (skoulouzis, Jan 10, 2023)
- 1710f92 Merge pull request #7 from QCDIS/6-pip-subprocess-error (skoulouzis, Jan 10, 2023)
- 943fda4 Create python-package-conda.yml (skoulouzis, Mar 1, 2023)
- bee539c added cron job (skoulouzis, Mar 1, 2023)
- 6d916e9 updated py versions (skoulouzis, Mar 1, 2023)
- 16c4a8f disable Codecov (skoulouzis, Mar 1, 2023)
- ec3149a added gcc in conda (skoulouzis, Mar 1, 2023)
- 05f3239 change test date in crone (skoulouzis, Mar 1, 2023)
- c16e33e install libcxx (skoulouzis, Mar 1, 2023)
- 3842ce6 install cxx-compiler (skoulouzis, Mar 3, 2023)
- d85e253 set libgcc 5.2.0 (skoulouzis, Mar 3, 2023)
- 0f6e34c set gcc=12.1.0 (skoulouzis, Mar 3, 2023)
- d35992f move numpy to environment.yml (skoulouzis, Mar 3, 2023)
- 6f4cd03 Merge pull request #9 from QCDIS/8-glibcxx_3430-not-found (skoulouzis, Mar 3, 2023)
- a6eaa57 move requirements to environment.yml and created test-requirements.txt (skoulouzis, Mar 3, 2023)
- 227f834 test only in conda (skoulouzis, Mar 3, 2023)
- 341fd53 added scipy (skoulouzis, Mar 3, 2023)
- a6740a1 Merge pull request #10 from QCDIS/8-glibcxx_3430-not-found (skoulouzis, Mar 3, 2023)
- b361ccd remove requirements (skoulouzis, Apr 4, 2023)
- d7d2c17 Merge pull request #11 from QCDIS/8-glibcxx_3430-not-found (skoulouzis, Apr 4, 2023)
- 45dff02 Update setup.py (skoulouzis, Apr 4, 2023)
- b36c461 added sync (skoulouzis, Apr 30, 2023)
- 4d3c10f Merge remote-tracking branch 'origin/master' (skoulouzis, Apr 30, 2023)
43 changes: 43 additions & 0 deletions .github/workflows/git-sync.yml
@@ -0,0 +1,43 @@
name: Sync with Original Repo
on:
  schedule:
    - cron: "0 0 * * Sun"

jobs:
  sync:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout
        uses: actions/checkout@v2

      - name: Fetch upstream
        run: |
          git remote add upstream https://github.com/eEcoLiDAR/Laserfarm.git
          git fetch upstream

      - name: Check if there are changes
        id: has_changes
        # `git rev-parse HEAD != upstream/master` does not compare commits, and
        # `::set-output` is deprecated; compare the two hashes explicitly and
        # write the result to $GITHUB_OUTPUT instead.
        run: |
          if [ "$(git rev-parse HEAD)" != "$(git rev-parse upstream/master)" ]; then
            echo "changed=true" >> "$GITHUB_OUTPUT"
          else
            echo "changed=false" >> "$GITHUB_OUTPUT"
          fi

      - name: Create new branch
        if: steps.has_changes.outputs.changed == 'true'
        run: |
          git checkout -b update-upstream-${{ github.run_number }}

      - name: Merge upstream changes
        if: steps.has_changes.outputs.changed == 'true'
        run: |
          git merge upstream/master --no-edit

      - name: Create pull request
        uses: peter-evans/create-pull-request@v3
        if: steps.has_changes.outputs.changed == 'true'
        with:
          title: 'Update from upstream repository'
          commit-message: 'Merge latest changes from upstream repository'
          branch: 'update-upstream'
          base: 'master'
          delete-branch: true
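The change-detection step above boils down to comparing two commit hashes. A minimal sketch of that check from Python, using a throwaway repository (all branch names and commit messages here are illustrative stand-ins for `HEAD` vs `upstream/master`):

```python
import subprocess
import tempfile

def git(*args, cwd):
    """Run a git command in the given directory and return its stripped stdout."""
    return subprocess.run(["git", *args], cwd=cwd, check=True,
                          capture_output=True, text=True).stdout.strip()

repo = tempfile.mkdtemp()
git("init", "-q", cwd=repo)
git("-c", "user.email=ci@example.org", "-c", "user.name=ci",
    "commit", "-q", "--allow-empty", "-m", "one", cwd=repo)
git("branch", "upstream-master", cwd=repo)  # stand-in for upstream/master
git("-c", "user.email=ci@example.org", "-c", "user.name=ci",
    "commit", "-q", "--allow-empty", "-m", "two", cwd=repo)

# HEAD has moved past the stand-in upstream branch, so a sync is needed.
changed = git("rev-parse", "HEAD", cwd=repo) != git("rev-parse", "upstream-master", cwd=repo)
print(f"changed={str(changed).lower()}")  # the value the workflow writes to $GITHUB_OUTPUT
```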
49 changes: 49 additions & 0 deletions .github/workflows/python-package-conda.yml
@@ -0,0 +1,49 @@
name: Python Package using Conda

on:
  push:
  schedule:
    - cron: '0 0 2 * *'

jobs:
  build-linux:
    runs-on: ubuntu-latest
    strategy:
      max-parallel: 5

    steps:
      - uses: actions/checkout@v3
      - name: Set up Python 3.10
        uses: actions/setup-python@v3
        with:
          python-version: '3.10'
      - name: Add conda to system path
        run: |
          # $CONDA is an environment variable pointing to the root of the miniconda directory
          echo $CONDA/bin >> $GITHUB_PATH
      - name: Install dependencies
        run: |
          conda env update --file environment.yml --name base
      - name: Lint with flake8
        run: |
          conda install flake8
          # stop the build if there are Python syntax errors or undefined names
          flake8 . --count --select=E9,F63,F7,F82 --show-source --statistics
          # exit-zero treats all errors as warnings. The GitHub editor is 127 chars wide
          flake8 . --count --exit-zero --max-complexity=10 --max-line-length=127 --statistics
      - name: Test with pytest
        run: |
          conda install pytest pytest-cov
          # a single run with coverage suffices; a plain `pytest` call beforehand
          # would only execute the suite twice
          pytest tests --cov=laserfarm --cov-report=xml

      # - name: Upload coverage to Codecov
      #   uses: codecov/codecov-action@v1
      #   with:
      #     token: ${{ secrets.CODECOV_TOKEN }}
      #     file: ./coverage.xml
      #     flags: unittests
      #     name: codecov-umbrella
      #     fail_ci_if_error: true

46 changes: 0 additions & 46 deletions .github/workflows/test.yaml

This file was deleted.

3 changes: 3 additions & 0 deletions .idea/.gitignore

Some generated files are not rendered by default.

14 changes: 14 additions & 0 deletions .idea/Laserfarm.iml

6 changes: 6 additions & 0 deletions .idea/inspectionProfiles/profiles_settings.xml

8 changes: 8 additions & 0 deletions .idea/modules.xml

6 changes: 6 additions & 0 deletions .idea/vcs.xml

12 changes: 12 additions & 0 deletions build/lib/laserfarm/__init__.py
@@ -0,0 +1,12 @@
__license__ = 'Apache Licence 2.0'
__author__ = 'Netherlands eScience Center'
__email__ = '[email protected]'

from .__version__ import __version__

from laserfarm.data_processing import DataProcessing
from laserfarm.geotiff_writer import GeotiffWriter
from laserfarm.retiler import Retiler
from laserfarm.classification import Classification

from laserfarm.macro_pipeline import MacroPipeline
1 change: 1 addition & 0 deletions build/lib/laserfarm/__version__.py
@@ -0,0 +1 @@
__version__ = '0.2.0'
108 changes: 108 additions & 0 deletions build/lib/laserfarm/classification.py
@@ -0,0 +1,108 @@
import logging
import pathlib
import numpy as np
import shapefile
import shapely
import laserfarm
import laserchicken
from shapely.geometry import shape
from laserfarm.pipeline_remote_data import PipelineRemoteData
from laserchicken.io.load import load
from laserchicken.io.export import export
from laserchicken import filter

logger = logging.getLogger(__name__)


class Classification(PipelineRemoteData):
    """ Classify points using polygons provided as shapefiles. """

    def __init__(self, input_file=None, label=None):
        self.pipeline = ('locate_shp',
                         'classification',
                         'export_point_cloud')
        self.input_shp = []
        self.point_cloud = None
        if input_file is not None:
            self.input_path = input_file
        if label is not None:
            self.label = label

    def locate_shp(self, shp_dir):
        """
        Locate the corresponding ESRI shapefiles of the point cloud.

        :param shp_dir: directory which contains all candidate shp files for
                        classification
        """
        laserfarm.utils.check_file_exists(self.input_path,
                                          should_exist=True)
        pc = load(self.input_path.as_posix())

        shp_path = self.input_folder / shp_dir
        laserfarm.utils.check_dir_exists(shp_path, should_exist=True)

        # Get boundary of the point cloud
        self.point_cloud = pc
        x = pc[laserchicken.keys.point]['x']['data']
        y = pc[laserchicken.keys.point]['y']['data']
        point_box = shapely.geometry.box(np.min(x), np.min(y),
                                         np.max(x), np.max(y))

        for shp in sorted([f.absolute() for f in shp_path.iterdir()
                           if f.suffix == '.shp']):
            sf = shapefile.Reader(shp.as_posix())
            mbr = shapely.geometry.box(*sf.bbox)

            if point_box.intersects(mbr):
                self.input_shp.append(shp)

        return self

    def classification(self, ground_type):
        """
        Classify the point set according to the given shapefile(s).
        A new feature "ground_type" is added to the point cloud; its value
        identifies the ground type.

        :param ground_type: identifier of the ground type. 0 is not identified.
        """
        # Get the mask of points which fall in the shape file(s)
        pc_mask = np.zeros(len(self.point_cloud['vertex']['x']['data']),
                           dtype=bool)
        for shp in self.input_shp:
            this_mask = filter.select_polygon(self.point_cloud,
                                              shp.as_posix(),
                                              read_from_file=True,
                                              return_mask=True)
            pc_mask = np.logical_or(pc_mask, this_mask)

        # Add the ground type feature
        laserchicken.utils.update_feature(self.point_cloud,
                                          feature_name='ground_type',
                                          value=ground_type,
                                          array_mask=pc_mask)
        # Clear the cached KDTree
        laserchicken.kd_tree.initialize_cache()
        return self

    def export_point_cloud(self, filename='', overwrite=False):
        """
        Export the classified point cloud.

        :param filename: filename where to write point-cloud data
        :param overwrite: if file exists, overwrite
        """
        if pathlib.Path(filename).parent.name:
            raise IOError('filename should not include path!')
        if not filename:
            filename = '_classification'.join([self.input_path.stem,
                                               self.input_path.suffix])
        export_path = (self.output_folder / filename).as_posix()

        export(self.point_cloud, export_path, overwrite=overwrite)

        return self
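The mask-combination logic inside `classification()` can be illustrated standalone. This sketch uses hand-made boolean masks in place of the ones `laserchicken.filter.select_polygon` would return, and a hypothetical label value:

```python
import numpy as np

# Stand-ins for the per-shapefile masks that select_polygon would return,
# one entry per point in the cloud (values here are illustrative).
mask_a = np.array([True, False, True, False, False])
mask_b = np.array([False, False, True, True, False])

# classification() ORs the masks together, then writes the ground_type label
# only where a point fell inside at least one polygon (0 = not identified).
combined = np.logical_or(mask_a, mask_b)
ground_type = np.zeros(mask_a.size, dtype=int)
ground_type[combined] = 3  # hypothetical label for this set of polygons
assert ground_type.tolist() == [3, 0, 3, 3, 0]
```

In the real pipeline the write happens through `laserchicken.utils.update_feature` with `array_mask=pc_mask`, but the masking semantics are the same.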