Review1 (#181)
* my workflow

* example templates for manuscript.

* manuscript info

* update citations

* cleanup text

* cite

* update readme

* contribute

* add website badge

* update web link

* Update README.md

* Update README.md

* add video link

* auto update

* utility files device specific

* move the pdf generator to workflows

* test error in paper.bib

* test citation style

* add citations

* add a few more citations

* try without space

* add all

* no spaces allowed in citation names

* add demo images

* Update paper.md

* give images some space

* image captions

* add mention of wade

* update authors and acknowledgements

* Updated with Dan's Recs

* Mary Comments

#69 (comment)

* SteveO comments

#69 (review)

* Walter's comments

#69 (review)

* Create config.yml

* Create bug.yml

* Create feature.yml

* hyperlink

* Update bug.yml

* Update feature.yml

* Added in Kristiina and Kris's comments

* Update README.md

* #69 (comment)

* #69 (comment)

* #69 (comment)

* #69 (comment)

* #69 (comment)

* #69 (comment)

* add kris's comments

* add WSL2 link

* Update paper.md

* add acknowledgements

* #69 (comment)

* #69 (comment)

* #69 (comment)

* add submitted badge

* update dois

* Update paper.md

#122

* Update paper.md

add Elizabeth to ack

* Update paper.md

add funder possibility lab

* Update paper.md

add funder.

* Update README.md

* add r code for analyzing data

* remove unnecessary code.

* trying to fix the unexpected period issue, not sure where it is coming from.

* revert bib to test

* add dois

* update paper acknowledgments and links.

* add comments.

* remove empty line

* Update about.vue

update about

* Update about.vue

add tutorial

* Update README.md

update video

* Update paper.md

update video
wincowgerDEV authored Jul 22, 2023
1 parent 4db6bfa commit 402e595
Showing 6 changed files with 76 additions and 19 deletions.
6 changes: 5 additions & 1 deletion README.md
@@ -1,6 +1,7 @@
# Trash AI: Web application for serverless image classification of trash

[![Website](https://img.shields.io/badge/Web-TrashAI.org-blue)](https://www.trashai.org)
[![status](https://joss.theoj.org/papers/6ffbb0f89e6c928dad6908a02639789b/status.svg)](https://joss.theoj.org/papers/6ffbb0f89e6c928dad6908a02639789b)

### Project Information

@@ -14,7 +15,7 @@ Trash AI is a web application where users can upload photos of litter, which wil

#### Demo

[![image](https://user-images.githubusercontent.com/26821843/188515526-33e1196b-6830-4187-8fe4-e68b2bd4019e.png)](https://youtu.be/HHrjUpQynUM)
[![image](https://user-images.githubusercontent.com/26821843/188515526-33e1196b-6830-4187-8fe4-e68b2bd4019e.png)](https://youtu.be/u0DxGrbPOC0)

## Deployment

@@ -71,6 +72,9 @@ docker rm -v $id

- Runs the complex stuff so you don't have to.

### Tests
Instructions for automated and manual tests are [here](https://github.com/code4sac/trash-ai/tree/production/frontend/__tests__).

## Contribute

We welcome contributions of all kinds.
2 changes: 1 addition & 1 deletion docs/localdev.md
@@ -16,12 +16,12 @@ These values can be adjusted by editing the localdev env file [.env](../localdev
It's suggested you work from branch `local` by creating your own local branch when developing.
Pushing / merging PRs to any branch with an `aws/` prefix will trigger deployment actions.
For full functionality you will want to get a Google Maps API key and name it VITE_GOOGLE_MAPS_API_KEY, but it is not required.

When developing locally, create a new branch and submit a pull request to `aws/trashai-staging`.


---
# Set up

9 changes: 6 additions & 3 deletions frontend/src/views/about.vue
@@ -23,6 +23,10 @@
To get started, visit the Upload Tab or
<a href="/uploads/0">click here</a>.
</p>
<h2>Tutorial</h2>
<p>
<iframe width="560" height="315" src="https://www.youtube.com/embed/u0DxGrbPOC0" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>
</p>

<h2>What is it?</h2>
<p>
@@ -54,9 +58,8 @@
<h2>Disclaimer about uploaded images</h2>
<p>
The current version of Trash AI and the model we are using is just a
start! When you upload an image, we are storing the image and the
classification in an effort to expand the trash dataset and improve
the model over time.
                        start! The tool works best on images of individual pieces of trash taken less than 1 meter from the camera.
                        We are looking for collaborators who can help us improve this project.
</p>

<h2>Reporting issues and improvements</h2>
50 changes: 50 additions & 0 deletions notebooks/data_reader/data_reader.R
@@ -0,0 +1,50 @@
#Working directory ----
setwd("notebooks/data_reader") #Change this to your working directory

#Libraries ----
library(rio)
library(jsonlite)
library(ggplot2)
library(data.table)

# Data import ----
json_list <- import_list("example_data_download2.zip")

# Get path of the summary table.
summary_metadata <- names(json_list)[grepl("summary.json", names(json_list))]

# Get path of the image metadata.
image_metadata <- names(json_list)[!grepl("\\.jpg|\\.png|\\.tif|schema|summary", names(json_list))][-1]

# Filter the summary data.
summary_json <- json_list[[summary_metadata]]

# Flatten the summary data.
flattened_summary <- data.frame(name = summary_json$detected_objects$name,
count = summary_json$detected_objects$count)
# Filter the image data.
image_json <- json_list[image_metadata]

# Flatten the image data.
flattened_images <- lapply(seq_along(image_json), function(i){
data.frame(hash = image_json[[i]]$hash,
filename = image_json[[i]]$filename,
datetime = if(!is.null(image_json[[i]]$exifdata$DateTimeOriginal)){image_json[[i]]$exifdata$DateTimeOriginal} else{NA},
latitude = if(!is.null(image_json[[i]]$exifdata$GPSLatitude)){image_json[[i]]$exifdata$GPSLatitude} else{NA},
longitude = if(!is.null(image_json[[i]]$exifdata$GPSLongitude)){image_json[[i]]$exifdata$GPSLongitude} else{NA},
score = if(!is.null(image_json[[i]]$metadata$score)){image_json[[i]]$metadata$score} else{NA},
label = if(!is.null(image_json[[i]]$metadata$label)){image_json[[i]]$metadata$label} else{NA})
}) |>
rbindlist()

# Test equivalence in counts.
nrow(flattened_images[!is.na(flattened_images$label),]) == sum(flattened_summary$count)

# Figure creation ----
ggplot(flattened_summary, aes(y = reorder(name, count), x = count, fill = name)) +
geom_bar(stat = "identity") +
theme_classic(base_size = 15) +
theme(legend.position = "none") +
labs(x = "Count", y = "Type")
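The per-image flattening in the script above guards every optional EXIF/metadata field with an explicit NULL check before falling back to `NA`. That pattern can be factored into a small helper; a minimal self-contained sketch with made-up records (the field names mirror the script, but `pluck_or_na` and the sample data are illustrative, not part of Trash AI's actual export):

```r
library(data.table)

# Hypothetical image records mimicking the structure read from the zip export.
image_json <- list(
  list(hash = "a1", filename = "img1.jpg",
       exifdata = list(DateTimeOriginal = "2023:07:01 10:00:00"),
       metadata = list(score = 0.91, label = "plastic bottle")),
  list(hash = "b2", filename = "img2.jpg",
       exifdata = list(),               # no EXIF: fields fall back to NA
       metadata = list(label = "can"))  # no score: falls back to NA
)

# Helper: return the value if present, otherwise NA.
pluck_or_na <- function(x) if (!is.null(x)) x else NA

flattened <- rbindlist(lapply(image_json, function(img) {
  data.frame(hash     = img$hash,
             filename = img$filename,
             datetime = pluck_or_na(img$exifdata$DateTimeOriginal),
             score    = pluck_or_na(img$metadata$score),
             label    = pluck_or_na(img$metadata$label))
}))

print(flattened)
```

This keeps the per-field fallback logic in one place, so adding another optional column (e.g. GPS fields) is a one-line change instead of repeating the `if (!is.null(...))` test.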

12 changes: 6 additions & 6 deletions paper.bib
@@ -88,7 +88,8 @@ @ARTICLE{Hapich:2022
number = 1,
pages = "15",
month = jun,
year = 2022
year = 2022,
doi = "10.1186/s43591-022-00035-1"
}

@misc{Waterboards:2018,
@@ -108,11 +109,10 @@ @article{vanLieshout:2020
number = {8},
pages = {e2019EA000960},
keywords = {plastic pollution, object detection, automated monitoring, deep learning, artificial intelligence, river plastic},
doi = {https://doi.org/10.1029/2019EA000960},
doi = {10.1029/2019EA000960},
url = {https://agupubs.onlinelibrary.wiley.com/doi/abs/10.1029/2019EA000960},
eprint = {https://agupubs.onlinelibrary.wiley.com/doi/pdf/10.1029/2019EA000960},
note = {e2019EA000960 10.1029/2019EA000960},
abstract = {Abstract Quantifying plastic pollution on surface water is essential to understand and mitigate the impact of plastic pollution to the environment. Current monitoring methods such as visual counting are labor intensive. This limits the feasibility of scaling to long-term monitoring at multiple locations. We present an automated method for monitoring plastic pollution that overcomes this limitation. Floating macroplastics are detected from images of the water surface using deep learning. We perform an experimental evaluation of our method using images from bridge-mounted cameras at five different river locations across Jakarta, Indonesia. The four main results of the experimental evaluation are as follows. First, we realize a method that obtains a reliable estimate of plastic density (68.7\% precision). Our monitoring method successfully distinguishes plastics from environmental elements, such as water surface reflection and organic waste. Second, when trained on one location, the method generalizes well to new locations with relatively similar conditions without retraining (≈50\% average precision). Third, generalization to new locations with considerably different conditions can be boosted by retraining on only 50 objects of the new location (improving precision from ≈20\% to ≈42\%). Fourth, our method matches visual counting methods and detects ≈35\% more plastics, even more so during periods of plastic transport rates of above 10 items per meter per minute. Taken together, these results demonstrate that our method is a promising way of monitoring plastic pollution. By extending the variety of the data set the monitoring method can be readily applied at a larger scale.},
year = {2020}
}

@@ -145,7 +145,8 @@ @ARTICLE{Lynch:2018
number = 1,
pages = "6",
month = jun,
year = 2018
year = 2018,
doi = "10.1186/s40965-018-0050-y"
}


@@ -176,7 +177,7 @@ @article{Majchrowska:2022
pages = {274-284},
year = {2022},
issn = {0956-053X},
doi = {https://doi.org/10.1016/j.wasman.2021.12.001},
doi = {10.1016/j.wasman.2021.12.001},
url = {https://www.sciencedirect.com/science/article/pii/S0956053X21006474},
author = {Sylwia Majchrowska and Agnieszka Mikołajczyk and Maria Ferlin and Zuzanna Klawikowska and Marta A. Plantykow and Arkadiusz Kwasigroch and Karol Majek},
keywords = {Object detection, Semi-supervised learning, Waste classification benchmarks, Waste detection benchmarks, Waste localization, Waste recognition},
@@ -193,4 +194,3 @@ @misc{Proença:2020
year = {2020},
copyright = {arXiv.org perpetual, non-exclusive license}
}
