[Question]: Support Coral.ai Accelerator ? #6872
Replies: 34 comments 1 reply
-
This is very specific, and at the moment we don't have the capacity for this. I will keep this in mind, and if there are more requests for it, then I will revisit this issue.
-
+1 Would love this. I have older hardware that doesn't support the AVX and AVX2 instructions, and a couple of Coral accelerators for use with the Frigate project. I would love to redirect one of them to be used here.
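(Side note for anyone unsure whether their CPU is affected: on Linux the advertised instruction-set flags can be read from /proc/cpuinfo. The snippet below is just an illustration of that check, not anything Immich-specific.)

```python
# Rough sketch: report whether the CPU advertises AVX/AVX2 (Linux only).
def cpu_flags():
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags()
print("AVX  supported:", "avx" in flags)
print("AVX2 supported:", "avx2" in flags)
```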
-
+1 This would be amazing to have, especially on a little Pi.
-
I'd also like to see this; with energy prices climbing, anything to offset CPU use would be a bonus.
-
Would be amazing! I bought one for Frigate, but since it's a dual Edge TPU, I would love to use the second one for Immich. +1
-
I'm also very keen on this idea! There are dozens of self-hosters that already have these for Frigate. Dozens!
-
Literally dozens of us!
-
I am in exactly the same boat as jwynn6. My CPU has no AVX, but I do have a Coral accelerator. I'd love to see this implemented if you have the time.
-
I also think this would be an extremely nice feature. In the upcoming years we will probably see more AI edge accelerators on the market, so it would be good to ride those waves.
-
+1, this would be great.
-
+1. There's definitely a lot of overlap between the people who self-host and the people who own Corals now. Largely because of Frigate for many of us.
-
With release v1.56.0 we have now
-
No AVX on my older hardware either, and with more functions such as facial recognition being added that require AVX, support for the Coral would be awesome.
-
+1, running on older hardware; Coral support would be awesome.
-
Got a Google Coral TPU; if needed I can run a few tests :)
-
+1 for supporting a Coral TPU. I'm running Immich on an older Pentium, and while it's been good for general use, I had to turn off ML to avoid melting the CPU back into raw silicon.
-
For what it's worth, I would love to see Coral support in Immich!
-
+1 for Coral support & reopening this issue (:
-
Same here, +1 for Coral support.
-
It's currently not feasible for us to support this, see #73 (reply in thread)
-
The Coral is dirt cheap for what it does. I have a USB Coral and a Dual Edge... because of Frigate, of course. So add me to the dozens that own a few.
-
Another user that would love Coral TPU support. Running Immich on a Pi is great, but slapping a GPU in there for the ML tasks isn't feasible. Corals are cheap and power-efficient. I hope it's still feasible to add this, as it would really help set Immich even further apart from the competition.
-
Yet another +1 here. Same thing: bought it for Frigate, but would love to leverage it to reduce CPU/GPU usage for Immich.
-
I'd offer to buy the devs Coral cards, but it looks like Louis Rossmann or FUTO can now be the generous donor. Looking forward to seeing HW acceleration for AI CPU offloading. Congratulations, guys!
-
Same here. I have a dual TPU; one half is used for Frigate right now.
-
I'm also using a Coral USB with Frigate in Docker, and it works great (perfectly) on low-end hardware. +1 on supporting it in this project.
-
+1 for Coral support!!
-
As mentioned in #73 (comment), Coral support is just not possible and so will never happen.
-
Feature detail
I don't know if it's a good idea or not... I read a few articles on this hardware, and it seems really cool for running TensorFlow locally.
Maybe supporting this type of hardware (https://coral.ai/products/accelerator) could help your AI models? Support for specific hardware with relatively low cost / power usage to accelerate TensorFlow seems like a good idea.
Maybe for better object/facial recognition?
Platform
Server
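For a rough idea of what such support involves: the Edge TPU runs TensorFlow Lite models that have been compiled for it, typically driven from Python via the pycoral library. The sketch below is only an illustration of that flow; the model and image file names are placeholders, and none of this is Immich code.

```python
# Minimal classification sketch with a Coral Edge TPU via pycoral.
# "model_edgetpu.tflite" and "photo.jpg" are placeholder file names.
from PIL import Image
from pycoral.adapters import classify, common
from pycoral.utils.edgetpu import make_interpreter

interpreter = make_interpreter("model_edgetpu.tflite")  # loads the Edge TPU delegate
interpreter.allocate_tensors()

image = Image.open("photo.jpg").convert("RGB").resize(common.input_size(interpreter))
common.set_input(interpreter, image)
interpreter.invoke()  # inference runs on the accelerator rather than the CPU

for klass in classify.get_classes(interpreter, top_k=3):
    print(klass.id, round(klass.score, 4))
```

The catch is that the Edge TPU only accepts fully int8-quantized TFLite models compiled with the Edge TPU compiler, so Immich's existing ML models would have to be converted first, which is presumably a large part of why the maintainers consider this infeasible.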