
eFSA80SC support #1

Open
mincrmatt12 opened this issue Mar 26, 2021 · 36 comments

Comments

@mincrmatt12
Owner

add the new impl from mincrmatt12/elan-spi-fingerprint#2

@DupiDachs

To help me understand the authentication procedure:

  • When scanning a finger, how do we get a match?
  • Is it hardware-based? I.e., does the hardware somehow store a finger and then compare it to your current reading?
  • Or is there another (software-based) algorithm?

I have read that you need to swipe your finger across the sensor to make it work. In Windows that is not necessary and I am wondering if there might be a way to also achieve this in Linux.

@mincrmatt12
Owner Author

These sensors don't have any on-chip matching, so I use the libfprint built-in fingerprint matching.

The big downside is that it's based on code designed primarily for much larger images; the tiny 80x80 frames we get don't really have enough features on their own, so I've been using the slide-stitching code to emulate having a larger image.
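For illustration, here is a minimal sketch of the stitching idea (Python/NumPy here, not the actual C driver or libfprint's fpi-assembling code; the shift search range is an arbitrary assumption): each new frame is aligned against the previous one by the vertical shift with the lowest mean absolute difference, and only the new rows are appended.

```python
import numpy as np

def best_vertical_offset(prev, cur, max_shift=40):
    """Downward shift (in rows) minimizing the mean absolute difference
    of the overlapping parts of two equally sized frames."""
    best, best_err = 1, np.inf
    for shift in range(1, max_shift + 1):
        err = np.mean(np.abs(prev[shift:, :].astype(int) -
                             cur[:cur.shape[0] - shift, :].astype(int)))
        if err < best_err:
            best, best_err = shift, err
    return best

def stitch(frames):
    """Naively stack a list of 80x80 frames into one tall image."""
    out = frames[0]
    for cur in frames[1:]:
        shift = best_vertical_offset(out[-cur.shape[0]:, :], cur)
        out = np.vstack([out, cur[-shift:, :]])  # append only the new rows
    return out
```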

There are a few upstream issues in libfprint about trying to make touch-only sensors work too.

@DupiDachs

Here we go ... Profit ...
I still have to clean it up, then I'll push it into my branch. Feel free to merge (and improve). :)
image

@DupiDachs

DupiDachs commented Mar 29, 2021

see here:
https://github.com/DupiDachs/libfprint/tree/mincrmatt12/elan-spi/libfprint/drivers

I took a look at the output and it does not look as satisfying as what we get from your Python script. I guess the image width in libfprint is at least 128 pixels; we have only 80, so the rest is white. But vertically it is somehow smudged. Need to take a look ...

@mincrmatt12
Owner Author

Wow, nice! I'm probably not going to end up merging this since I'm planning on rewriting the driver with some upstream additions, but this looks good. The image should have a bit of white padding since I'm doing slide-style stitching, and so it needs some space to account for horizontal deviation. If you're properly swiping your finger along the sensor you should be getting a reasonably sharp (with usually a bit of smudging at the top) tall-but-skinny image.

You may have to remove the vertical flip in elanspi_process_frame since I don't think your sensor is flipped like the ones I had.

@DupiDachs

Makes sense. I guess I will have to dig deeper into this. But this will take time; good thing I am on holiday. ;)

The idea is to generate a full fingerprint for enrolling from a lot of consecutive readings that get merged. Maybe even do that manually at first (you usually only have to do it once). Then for verification, read only a single frame and fit it to your enrolled finger. If it does not fit it would be a fail; a good fit would be a pass.
I guess that is not as secure as a full fingerprint, but anyways I'll try.

@mincrmatt12
Owner Author

I mean, that should be what it's doing, at least vertically. In theory every time it takes a single "frame" (from libfprint's point of view) it should be giving a long vertical slice, and then libfprint matches those vertical slices against each other using minutiae.

@mincrmatt12
Owner Author

mincrmatt12 commented Apr 6, 2021

Alright, the new driver is theoretically working, but some upstream nonsense with the SPI transfer helpers means it doesn't work unless you do some manual tweaking before building it. If you want to do that, try the mincrmatt12/elan-spi-new branch, and once you pull it manually resolve the last few threads here (for the first two, add some != NULL, and for the last one change source_object to res on the indicated line), then build and try it.

Update: the upstream nonsense has been resolved, so you can now just use the new code on mincrmatt12/elan-spi-new. I have been force-pushing a bunch there, so I recommend not git pull-ing and instead git fetch-ing and git reset-ing.

@Aiiaiiio

Aiiaiiio commented Apr 6, 2021

FYI I tried the things you wrote, and it works pretty well. I scanned my pinky to demonstrate (with the img-capture example) :) There's always a "patch" at the top of the image, I don't know if that's normal.

finger

@mincrmatt12
Owner Author

I think the "patch" is the driver thinking that image has a finger in it when it clearly doesn't (or at least not the part that gets cropped for stitching). I pushed some new changes that should fix that. When you say "it works", you mean you can enroll/verify, right?

@Aiiaiiio

Aiiaiiio commented Apr 6, 2021

The image is better now, but still not perfect (see attached image).
I'm sorry, I didn't mean enroll/verify, but I tried it now and it worked. To be clear this time:
I ran examples/enroll, then examples/verify, and this is the end of the output:
(process:24545): libfprint-device-DEBUG: 23:30:53.791: Device reported identify completion
(process:24545): libfprint-SSM-DEBUG: 23:30:53.791: [elanspi] ELANSPI_FPCAPT_NSTATES completed successfully
(process:24545): libfprint-device-DEBUG: 23:30:53.791: Completing action FPI_DEVICE_ACTION_IDENTIFY in idle!
So I suppose it works.

image

@mincrmatt12
Owner Author

Hmm, well at least now it's picking images that actually have fingers in them. I pushed another change that should normalize partial images better, so the top has the right contrast and gets merged better. Maybe these sensors just need some more debounce though...

@Aiiaiiio

Aiiaiiio commented Apr 7, 2021

It's better. Results vary, but even if the contrast is not quite right at the top, it's merged better.

image
image
image

@DupiDachs

Currently trying your implementation. Some remarks (some of which are also true for the implementation I did based on your previous branch):

  • the sensor in my laptop is extremely quick, so I artificially slowed it down with a 30 ms sleep before each read (I don't know if there is a better way to approach this and somehow tell the sensor not to update at full speed).
  • the sensor on my device is rotated by 90°, so I need to exchange x and y in "elanspi_lookup_pixel_with_rotation"
  • whether or not to rotate by 180° might be better implemented by setting the flag "FPI_IMAGE_V_FLIPPED". I guess for stitching it might also be relevant whether the finger is scanned top down or bottom up (but I do not care).
  • somehow I am now getting a strange first line (in my case a vertical one, as the sensor is rotated by 90°). Could be that this is also an issue for the others, but we do not see it as it is "stitched away". Maybe it is also my fault with this code change, I need to check...
  • these fp_dbg calls in __ssm_call_handler are really spamming the hell out of it. Commented them out.
  • during enroll, I get "deactivating image device while it is not idle, this should not happen". No time yet to check. The single image capture works.

@DupiDachs

DupiDachs commented Apr 8, 2021

OK. So it turned out that my delay was causing the single-line "rip". Well, another workaround could be to set max_frames to a higher value (I already had it at 50; I guess I would need 100). Maybe there is another spot where I can put the delay...

"deactivating image device while it is not idle, this should not happen": might be a conflict with nr_enroll_stages? It always happens after the fifth go. Interestingly, it still successfully enrolls+verifies anyway...

Yup: somehow it conflicts with
"#define IMG_ENROLL_STAGES 5" in "fp-image-device-private.h"
Setting that to 7 makes it run to the end. I guess one could parameterize the code within fp-image-device.c and fpi-image-device.c so it does not refer to this define and instead uses the class's nr_enroll_stages.

@mincrmatt12
Owner Author

mincrmatt12 commented Apr 8, 2021

the sensor in my laptop is extremely quick, so I artificially slowed it down with a 30 ms sleep before each read (I don't know if there is a better way to approach this and somehow tell the sensor not to update at full speed).

  • Yeah, from what I've seen it does look that way. I'm probably going to add a per-sensor tweakable "framedelay" by using fpi_ssm_jump_to_state_delayed somewhere in the capture machine, and hopefully tuning the threshold for similarity between frames (rough sketch of the idea below).
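As a hedged illustration only (Python pseudo-driver logic, not the C state machine; read_frame and both constants are made-up placeholders), the combined effect of an inter-frame delay plus a similarity threshold could look like this:

```python
import time
import numpy as np

FRAME_DELAY_S = 0.03      # ~30 ms, as used above; per-sensor tweakable
SIMILARITY_THRESHOLD = 4  # mean absolute difference below this = "duplicate"

def capture_distinct_frames(read_frame, count):
    """Collect `count` frames, waiting between reads and skipping
    frames that are nearly identical to the previous one."""
    frames, prev = [], None
    while len(frames) < count:
        time.sleep(FRAME_DELAY_S)
        cur = read_frame()  # assumed to return an 80x80 uint8 array
        if prev is not None:
            diff = np.mean(np.abs(cur.astype(int) - prev.astype(int)))
            if diff < SIMILARITY_THRESHOLD:
                continue  # effectively a duplicate frame, drop it
        frames.append(cur)
        prev = cur
    return frames
```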

the sensor on my device is rotated by 90°, so I need to exchange x and y in "elanspi_lookup_pixel_with_rotation"

  • You would also need to swap references to sensor_width and sensor_height in various places (although I guess it doesn't matter since the 80SC is square); I'm probably just going to add a frame_width/frame_height.
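The coordinate mapping itself is simple; here is a sketch in Python (the real elanspi_lookup_pixel_with_rotation is C, and its exact signature isn't shown in this thread) of reading an output pixel from the raw frame for the possible mounting rotations:

```python
def lookup_pixel_with_rotation(frame, x, y, rotation):
    """Read output pixel (x, y) from a raw frame (a list of rows) mounted
    at 0/90/180/270 degrees. For the square 80x80 sensor the swapped
    width/height coincide, which is why the swap is easy to miss."""
    h, w = len(frame), len(frame[0])
    if rotation == 0:
        return frame[y][x]
    if rotation == 90:
        return frame[x][w - 1 - y]          # transpose + horizontal flip
    if rotation == 180:
        return frame[h - 1 - y][w - 1 - x]
    if rotation == 270:
        return frame[h - 1 - x][y]          # transpose + vertical flip
    raise ValueError("unsupported rotation")
```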

whether or not to rotate by 180° might be better implemented by setting the flag "FPI_IMAGE_V_FLIPPED". I guess for stitching it might also be relevant whether the finger is scanned top down or bottom up (but I do not care).

  • IMG_V_FLIPPED only works for the entire fingerprint image after stitching, it still has to be oriented the right way around before then.

these fp_dbg calls in __ssm_call_handler are really spamming the hell out of it. Commented them out.

  • You should be able to turn off the dbg calls in the SSM with environment vars, but the examples (afaik) force-set them on; I've just been piping the output through grep to hide them.

Yup: somehow it conflicts with
"#define IMG_ENROLL_STAGES 5" in "fp-image-device-private.h"
Setting that to 7 makes it run to the end. I guess one could parameterize the code within fp-image-device.c and fpi-image-device.c so it does not refer to this define and instead uses the class's nr_enroll_stages.

  • I fixed the "not running 7 stages" thing upstream in !262. I'm not sure what causes the "deactivating while it is not idle" thing, probably I'm not calling some callback in the right place.

@DupiDachs

Thanks for the response. :)
I find the overall performance with the stitching has improved quite a bit. 👍 I'll go with this implementation then, plus some small modifications.

Also, I did quite some digging into OpenCV and made some progress on matching a single read within a full finger as a replacement/extension for the minutiae matching in libfprint. Still not done yet ...

@mincrmatt12
Owner Author

Cool. If you do make any progress on a better fingerprint matching system you might want to consider contributing it upstream into libfprint as a whole. There's been some prior work on the subject in https://gitlab.freedesktop.org/libfprint/libfprint/-/issues/272 and https://gitlab.freedesktop.org/libfprint/libfprint/-/issues/271

There's also a publicly available fingerprint dataset that was mentioned in https://gitlab.freedesktop.org/libfprint/libfprint/-/merge_requests/198.

@mincrmatt12
Owner Author

Alright, I've pushed some more code that should do some delaying to reduce the number of frames being taken, as well as properly get rid of duplicate frames.

@Aiiaiiio if you could test it again at some point that'd be great

@DupiDachs in theory it should also now be rotating by 90 degrees for your sensor

@Aiiaiiio

Here are my latest images. I noticed that with my index finger enroll/verify works very well, but with my pinky it's very hard to verify. Would you like me to attach the enrolled and verify PGMs as well?

image
image
image

@mincrmatt12
Owner Author

No, those images look pretty good as-is (even with the annoying blobs at the top of the image, there's now enough actual finger that the minutiae detector should just discard those bits). Could you attach the output (not the images, just the stdout) of verify-ing with both a finger you enrolled and one you didn't? I'm curious as to how much libfprint thinks they match (and also I want to make sure I won't need to lower the finger match (bz3) threshold for the smaller sensors)

@DupiDachs

Thanks for the update, I gave it a go. Code is looking better, but the merge was annoying with all these formatting changes (CLANG?). ;)

whether or not to rotate by 180° might be better implemented by setting the flag "FPI_IMAGE_V_FLIPPED". I guess for stitching it might also be relevant whether the finger is scanned top down or bottom up (but I do not care).
IMG_V_FLIPPED only works for the entire fingerprint image after stitching, it still has to be oriented the right way around before then.

I took another look at the code and I think you do not have to do the 180° rotation for each frame. You can use the FPI_IMAGE_V_FLIPPED flag. Take a look at "fpi-assembling.c", routine "fpi_do_movement_estimation": in there, it stitches the image either bottom up or top down. So in the end, it should be OK to just flip it with the flag. Maybe my reasoning is off? I tried it with both 90° left and right and got the same working result (but rotated).
Anyways, the 90° implementation is nice. Thanks for that!

@mincrmatt12
Owner Author

mincrmatt12 commented Apr 11, 2021

Your logic is sound, but you're forgetting that I'm cropping the full image (which improves stitching, since you get fewer false overlaps too far down the image if the finger is highly self-similar), so that still needs to know which way up the sensor is.
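To make the orientation dependence concrete, a hedged sketch (the row count and the idea of keeping only the leading rows are illustrative assumptions, not the driver's exact crop):

```python
def crop_frame_for_stitching(frame, sensor_flipped, keep_rows=24):
    """Keep only the rows nearest the edge the finger slides in from,
    so the crop has to know which physical edge that is."""
    if sensor_flipped:
        return frame[-keep_rows:]   # finger enters from the bottom edge
    return frame[:keep_rows]        # finger enters from the top edge
```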

(also the format stuff is because upstream libfprint wants it that way)

@Aiiaiiio

I was going to give you the outputs you requested, but something happened and now I can't get a match. I pulled your latest commit (elanspi: more agressive fp detect constants). The verify image is a bit messy, but I think a match should be possible regardless.

@mincrmatt12
Owner Author

Hmm, perhaps I made them too aggressive. The logs would still be useful, though.

@Aiiaiiio

Here's the output of a failed verification. Please tell me if you need anything else!

r_index_verification_failed.txt

@mincrmatt12
Owner Author

The output looks ok, maybe the image it's trying to match against is just bad. You could try removing test-storage.variant and then re-enroll/verify to see. Just to clarify, though: it was working fine before the latest commit?

@DupiDachs

With the new implementation, you scale the image by a factor of two. What is the reason for this?

@mincrmatt12
Owner Author

It makes the minutiae detector not remove as many potential features.
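For intuition, here is a minimal 2x upscale in NumPy (nearest-neighbour pixel duplication here; the driver's exact interpolation isn't stated in this thread):

```python
import numpy as np

def upscale_2x(image):
    """Double both dimensions of an 8-bit grayscale image by repeating
    each pixel, so small ridge features survive the minutiae detector."""
    return np.repeat(np.repeat(image, 2, axis=0), 2, axis=1)
```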

@Aiiaiiio

So, I took a little time to properly answer your questions.
Removing test-storage.variant didn't help.
I rolled back a few commits to check whether it was working earlier, and I think I misinterpreted the output; verify never worked. Sorry about that.

The enrolled image is always strange (crooked and only a part of the image is saved):
image

And this is the verify image. It is possible to identify the common parts of the images, but still, the verify image looks way better:
image

@mincrmatt12
Owner Author

Hmm, the only reason I can imagine for it being crooked is too much horizontal motion while you were swiping during enroll.
I'm surprised it's giving you a flat out 0 score for the match though, they do look somewhat similar. Does it ever get above 0? It's possible that it does work somewhat and the small sensors just need a lower threshold or something.

@Aiiaiiio

Sorry, I was away for a few days.
So, I tried to keep my finger as straight as possible, and I swiped slowly and got better images (attached). The scores are still 0, they never go above that. If I place the two images above each other in gimp, they do have a lot of features in common.

image

image

@DupiDachs

I finally have a working prototype in Python for fingerprint matching, based on AKAZE feature matching in OpenCV (so no minutiae) and the binarization coming from libfprint. Also, I have now set up libfprint (meson) to compile with OpenCV, plus at least a stub for a simple OpenCV test. A rough sketch of the matching idea is below.
But I guess translating the whole thing to C++ will take another couple of weeks, also considering that I am not sure how exactly to integrate it into libfprint.
The biggest issue is the stitching (not only in the y but also in the x direction) needed to get a full fingerprint to match against. That step, which is required for enrolling, was done manually in advance (I used PTgui, but I guess others would work too). Ideally, this would also be integrated into libfprint.
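As mentioned above, a rough sketch of AKAZE-based matching with OpenCV in Python (this is not the code from the featureMatch branch; the file names and the ratio/score thresholds are placeholders):

```python
import cv2

def akaze_match_score(enrolled_path, probe_path, ratio=0.8):
    """Count AKAZE feature matches between an enrolled print and a probe
    frame, using brute-force Hamming matching plus Lowe's ratio test."""
    enrolled = cv2.imread(enrolled_path, cv2.IMREAD_GRAYSCALE)
    probe = cv2.imread(probe_path, cv2.IMREAD_GRAYSCALE)

    akaze = cv2.AKAZE_create()
    _, des1 = akaze.detectAndCompute(enrolled, None)
    _, des2 = akaze.detectAndCompute(probe, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    matches = matcher.knnMatch(des1, des2, k=2)
    good = [m for m, n in matches if m.distance < ratio * n.distance]
    return len(good)

# A verify decision could then just threshold the score, e.g.
# akaze_match_score("enrolled.pgm", "probe.pgm") >= 20
```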

Next step is to clean up and put it to github. Any suggestions welcome after that. :)

@DupiDachs

OK. It is done. 🎉

https://github.com/DupiDachs/libfprint/tree/featureMatch
I'll try to put the required information there and also drop by at gitlab.

@mincrmatt12
Owner Author

Looks really nice! Although I doubt it'll be merged in its current state (with the whole manual enroll/merging thing), it looks like an excellent approach for matching. Have you tried testing it with any other datasets to get an idea of the false positive/negative rate?

@DupiDachs

Yes, I agree. Without a proper enrolling procedure it is really annoying and does not make sense to merge into master. It took a lot of time just to do that. I guess I could program something for the enrolling based on OpenCV, but I rather enjoy the current state on my machine for now. :D
No deep testing done (except using my own hands). Never had a false positive. From time to time some false negatives, but that's no problem as fprintd immediately retries.
