I think the combination of two things actually makes this slightly difficult to defeat:
The app will take a video and look for movement, so a static photo won’t cut it.
They apparently flash the screen red, green, and blue, which lets them distinguish reflective surfaces (a live face) from emissive ones (a screen). So you can’t just point it at a video of an old person, because no suitable reflective colour displays exist yet.
There are a few ways I can think of to circumvent it:
Write an app that displays a video and simultaneously averages the colour of the front-facing camera feed, applying that average as a filter to the video to emulate a reflective display. There would be some lag, but I bet it would work.
Use the Android emulator: read the screen colour directly, use it to filter the virtual camera input, and feed an AI-generated video in as that camera. I dunno how detectable the Android emulator is these days, though. Probably the age verification apps can detect it fairly easily.
Find a homeless person and pay them £2 to look at your phone.
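The first trick above (tinting a video with the ambient colour the front camera sees) can be sketched in a few lines. This is purely illustrative: it uses NumPy on single frames, with a made-up `tint_to_match` helper and an arbitrary mid-grey normalisation constant, standing in for whatever real-time camera pipeline an actual app would need.

```python
import numpy as np

def tint_to_match(video_frame: np.ndarray, camera_frame: np.ndarray) -> np.ndarray:
    """Tint a video frame so its colour tracks the light seen by the
    front camera, roughly emulating a reflective display.
    Both frames are uint8 arrays of shape (H, W, 3) in RGB order."""
    # Per-channel average colour of whatever the front camera sees.
    ambient = camera_frame.reshape(-1, 3).mean(axis=0)
    # Normalise so a neutral mid-grey scene (128) leaves the video unchanged.
    gain = ambient / 128.0
    tinted = video_frame.astype(np.float64) * gain
    return np.clip(tinted, 0, 255).astype(np.uint8)

# Example: the verifier flashes pure red, so the front camera sees red light.
red_flash = np.zeros((4, 4, 3), dtype=np.uint8)
red_flash[..., 0] = 255                            # R channel maxed
video = np.full((4, 4, 3), 128, dtype=np.uint8)    # mid-grey video frame
out = tint_to_match(video, red_flash)
# The output keeps only the red channel, as a reflective surface would.
```

The lag the comment mentions would come from the camera capture and per-frame processing; whether the averaging is good enough to fool the reflectance check is an open question.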
How does that help verify age? I could take a selfie of the sample photo that comes with a picture frame, for God’s sake.