It’d be a shame if someone used AI to generate an image to upload. You’d just be feeding AI data into AI, and that could ruin what Facebook’s trying to do.
a lot of the big names in image generation fingerprint the images they output to make them easy to filter out of training data. there's a possibility they could detect this fairly easily.
Is that the sort of thing that would survive screenshotting and messing with the color balance?
I’m not familiar with the exact techniques used but I am aware that there are ways of fingerprinting images that would survive such transformations.
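One family of techniques that behaves this way is perceptual hashing: instead of hashing exact bytes, you hash the relative structure of the image, which survives things like brightness tweaks. A minimal sketch of a difference hash (dHash), assuming the image has already been downscaled to a 9×8 grayscale grid (real code would use a library like Pillow for the resize step):

```python
# Minimal difference-hash (dHash) sketch. The fingerprint only records
# whether each pixel is brighter than its right-hand neighbor, so a
# uniform brightness shift leaves the hash unchanged.

def dhash(grid):
    """grid: 8 rows of 9 grayscale values -> 64-bit int fingerprint."""
    bits = 0
    for row in grid:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# Toy 9x8 "image" (a synthetic gradient pattern, not a real photo).
img = [[(37 * x + 13 * y) % 256 for x in range(9)] for y in range(8)]

# The same image after a brightness boost, clamped to the 0-255 range.
brightened = [[min(255, v + 40) for v in row] for row in img]

print(hamming(dhash(img), dhash(brightened)))  # → 0 (fingerprint unchanged)
```

This is only an illustration of the robust-fingerprint idea, not whatever scheme the image generators actually ship; production watermarks embed signals in the frequency domain and are built to survive much harsher transforms, including re-photographing a screen.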
Typically, I think that requires the photo to be modified at creation time for the purpose of identifying it later. I don’t think any old photo can be retrofitted with such a capability post hoc.
Or even taking a picture of the screen with a phone?
Or just upload a picture of the Unabomber.