which IMO is a bit silly - Meta can scrape, and probably is scraping, all the available public information anyway; defederating doesn't really fix that
If they’re federated everything gets sent to them automatically.
If they’re not, they only get the info users see, and it’s a hassle to compile, index, and store. Like they could keep a running index of every user page, but why would they?
Easy to spin up a scraper server that isn’t Threads to collect data.
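To illustrate how low that bar is: public posts on ActivityPub servers are served as plain JSON "outbox" collections, so a scraper just needs an HTTP GET plus a JSON parser, no federation handshake at all. Here's a minimal sketch that parses a sample outbox page (the sample document and function names are made up for illustration, but the document shape follows the ActivityPub spec):

```python
import json

# Hypothetical sample mimicking one page of a public ActivityPub outbox.
# A real scraper would fetch this with a plain HTTP GET from any instance.
sample_outbox_page = json.dumps({
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "OrderedCollectionPage",
    "orderedItems": [
        {"type": "Create",
         "object": {"type": "Note", "content": "Hello fediverse"}},
        # Boosts reference the object by URL rather than embedding it.
        {"type": "Announce", "object": "https://example.social/notes/123"},
    ],
})

def public_notes(outbox_json: str) -> list[str]:
    """Pull the text of embedded public Note objects out of one outbox page."""
    page = json.loads(outbox_json)
    notes = []
    for activity in page.get("orderedItems", []):
        obj = activity.get("object")
        if isinstance(obj, dict) and obj.get("type") == "Note":
            notes.append(obj["content"])
    return notes

print(public_notes(sample_outbox_page))  # -> ['Hello fediverse']
```

The point being: everything a defederated instance still shows to a logged-out visitor is this easy to collect, which is why defederation alone doesn’t keep data away from a determined scraper.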