FaceDeer
Basically a deer with a human face. Despite probably being some sort of magical nature spirit, his interests are primarily in technology and politics and science fiction.
Spent many years on Reddit before joining the Threadiverse as well.
- 0 Posts
- 14 Comments
FaceDeer@fedia.io to
Technology@lemmy.world•Google Revisits JPEG XL in Chromium After Earlier Removal
16 · 4 days ago
It works because the .png and .jpg extensions are associated on your system with programs that, by coincidence, are also able to handle WebP images, and that check the binary content of the file to figure out what format it actually is.
If a program associated with .png on some system doesn’t know how to handle WebP, or trusts the file extension when deciding how to decode the file’s contents, it will fail on these renamed files. This isn’t a reliable way to “fix” these sorts of things.
FaceDeer@fedia.io to
Technology@lemmy.world•Google Revisits JPEG XL in Chromium After Earlier Removal
341 · 4 days ago
So it’s basically “nobody wants to use it because nobody is using it.”
I actually rather like it, and at this point many of the tools I use have caught up so I don’t mind it any more myself.
FaceDeer@fedia.io to
Technology@lemmy.world•Gmail users warned to opt out of new feature - what we know
1 · 7 days ago
> Yes, they are. Not sure why you are bringing that up.
I am bringing it up because the setting Google is presenting only describes using AI on your data, not training AI on your data.
FaceDeer@fedia.io to
Technology@lemmy.world•Gmail users warned to opt out of new feature - what we know
11 · 7 days ago
Yes, exactly. Training an AI is a completely different process from prompting it; it takes orders of magnitude more work and can’t be done on a model that’s currently in use.
FaceDeer@fedia.io to
Technology@lemmy.world•Gmail users warned to opt out of new feature - what we know
1 · 8 days ago
Yes, but the point is that granting Google permission to manage your data with AI is a very different thing from training the AI on your data. You can do all the things you describe without also having the AI train on the data; in fact, training on the data would be a substantial amount of extra work.
If the setting isn’t specifically saying that it’s there to let them train AI on your data, then I’m inclined to believe that’s not what it’s for. They’re very different processes, both technically and legally. I think there’s just some click-baiting going on here with the scary “they’re training on your data!” accusation; it seems to be baseless.
FaceDeer@fedia.io to
Technology@lemmy.world•Gmail users warned to opt out of new feature - what we know
21 · 8 days ago
> Understand that basically ANYTHING that “uses AI” is using you for training data.
No, that’s not necessarily the case. A lot of people don’t understand how AI training and AI inference work; they are two completely separate processes, and doing one does not entail doing the other. In fact, a lot of research is currently going into making it possible to do both at once, because that would be really handy, but it can’t really be done yet.
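As a toy illustration of that separation (a minimal sketch with a made-up logistic-regression model, nothing like a real LLM or Google’s actual stack, but the split between the two processes is the same):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(3,))  # the model's weights

def predict(W, x):
    """Inference: a forward pass. Reads the weights, never changes them.
    This is all that happens when an AI 'manages' or answers over your data."""
    return 1.0 / (1.0 + np.exp(-x @ W))

def train_step(W, x, y, lr=0.1):
    """Training: compute a gradient and produce *updated* weights.
    A separate, far more expensive process, run offline on a copy of the model."""
    grad = (predict(W, x) - y) * x
    return W - lr * grad

x = np.array([1.0, 2.0, -1.0])
y_pred = predict(W, x)      # serving a query: W is untouched
W2 = train_step(W, x, 1.0)  # training: yields new weights, W2 != W
```

The point of the sketch: nothing in `predict` writes to the weights, so a deployed model can process your data all day without that data influencing the model at all.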
> And if you read any of the EULAs
Go ahead and do so, they will have separate sections specifically about the use of data for training. Data privacy is regulated by a lot of laws, even in the United States, and corporate users are extremely picky about that sort of stuff.
If the checkbox you’re checking in the settings doesn’t explicitly say “this is to give permission to use your data for training,” then it probably isn’t doing that. There might be a separate checkbox somewhere, or it might just be a blanket thing covered in the EULA, but “tricking” the user like that wouldn’t make any sense; it doesn’t save them any legal hassle to do it that way.
FaceDeer@fedia.io to
Technology@lemmy.world•Gmail users warned to opt out of new feature - what we know
1 · 8 days ago
I’m not seeing where any of this gives Google permission to train AI using your data. As far as I can see it’s all about using AI to manage your data, which is a completely different thing. The word “training” appears to originate in Dave Jones’ tweet, not in any of the Google pages being quoted. Is there any confirmation that this is actually happening, and not just a social media panic?
FaceDeer@fedia.io to
World News@lemmy.world•AI slop tops Billboard and Spotify charts as synthetic music spreads
0 · 16 days ago
When you watch a movie with computer-generated special effects, are you happy watching a lie?
FaceDeer@fedia.io to
World News@lemmy.world•AI slop tops Billboard and Spotify charts as synthetic music spreads
0 · 16 days ago
If 97% of listeners can’t tell the difference, then I’m not sure I see the point of quibbling over the definition.
FaceDeer@fedia.io to
World News@lemmy.world•AI slop tops Billboard and Spotify charts as synthetic music spreads
0 · 16 days ago
Unless you enjoy music regardless of whether it’s AI generated, in which case the future’s going to have way more options.
FaceDeer@fedia.io to
Technology@lemmy.world•Amazon to replace 600,000 US workers by 2033 with robots
0 · 1 month ago
Isn’t this what all those artists were insisting they wanted? AI replacing rote labor instead of their “creative” jobs like producing advertising and concept art?
FaceDeer@fedia.io to
Technology@lemmy.world•Perplexity offers to buy Google Chrome for $34.5 billion
01 · 4 months ago
Isn’t something that you don’t respond to, sure. And here in social media you’re surrounded by like-minded people. But perhaps AI is more popular with the general public than you think?

Indeed. Although it’s painful to see these things happening at such crucial junctures to such crucial individuals, overall it’s a good thing that corruption is being rooted out. Rooting out corruption will of course expose a bunch of corruption along the way, but better that it be exposed than left to continue festering.
It would have been lovely to flip a magical switch the moment Ukraine left Russia’s orbit that made corruption go away, but that switch doesn’t exist. Corruption probes and prosecutions are what will cause the change to happen.