In what way are both sides unhappy?
Except her pre-transition fastest 1000 free was faster than the record for female 1000 free.
To add to that: her pre-transition time was ~24 seconds slower than the male record, and post-transition her 1000 free was about 32 seconds slower than the female record. So if anything she was performing better in her category before she transitioned.
Bloodborne 60fps
I mean prolly. The chances of you never having caught it by this point are tiny
If you boot it up they give you the order to play it in. There is only 1 problem and it’s at the beginning. They tell you to watch 358/2 before playing 2
DO NOT DO THAT IT SPOILS A MAJOR PLOT IN PART 2
Play them in this order: 1 → Chain of Memories → 2 → 358/2 Days
They originally did for the beta (for origin characters at least) but the players didn’t really like it so the feature was removed
Metal Gear Solid Peace Walker
Patapon 2/3
Persona 2 innocent sin
Persona 2 Eternal Punishment (look for the English translation of the PSP version) (IS and EP are 2 parts of the same story, so you have to play them in that order)
Persona 3 Portable (P3P is more beginner friendly than P2, but imo the P2 duology is better)
God of War (chains of Olympus and ghost of Sparta)
Kingdom hearts birth by sleep
GTA liberty city stories and vice city stories
Ackshually…
The GameCube was stronger than the PS2; it was just limited by the disc format. Compare Resident Evil 4 on the two. The GameCube version had hair physics for Leon and the cutscenes were rendered in real time, while they had to pre-render them on PS2 (which also made alternate costumes better on GameCube, since Leon would wear them during cutscenes).
Is anyone asking for that though? To make it illegal to have regular pictures of children in these datasets?
I was responding to this part of your comment which directly refers to legality
No but it is a reason why generating csam should be illegal. You’re using data trained on pictures of real kids
At that point it’s still using photos of children to generate CSAM, even if you could somehow ensure the model is 100% free of CSAM.
If you don’t understand that then I’m done here because you either don’t understand what “ai” does on a fundamental level or you don’t understand how big the difference is between adult and child bodies.
This is a gross conversation to be having about something that is so wrong to do on so many levels.
You can’t make an ethno state without genocide so it is wrong and pointless to talk about
You can’t make ai csam without harming a child so it is wrong and pointless to talk about
Just like how you can’t generate a child without pictures of children to base it on you can’t generate them naked without pictures of their bodies. There is a reason pedos are attracted to those bodies and not women with no curves/small men.
I work with children, I see them every day. The difference is so massive that an AI would not be able to approximate it with just photos of adults. AI doesn’t “know” anything; it just has photos that it uses to approximate what is being asked, based on its data. Even if you kept describing in more detail what those bodies looked like, it wouldn’t be able to create it without anything to base it on. It’d be like creating a Van Gogh style picture with no Van Gogh training data; no matter how much you try to describe the details of his style, you’ll never get the AI to make something like it without the training data.
You can keep disagreeing, keep saying “but with more data,” but AI can’t make anything original; that is a fundamental misunderstanding of its abilities. If it doesn’t have the data it can’t accurately do it.
I don’t see a reason to discuss whether it’s possible to do something if the thing being done is morally wrong. If you disagree, then let’s talk about making a white ethnostate, or whether we could do another Holocaust, since morality doesn’t matter when discussing hypotheticals.
You can’t generate csam without photos of children to make up the actual child part of the picture. It doesn’t matter if you actually use csam you’re still using photos of children to make pornography. Unless you think ai could create a van gogh style picture without any van gogh training data (and if you do then you don’t know enough about ai generated photos to talk about them with any authority)
It’s obviously accidental, but that doesn’t change that it happened and is something that will be near impossible to avoid as long as they continue to scrape data in the way they do for their models. They would need a human to filter it out like they already use for most LLMs.
The bodies of children are not just small versions of adult bodies. There are meaningful differences that an AI wouldn’t be able to just guess. Also, do you not see any problem in using photos of real children to generate CSAM? Imagine someone used a picture of your child/niece/nephew to generate porn. Does that not feel wrong to you? It’s still using real photos of real children either way, even if it’s abstracted through training data.
Except you can’t know that. CSAM has been found in training data already and as long as they pull from social media they will continue to be trained with more.
CSAM is in the training data. From a few months ago:
Has your model seen humans in a profile view? Has it seen armor? Has it seen Van Gogh style paintings? If yes then it can create a combo of those things.
For CSAM it needs to know what porn looks like, what a child looks like and what a naked pubescent body looks like to create it. It didn’t make your van Gogh painting from nothing it had an idea of what those things were.
What game? Love me some old HL/source mods.