• 0 Posts
  • 166 Comments
Joined 1 year ago
Cake day: June 22nd, 2023

  • The overall median lifetime earnings are $1,850,000 for men and $1,100,200 for women. Let’s just take the average and say an average American earns $1,475,100 in their lifetime.

    The important thing to remember is that in an unequal system, where workers have most of the value of their work taken by a single person whom the system disproportionately favors, that value is translatable to literal life. They are directly, inexorably going to die having had that value simply transferred to the other person or people who collect it. Or put succinctly, they are giving up life, and the “owner” of the business is gaining the value of their lives.

    Another note is that even though most of these valuations are stock, stock valuations do not exist in a vacuum. The stock market is the realizable increase in productivity value that we have all collectively created.

    So based on that principle, just for fun, let’s convert these fortunes to human lives, to better understand just how much (economically-valued) life force these people have taken from people:

    Elon Musk: $262,000,000,000 = 177,615 American lives.

    Jeff Bezos: $208,000,000,000 = 141,007 American lives.

    Mark Zuckerberg: $203,000,000,000 = 137,617 American lives.
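
    For anyone who wants to check the arithmetic, here’s a quick sketch in Python (the figures are the ones above; the variable names are just illustrative):

    ```python
    # Back-of-the-envelope: convert a net worth into "American lives,"
    # using the median lifetime earnings figures quoted above.
    MEDIAN_MEN = 1_850_000     # median lifetime earnings, men (USD)
    MEDIAN_WOMEN = 1_100_200   # median lifetime earnings, women (USD)
    AVG_LIFETIME = (MEDIAN_MEN + MEDIAN_WOMEN) / 2   # $1,475,100

    fortunes = {
        "Elon Musk": 262_000_000_000,
        "Jeff Bezos": 208_000_000_000,
        "Mark Zuckerberg": 203_000_000_000,
    }

    for name, worth in fortunes.items():
        lives = int(worth / AVG_LIFETIME)  # truncate to whole lives
        print(f"{name}: ${worth:,} = {lives:,} American lives")
    ```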


  • This is really unprovable, but my theory is that this is another result of late-stage capitalist exhaustion. While young people still want to be ethical, moral, and safe, there’s a lot of moment-to-moment existential rebellion against so many layers of rules, norms, and expectations.

    It’s similar to the rise of “treat” habits - if there’s no realistic possibility of the American dream, with the house and white picket fence and kids, on an average worker’s salary, you have a moment of probably irresponsible spending that feels life-affirming, to shake off the feeling of being in a Matrix pod that’s sucking out your life force in the most efficient manner possible.

    Hence, no condom! Or something.



  • The funny thing is this will do absolutely nothing to prevent a sitewide protest. There are so many ways for mods to effectively destroy a subreddit or redirect it while remaining public.

    In fact, and this is the important blindness that Reddit continues to have, the mods usually need to work hard daily just to keep a sub usable. Reddit is so dismissive of that effort and so brazenly presumes upon their volunteer labor that they seem to think subs just continue on sheer momentum, if only they could stop mods from sabotaging them.

    Daily mod posts pointing to a new community on Lemmy or elsewhere, no longer running bot-removal tools, no longer culling trolls, marking subs NSFW, etc. will do the job.



  • It’s hard to have a nuanced discussion because the article is so vague. It’s not clear what he’s specifically been charged with (“obscenity,” apparently, rather than a specific child abuse statute?), and as far as I know, simulated-CSAM laws have all been struck down when challenged.

    I completely get the “lock them all up and throw away the key” visceral reaction - I feel it too, for sure - but this is a much more difficult question. There are porn actors over 18 who look younger; do these laws bar them from work that would be legal for actors who merely look older? And if an AI model was trained exclusively on those over-18 people, would its outputs then not be CSAM even if the generated images look under 18?

    I’m at least all for a “fruit of the poisoned tree” theory - if an AI model’s training data includes actual CSAM, then the model can and should be made illegal. Intentionally deepfaking real under-18 people is also not black and white (looking again to the harm factor), but I think it can be justifiably prohibited. I also think distribution of completely fake CSAM can arguably be outlawed (the situation here), since it will soon be impossible to tell AI imagery from real imagery, and allowing it would undermine enforcement of vital anti-real-CSAM laws.

    The really hard case is producing and retaining imagery of fully fake people, without real CSAM in the training data, solely locally (a possession crime). That’s genuinely tough. Not only does it not directly hurt anyone in its creation; there’s a possible benefit in that it diminishes the market for real CSAM (potentially saving unrelated children from the abuse flowing from that demand), and it could also divert the producer’s impulse away from preying on children around them due to unfulfilled desire.

    “Could,” because I don’t think there are studies that answer whether those things are true.