• 0 Posts
  • 87 Comments
Joined 9 months ago
Cake day: February 13th, 2024




  • Using IP laws to legislate this could also lead to disastrous consequences, like the monopolization of effective AI. If only those with billions in capital can make use of these tools, while free or open source models become illegal to distribute, it could mean a permanent power grab: the capitalists would end up controlling the “means of generation” while we common folk can’t use it.



  • The US is indeed in a very good position, with only two land borders and two oceans between itself and everyone else. They just need to get Mexico to mine its southern border while they mine their own.

    But Europe, Russia, China and India have plenty of land borders and won’t be able to escape refugee flows or conflicts. Large parts of India might become uninhabitable. Food prices are going to fluctuate. Global trade will become unstable or collapse, disabling the complex globalized industrial economy. Nuclear war is very likely. People still don’t know what’s coming.







  • It’s a question of math. Men have on average 60% more upper body strength and longer arms, so it’s rather easy for any man to murder almost any woman with his bare hands. Obviously this led to various social adaptations, which could perhaps be modeled mathematically through game theory. There are always exceptions, but the vast majority of domestic murders, around 90%, are committed by the male partner.

    So here we are after thousands and thousands of years of evolution. Equating the sexes or oversimplifying this topic is a major tactic of the fascists (male supremacists).


  • https://join-lemmy.org/donate

    It would be nice to have a Patreon-like monthly support option with open accounting, so we know how the money is split between development, instance server hosting costs, and maybe admin wages. Or maybe we could vote on it. I think the fediverse is only the first step; we’re going to need some kind of global non-profit, funded by users, to create federated software and content for users.



  • Flumpkin@slrpnk.net to Lemmy Shitpost@lemmy.world · AI or DEI? · 8 months ago

    You’re just rephrasing the same approach, over, and over, and over. It’s like you’re not even reading what I’m saying.

    No, I read what you’re saying. I just think that you are something that “acts intelligent without actually being intelligent”. Here is why: everything you’ve written is based on very simple, primitive brain cells, synapses and synaptic connections. It’s self-evident that this is not really something designed to be intelligent. You’re just “really good at parroting sentences”. And you clearly agree that I’m doing the same 😄

    Clearly LLMs are not intelligent and don’t understand, and it would take many other systems to make them so. But what they do show is that the “creative spark”, mediocre as it is in quality, can be created through a critical mass of quantity. It’s like just one small part of our mind, the “creative writing center”, without intelligence. But it’s there, just because we added more data and processing.

    Quality through quantity: that is what we seem to be, and that is what is so shocking. And it’s obvious that there is a kind of disgust or bias against such a notion, a kind of embarrassment of the brain at just being thinking meat.

    Now you might be absolutely right that my specific suggestion for an approach is bullshit; I don’t know enough about it. But I am pretty sure we’ll get there without understanding exactly how it works.


  • And how do you determine who falls in this category? Again, by a set of parameters which we’ve chosen.

    Sure, that is my argument: that we choose to make social progress based on our nature and scientific understanding. I never claimed some 100% objective morality; I’m arguing that even though that does not exist, we can make progress. Basically I’m arguing against postmodernism / materialism.

    For example: if we can scientifically / objectively show that some people are born in the wrong body and it’s not some mental illness, and that this causes suffering we can alleviate, then moral arguments against doing so become invalid. Or, like the gif says, “can it”.

    I’m not arguing that some objective ground truth exists, but that the majority of healthy human beings, IF they are not tainted, hold certain values that, when reinforced, gravitate towards some sort of social progress.

    You needn’t argue for the elimination of meaning, because meaning isn’t a substance present in reality - it’s a value we ascribe to things and thoughts.

    Does mathematics exist? Is money real? Is love real?

    If nobody is left to think about them, they do not exist. If nobody is left to think about an argument, it becomes meaningless or “nonsense”.


  • I’m not arguing for “one single 100% objective morality”. I’m arguing for social progress - maybe towards one of an infinite number of meaningful, functioning moralities that are objectively better than what we have now. Like optimizing or approximating a function that we know has no precise solution.

    And “objective” can’t mean some kind of ground truth set by e.g. a divine creator. But you can have objective statistical measurements of, for example, happiness or suffering, or an objective determination of whether something is likely to lead to extinction or not.



  • Yeah, I imagine generative AI as something like one small part of a human mind, so we’d need to create a whole lot more for AGI. But it’s shocking (at least to me) that it works at all just through more data and compute power; that you can make qualitative leaps just by increasing the quantity. Maybe we’ll see more progress now.