• 0 Posts
  • 13 Comments
Joined 1 year ago
Cake day: June 30th, 2023

  • I mostly agree with you but think it’s important to clarify that even with machine learning many humans can be replaced.

    To extend your metaphor, that library has always had a bunch of clerks sitting inside of it. They’ve been handling requests, finding books, and organizing them into a system that works to best serve that information.

    Now with machine learning, instead of having all of those clerks making the library run smoothly, they’ve effectively replaced 99% of all of the humans with an organizational system that serves content and helps find books even faster than a human would be able to.

    Slightly deeper: this machine learning replacement can also now mix and match bits of content. The human system before might have a request that looks like this - “I want information on Abrahamic Religion in Western Culture” so they’d gather up a ton of books and pass them to the person that requested info.

    In the new replacement system, the request could take bits and pieces from all of those books and present a mostly comprehensive overview of Abrahamic Religion in the West without having to run and fetch all of the books.

    Deeper yet, and the scary iceberg - today, someone still needs to write all of those books, and we as a society tend to trust information from those books (cited sources and all that), so humans are safe as the content authors, right? We've basically just built a super efficient organizational and content delivery system. But as we start to trust the new system and use it more, we're potentially seeing the system reference its own outputs instead of the source material…which creates a recursive feedback loop that degrades the information over time.

    We still need human content creation today, but the scary part (IMO) is when we treat these LLMs as general-purpose generative AI. LLMs are fallible - they can be wrong and often hallucinate - so when most people start blindly trusting these systems (many already do; look no further than the general confusion between the terms AI, machine learning, and LLM), we're going to drift further and further from generating new knowledge.
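    The recursive loop above can be made concrete with a toy simulation (my own sketch, not anything from the comment): a "model" that repeatedly re-learns from its own outputs, and by assumption favors its most typical answers, steadily loses the spread of the original human-written source material. The 1.5-sigma cutoff and sample sizes are arbitrary illustration choices.

    ```python
    import random
    import statistics

    random.seed(0)

    # "Books written by humans": samples from a rich source distribution.
    data = [random.gauss(0.0, 1.0) for _ in range(200)]

    stdevs = []
    for generation in range(10):
        mu = statistics.fmean(data)
        sigma = statistics.stdev(data)
        stdevs.append(sigma)
        # The next generation trains only on the previous model's outputs,
        # and (by assumption in this sketch) anything further than 1.5 sigma
        # from the mean gets dropped as "atypical" - the tails vanish.
        outputs = [random.gauss(mu, sigma) for _ in range(200)]
        data = [x for x in outputs if abs(x - mu) <= 1.5 * sigma]

    # The measured spread shrinks generation over generation: the system
    # narrows toward its own most typical answers.
    print(f"spread of generation 0: {stdevs[0]:.3f}")
    print(f"spread of generation 9: {stdevs[-1]:.3f}")
    ```

    The point isn't the specific numbers - it's that once the system's inputs are its own filtered outputs, variety only ever decreases.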

  • For sure. I sincerely don’t understand why anyone on a mostly anonymous Internet forum would need to define their own version of spirituality just to talk about personal experiences they think were spiritual.

    I’m not trying to be obtuse, super promise, but spirituality is inherently pretty subjective. We’re not all going to align. We don’t need to.

    There is a definition of spirituality, but its meaning to folks will differ.

    Again, I am not really spiritual (in my own meaning of the word), but I take issue with the proposed need to define it, because it almost feels, idk, gatekeepy?

    You and I probably have a different understanding of the meaning of spiritual. We don’t need to align those meanings for me to share my dumbass acid story that I found spiritual.

  • Only on acid - my buddies and I got lost in a maelstrom and clung to a raft to survive. Two of us woke up on a serene island and made a beautiful community with the indigenous peoples of the island.

    The other two found another island and created a futuristic industrialized society.

    The ideological differences eventually formed physically into a great barrier called The Schism. They began polluting our lands and forced us into a hundred-year war, and many lives were lost.

    Peace was found when emissaries from both tribes travelled to the caldera of the great volcano at the center of our island and met with the Keeper of the Scrolls, who revealed to us that The Schism was invisible. We took that to mean that the only thing truly separating our people was our perceived differences.

    But we were really, really trippin