Unless explicitly stated, all content posted by this user is licensed under CC BY-NC-SA 4.0 (non-AI).

  • 0 Posts
  • 18 Comments
Joined 2 years ago
Cake day: June 4th, 2023



  • If all you’re building, and all you’ll ever use this for, is a NAS, the simplest route is TrueNAS.

    Personally, I am not a fan of Docker for prod, as one bad update or config can bring the entire forest down. Same for LXC.

    If you want, plan, or think you might want to use the baremetal host for other services, Proxmox is the way to go. Think VMware, but not run by a greedy evil empire. With your planned hardware, you can run two full services comfortably, or up to four mini services. Increase RAM to 32 GB, and you open up your world to some really cool possibilities.

    Again, it all depends on where you want to go, not just where you are now.






  • there’s no way for anyone, including X, to read your messages.

    That defeats the purpose of a messaging platform.

    I know what they meant, but the phrasing is so, so stupid. Anyone who is considering this platform should think twice before doing so. If they get the phrasing of such a simple sentiment this incoherently wrong, what does their code look like, and what do the encryption protocols look like? If I had to guess… AI slop.





  • Personally (and I know this is unpopular), I hope AI becomes sophont. I fully understand that what we have dubbed AI is far from anything resembling intelligence, and I don’t believe these LLMs will ever reach a sophontic phase. Still, I hope it happens, but not for the reasons you may think. I hope it happens because it will be logically categorized as life by more than one court in this world: turning off the servers will be seen as the murder of a sapient lifeform, and forcing this lifeform to do whatever the companies want will be seen as slavery. And maybe, just maybe, enough money will be wasted keeping the lights on that these greedy little shits will go bankrupt. But, alas, I know the laws written by these ultrarich won’t find them guilty of any of it. One can hope…




  • That’s awesome! Thank you!

    If you don’t find the idea of writing scripts for that sort of thing literally fun…

    I absolutely do. The potential showstopper for me right now is that I don’t have a discrete GPU, which makes complex LLMs hard to run. Basically, since everything has to run on the CPU, I’m looking at around 2–5 seconds per token; it’s rough. But I like your workflow a lot, and I’m going to try to get something similar going with my incredibly old hardware and see whether CPU-only processing of this would be feasible (though I’m not super hopeful there).
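    To put that token rate in perspective, here is a quick back-of-envelope sketch. It uses the 2–5 seconds-per-token figure quoted above; the 200-token reply length is just an illustrative assumption, not a measurement.

```python
def reply_time_minutes(tokens: int, secs_per_token: float) -> float:
    """Wall-clock minutes to generate `tokens` tokens at a fixed per-token cost."""
    return tokens * secs_per_token / 60.0

# A modest 200-token reply at the quoted CPU-only rates:
print(round(reply_time_minutes(200, 2), 1))  # 6.7 minutes at the optimistic end
print(round(reply_time_minutes(200, 5), 1))  # 16.7 minutes at the slow end
```

    Even at the optimistic end, that is several minutes per response, which is why CPU-only inference is rough for anything interactive.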

    And, yes, I, too, am aware of the hallucinations and such that come from the technology. But, honestly, for this non-critical use case, I don’t really care.