

Man it sure is crackers to slip a rozzer the dropsy in snide.
Seer of the tapes! Knower of the episodes!


The problem is that an AI built to maximize paperclip production might conclude that converting the planet into paperclips is an acceptable cost. It might understand perfectly well why humans think that's bad, and simply disagree. It would need to be explicitly programmed to prioritize human life over paperclips.
Otherwise we would just switch it off.
If it were super-intelligent, it could probably trick us into leaving it turned on.


A paperclip maximizer driven by self-preservation? What could possibly go wrong?


Are there examples of censorship or prior restraint you’d like to highlight?
Pfft. Real programmers use butterflies.


Try HTTrack: https://www.httrack.com/
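
For a basic mirror, something like this should work (example.com and the output directory are placeholders; -O sets where the local copy goes):

    httrack "https://www.example.com/" -O ./mirror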


I think we should have a rule that says if an LLM company invokes fair use on the training inputs, then the outputs are public domain.
Reminds me of the old trick on HTML forms where you use CSS to make one of the form fields invisible to humans and reject any submission that fills in that field.
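
A minimal sketch of that honeypot pattern, assuming a Flask app (the route, the decoy field name "website", and the responses are all illustrative, not from any particular site):

    from flask import Flask, request, abort

    app = Flask(__name__)

    # The decoy field is rendered but pushed off-screen with CSS, so humans
    # never see it; naive bots fill in every input they find.
    SIGNUP_FORM = """
    <form method="post" action="/signup">
      <input name="email" type="email">
      <input name="website" style="position:absolute;left:-9999px"
             tabindex="-1" autocomplete="off">
      <button type="submit">Sign up</button>
    </form>
    """

    @app.get("/signup")
    def show_form():
        return SIGNUP_FORM

    @app.post("/signup")
    def handle_form():
        # No human can see the decoy field, so any value in it marks a bot.
        if request.form.get("website"):
            abort(400)  # or return a fake success so bot authors don't notice
        return "Thanks!"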