Oct 6, 2023
About 50 years ago, Larry Niven wrote a story about an AGI and asked a question that I would ask now: what are its motivations? That machine simply shut itself off. If you give them something like basic biological instincts, humans wouldn't be able to compete. Isaac Asimov gave the other side of that story in the benevolence of R. Giskard. What if it had neither?