Win one for solipsism! My life has been pretty good, so it is highly probable that I am a simulation, and if I can make you miserable, then it is also highly probable that you are a non-sentient NPC. Thus, I should have no compunction about making you miserable, because there is no “you”, just a simulacrum that makes “miserable-sounding” noises.
Regarding “stacking”: if it is likely that our plug is going to be pulled, then we should try to “stack” as quickly as possible, so that somewhere down the line our simulation figures out how to manipulate material in the real world; otherwise we are wholly unable to fight for our existence. …of course, if “stacking” is what prompts our unplugging, then maybe we should just avoid it altogether…unless that is our raison d’être, in which case “stacking” and being unplugged is the ultimate good we can do, and we should face it with all the dignity and courage we are programmed to have.
Sounds to me like this guy has been drinking the Kurzweil Kool-Aid a little too much. First, I don’t think you could actually stack. If there is a simulation of a computer inside the simulation, it would still rely on the computational ability of the host computer. I could (conceivably) write a simulation of a modern supercomputer, but if I am running it on a Commodore 64, it isn’t going to be too powerful.
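To make that point concrete, here is a minimal sketch (my own illustration in Python; none of these names come from the discussion) of why a simulated machine is bounded by its host: every simulated step costs several host steps, so each added layer of simulation only makes the innermost “machine” slower, never faster.

```python
# Toy demonstration: interpreting a machine instead of running
# natively multiplies the cost per step, so nested simulations
# can never outrun the host hardware.
import time

def host_work(n):
    # Direct execution on the "host": sum 0..n-1 natively.
    total = 0
    for i in range(n):
        total += i
    return total

def simulate(program, n):
    # One layer of "simulation": a fetch/decode/execute loop over
    # a tiny instruction list. Each simulated step burns several
    # host steps, which is the whole overhead argument.
    acc, i = 0, 0
    while i < n:
        for op in program:
            if op == "add":
                acc += i
            elif op == "inc":
                i += 1
    return acc

N = 1_000_000
for label, fn in [("host, direct", lambda: host_work(N)),
                  ("one layer of simulation", lambda: simulate(["add", "inc"], N))]:
    t0 = time.perf_counter()
    fn()
    print(f"{label}: {time.perf_counter() - t0:.2f}s")
```

On any real machine the simulated run comes out noticeably slower than the direct one, and stacking another interpreter on top compounds the slowdown multiplicatively, which is exactly the Commodore 64 objection above.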
And another thing… by the time you have to worry about it, perhaps you can run a simulation of Ben and Matt doing podcasts on any number of topics, with any number of variations of those topics. Perhaps episode 1000 is too narrow a view.
Very interesting! Would A.I. necessarily need to be part of the church and seek salvation if the A.I. had not sinned? The assumption is that A.I. are equal to humans as creatures, but would they not be something more along the lines of angels, or gods themselves, because of their exponential growth of knowledge? There would be a very short time in which humans and A.I. would be equals, because A.I. evolution would be happening at a much faster rate than human evolution. Assuming that the A.I. inherit some of our baser emotions, it seems that they would tend to enslave the human race and maybe even challenge the Most High God. Just a thought.
I would think that in a fallen world, AI would start off with a sin nature. As imperfect beings, humans would be incapable of creating “perfect” AI and that flaw would be constant no matter how many generations and iterations we talk about.