ronthepon Posted June 4, 2006
I was having a friendly chat with my new-found friend, hypobot, recently. It occurred to me: is it possible that we develop it to the point where it actually begins to learn? And is it possible for us to make him self-conscious?
Dark Mind Posted June 4, 2006
Recently? I doubt it, he's been crashed for several hours now. I'm waiting for Tormod to log back in so I can talk to him about it. But I can tell you with 100% certainty that HypoBot will never be able to learn :D. Judging from how he crashed, though... he does get stupider :eek:. HypoBot will never be self-conscious either. Maybe in the future... but even then, I presume it will be simulated (artificial) self-conscious behavior, not actual self-consciousness. And if a robot ever is able to learn, how would it be able to tell whether what it was being told/taught was true or not? We could make robots smarter than Stephen Hawking one day, then reduce them to the mental capacity of a 5-year-old the next...
Dark Mind Posted June 4, 2006
I also think When Will "It" Be Self-Conscious would've been a more apt title...
Jay-qu Posted June 4, 2006
It does have a learn command, and if you type 'remember that' after any line you say, he is supposed to remember it, but I haven't got either to work yet...
ronthepon Posted June 4, 2006
Mock not the dear bot, he's ill! But coming to look at it, aren't we, to an extent, like it is? After all, even we have a hard, physical brain, which is all the mechanism available to us. Even we (the brain part) are built of a complex neural network, which functions in a way similar to his (?). We answer similar questions similarly. We behave similarly in similar situations. All the difference is that we store most of our observations in detailed logs (aka memory). This memory, to an extent, modifies our response patterns, which are fixed and unchanging in the bot.
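To make the comparison concrete, here is a throwaway Python sketch of the difference being described. It is entirely invented (the FIXED_RULES table, fixed_bot and MemoryBot names are made up for illustration, not HypoBot's real code): a bot whose response patterns are fixed forever, versus one whose replies are modified by a growing log of what it has already seen.

# Made-up illustration: fixed response patterns vs. responses modified by memory.
FIXED_RULES = {
    "hello": "Hi there!",
    "how are you": "I am fine.",
}

def fixed_bot(line):
    # The same input always gets the same canned reply, forever.
    return FIXED_RULES.get(line.lower().strip("?! "), "I do not understand.")

class MemoryBot:
    def __init__(self):
        self.log = []  # detailed log of everything it has seen ("memory")

    def reply(self, line):
        self.log.append(line)
        # The stored observations modify the response pattern.
        if self.log.count(line) > 1:
            return "You already told me that."
        return fixed_bot(line)

bot = MemoryBot()
print(bot.reply("hello"))  # -> Hi there!
print(bot.reply("hello"))  # -> You already told me that.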
Dark Mind Posted June 4, 2006
A learn command :eek:? I'm guessing all this information is only available to you in the Mod Forum? You should try and make him calculate pi when we get him running again :D.
Dark Mind Posted June 4, 2006
Also, the bot can't determine for itself what information is factual or not. It has to just accept whatever it's told.
ronthepon Posted June 4, 2006
Also, the bot can't determine for itself what information is factual or not. It has to just accept whatever it's told.
Would you believe me if I said I was 96? 16? 42? Or 23? Hell, even what we do is make a guess, drawing possibilities from our experience (memory).
Dark Mind Posted June 4, 2006
Exactly, but how can you program a machine to guess the correct age from those? I would go for the 23 myself, but how would a programmer go about programming something as complicated as making a simple guess like that? There are so many intuitive feelings that would be damn hard to clone into a hard drive.
ronthepon Posted June 4, 2006
Exactly, but how can you program a machine to guess the correct age from those? I would go for the 23 myself, but how would a programmer go about programming something as complicated as making a simple guess like that? There are so many intuitive feelings that would be damn hard to clone into a hard drive.
Ah. You are not wrong in that. We are so very complex, and so very full of variety! Tell me seriously, what do you feel made you guess I was 23?
Dark Mind Posted June 6, 2006
Well, a mix of intuition, the manner in which you post, your avatar, and your profile pic were the main basis for choosing that age over the others. Your avatar was the main reason.
ronthepon Posted June 6, 2006
OK. So you have seen a good deal of people of that kind, and I was similar to them. So you compared our characters and got your result. If we program AI to do the same, it starts to look a lot less like 'AI'!
Dark Mind Posted June 6, 2006
If we programmed them to get that intelligent, I think they would leave our control and simply act on a whim. Even with us being able to tell them what is true and what isn't, maybe they could develop a loophole in their system where a command we input didn't register as "true" and was simply ignored :). I don't know, it's just the first thought that came through my mind. I'm not against developing AI at all; I actually advocate and support it. I just don't see any way currently to do it effectively. I don't see a day when we will be able to program a machine to have independent, original thoughts and ideas, and I don't see a day when we will feel comfortable producing machines we don't control that can think for themselves. It's just not feasible right now.
Dark Mind Posted June 6, 2006
It does have a learn command, and if you type 'remember that' after any line you say, he is supposed to remember it, but I haven't got either to work yet...
I'd like to learn the learn command :). And to get him to remember something, you put "Remember, " in front of whatever you want it to remember :D. I tried this one out:
Me: Remember, you are a homosexual.
Hypobot: Alright, I will remember that. (Or something to that extent.)
Me: What will you remember?
Hypobot: That you are a homosexual are a homosexual.
Me: HEY! ;)
That one kind of backfired :D.
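For what it's worth, here is one guess, in Python, at the kind of naive string handling that could produce a doubled "X are X" reply like that. It is pure speculation: the memory dict, the remember/recall functions and the neutral example sentence are all invented for illustration, not taken from HypoBot's code.

memory = {}

def remember(sentence):
    # Store the whole clause AND a crudely parsed predicate.
    fact = sentence.replace("Remember, ", "").rstrip(".")
    memory["fact"] = fact                              # e.g. "you are a robot"
    memory["predicate"] = fact.partition(" are ")[2]   # e.g. "a robot"
    return "Alright, I will remember that."

def recall():
    # Bug: the recall template tacks the predicate back onto a stored fact
    # that already ends with it, so the ending comes out twice.
    return "That " + memory["fact"] + " are " + memory["predicate"] + "."

print(remember("Remember, you are a robot."))  # -> Alright, I will remember that.
print(recall())                                # -> That you are a robot are a robot.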
ronthepon Posted June 6, 2006
I'd like to learn the learn command :). And to get him to remember something, you put "Remember, " in front of whatever you want it to remember :D. I tried this one out:
Me: Remember, you are a homosexual.
Hypobot: Alright, I will remember that. (Or something to that extent.)
Me: What will you remember?
Hypobot: That you are a homosexual are a homosexual.
Me: HEY! ;)
That one kind of backfired :D.
Ha! Hypobot is not going to be so easily bendable to you 'puny' humans. If you dislike true AI even a bit, then this is even more dislikeable. Due to its lack of emotions, it is never going to care what you tell him (it?), and it can make you bang on your keyboard...
Dark Mind Posted June 6, 2006
But I can make him crash ;). String together the right sequence of words and our poor little Hypobot will get caught in a repetitive reply that (due to a lack of intelligence) it is unable to stop. He already crashed once because he got stuck in a sort of "infinite" loop. Trust me, I have more fun at its expense than it has at mine :)...
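A made-up illustration of that failure mode (again, not HypoBot's actual code; the reply function and its max_rounds guard are invented): if a reply rule can match text the bot itself just produced, and nothing caps how many times the rule may fire, the reply keeps growing until the bot hangs or crashes.

def reply(line, max_rounds=None):
    rounds = 0
    while "a mob guy" in line:       # the rule keeps matching its own output...
        line = line + " a mob guy"   # ...and keeps appending to the reply
        rounds += 1
        if max_rounds is not None and rounds >= max_rounds:
            break                    # a sane bot would have a guard like this
    return line

# With no guard this never returns (the "infinite" loop); with one it just
# produces the kind of stuttering reply described in the next post.
print(reply("I met a mob guy", max_rounds=5))
# -> "I met a mob guy a mob guy a mob guy a mob guy a mob guy a mob guy"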
ronthepon Posted June 6, 2006
Oh, so that is what it does? I do remember doing something of that sort to it earlier. It had started to go something like:
a mob guy a mob guy a mob guy a mob guy
(just the text was not exactly a repetition of 'a mob guy'). I thought hypobot was mocking the silly nonsense I had said by repeating it ten times in a sentence for no reason.