Forerunner52 Posted May 17, 2016

Hello, this is my first time on this site. I'm a freshman in high school right now, and I'm wondering if we could simulate opening and closing things in VR games by making an exo suit like the one in COD: Advanced Warfare, but with the computer chips in the suit made to correspond to the game. If we did that, we could potentially get a lot closer to FullDive tech than we think.

(PS: If we made the computer chips in the suit act like they do in the game, we could also add a helmet that sends sensory signals into the spine or the brain to simulate the action inside the game itself.)
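Concretely, "making the chips in the suit correspond to the game" could start as a simple event-to-actuator mapping. The sketch below is a minimal illustration only, assuming a game that emits named events and a suit with per-joint resistance control; every class, event, and joint name here is invented for the example, not taken from any real hardware or engine.

```python
# Hypothetical sketch: mapping in-game events to exo-suit actuator commands.
# All class, event, and joint names here are made up for illustration.

from dataclasses import dataclass

@dataclass
class ActuatorCommand:
    joint: str         # e.g. "right_elbow"
    resistance: float  # 0.0 = free movement, 1.0 = fully locked

# Table mapping game events to the physical feedback the suit should produce.
EVENT_MAP = {
    "door_push":  [ActuatorCommand("right_elbow", 0.6),
                   ActuatorCommand("right_shoulder", 0.4)],
    "chest_open": [ActuatorCommand("left_wrist", 0.3)],
    "wall_block": [ActuatorCommand("right_elbow", 1.0)],  # arm can't move further
}

def handle_game_event(event_name: str, send_to_suit) -> None:
    """Look up the event and forward each actuator command to the suit."""
    for cmd in EVENT_MAP.get(event_name, []):
        send_to_suit(cmd)

if __name__ == "__main__":
    # Stand-in for the real link to the suit's microcontrollers: just print.
    handle_game_event("door_push", send_to_suit=print)
```

In a real rig, send_to_suit would be the serial or network link to the suit's microcontrollers; here it is just print so the sketch runs on its own.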
weamy Posted May 17, 2016 (edited)

Hello, this is my first time on this site. I'm a freshman in high school right now, and I'm wondering if we could simulate opening and closing things in VR games by making an exo suit like the one in COD: Advanced Warfare, but with the computer chips in the suit made to correspond to the game. If we did that, we could potentially get a lot closer to FullDive tech than we think. (PS: If we made the computer chips in the suit act like they do in the game, we could also add a helmet that sends sensory signals into the spine or the brain to simulate the action inside the game itself.)

What do you mean when you say "opening and closing things"? Do you mean opening things like boxes? Because that would just be part of the program, surely?

Regarding exoskeletons, I suggest you take a look at this:

Edited May 17, 2016 by weamy
CaelesMessorem Posted May 21, 2016

Do more research and create the device before making the game. If you do a bit of research, you'll see that there is literally no progress in this field as of now, making the rest pretty much unusable.

I actually disagree with this bit. It's definitely true that more research needs to be done, but imagine for a second that we managed to make the device and began moving on to the game. You'd hit a wall: all this effort was put into making the device while the gaming aspect was left behind. I've mentioned in a few other threads that people need to look into the gaming side as well while the VR research is being conducted. The closest things we have to SAO would be Minecraft, for the sheer size and procedural generation of its map, and Bethesda's The Elder Scrolls series, for the immersive open-world experience (not taking into consideration all the immersive survival games). There is a lot that could be created and worked on now, and improved upon until the time when our FDVR device is created.

This now more than a year old thread is titled "How Close Are We To The Game Of The Anime Sword Art Online And The Technology?", though, and there's more to achieving something like the game depicted in the SAO light novels, manga, and anime than building a near-perfect BCI. The game depicted in these stories features computer-controlled characters ("AIs") much more realistic than any in existence now, easily able to pass the hardest Turing test. The fictional game is described as being almost entirely procedurally generated: the detailed design of its virtual world, not just scenery and characters but themes and plots, is created not by humans but by computer programs far better than any present-day program. These features would be revolutionary even in present-day, non-VR video games. These programming challenges can be worked on now, without waiting for enablement from fundamental science, and arguably address more profound questions than the VR or neurological ones.

Precisely. There are plenty of things that can be worked on now in preparation for FDVR: improving game engines to get graphics as close to hyper-realism as possible and achieve little to no latency, developing solid or revolutionary ideas in game progression and storytelling, and improving the hardware so the software can be developed further. The list goes on, and that is just the gaming aspect. There are also the media, educational, and medical aspects, among others.
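As a small illustration of the gameplay-side work that can be done today, here is a minimal sketch of deterministic procedural terrain in the Minecraft sense: the same seed and coordinates always produce the same heights, so the world never has to be stored in full. The hash-based "noise" and all function names are stand-ins chosen for brevity; real engines use smoother noise such as Perlin or simplex.

```python
# Minimal sketch of deterministic procedural terrain: the same seed and
# coordinates always give the same height, so the world never needs to be
# stored in full. Real engines use smoother noise (e.g. Perlin/simplex);
# this hash-based version just shows the idea.

import hashlib

def height_at(seed: int, x: int, z: int, max_height: int = 64) -> int:
    """Return a repeatable terrain height for the column at (x, z)."""
    digest = hashlib.sha256(f"{seed}:{x}:{z}".encode()).digest()
    return digest[0] * max_height // 255

def chunk(seed: int, cx: int, cz: int, size: int = 4):
    """Generate a size x size grid of heights for the chunk at (cx, cz)."""
    return [[height_at(seed, cx * size + x, cz * size + z)
             for x in range(size)]
            for z in range(size)]

if __name__ == "__main__":
    for row in chunk(seed=42, cx=0, cz=0):
        print(row)
```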
xTcHero Posted May 22, 2016

I see I phrased my sentence a bit wrong. I did not mean that you should not start developing or planning the game itself, but rather the use of FDVR technology in games. It would be great to have a near-perfect AI, open-world, self-generating game by launch, and we even have the technology to do so. But we won't benefit from jumping straight into VR with the BCI and CBI we have as of now, as there is too little research in this field to be able to take advantage of it. Honestly, if you build a BCI engine right now, it will be too different from the engines of the future, which means huge revisions will have to be made when the technology actually comes out.
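One practical reading of that point: whatever game-side code gets written now should keep the game logic behind an abstraction layer, so that a future BCI could be dropped in without rewriting the game. A hedged sketch of that design choice follows, with every interface and class name invented for illustration; it is not a real API.

```python
# Sketch of a hardware abstraction layer: the game only talks to the
# InputDevice interface, so a keyboard today and a BCI later are
# interchangeable. All names here are illustrative, not a real API.

from abc import ABC, abstractmethod

class InputDevice(ABC):
    @abstractmethod
    def read_intent(self) -> str:
        """Return the player's current intended action, e.g. 'move_forward'."""

class KeyboardInput(InputDevice):
    def read_intent(self) -> str:
        return "move_forward"  # stand-in for real key polling

class FutureBCIInput(InputDevice):
    def read_intent(self) -> str:
        # Would decode neural signals here; no such decoder exists today.
        raise NotImplementedError("BCI decoding not available yet")

def game_tick(device: InputDevice) -> None:
    """One step of the game loop, driven only through the interface."""
    intent = device.read_intent()
    print(f"applying action: {intent}")

if __name__ == "__main__":
    game_tick(KeyboardInput())  # the same loop would accept FutureBCIInput
```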
Kelton995068 Posted December 13, 2016

As far as I know, no one has thought about this yet. What if we did some brain scans to see what parts of the brain are active while sleeping? The idea I came up with is based on this: when you dream, everything seems real. Sight, smell, touch, even gravity. Knowing this, I thought to myself, "How could we tap into our brains while we are sleeping?" If you think about it, everything is controlled by neurons in the brain: pulses that the nerves and the brain send and receive 24/7. And we already know the human mind can be manipulated to the extent that someone could be hypnotized, or even possessed. So why couldn't we find a way to slip all the information from the game's programming through the neurons in the brain, projecting the game into your head while your subconscious makes it seem like just a dream?

Now, connecting every individual brain together... that could be the hardest part. I've theorized that we could just have every system set up to connect to one enormous server. I don't know, it's just something to ponder. Let me know what you all think about this, and email me back what you think about my theory.
billvon Posted December 13, 2016

As far as I know, no one has thought about this yet. What if we did some brain scans to see what parts of the brain are active while sleeping? The idea I came up with is based on this: when you dream, everything seems real. Sight, smell, touch, even gravity. Knowing this, I thought to myself, "How could we tap into our brains while we are sleeping?" If you think about it, everything is controlled by neurons in the brain: pulses that the nerves and the brain send and receive 24/7. And we already know the human mind can be manipulated to the extent that someone could be hypnotized, or even possessed. So why couldn't we find a way to slip all the information from the game's programming through the neurons in the brain, projecting the game into your head while your subconscious makes it seem like just a dream? Now, connecting every individual brain together... that could be the hardest part. I've theorized that we could just have every system set up to connect to one enormous server. I don't know, it's just something to ponder. Let me know what you all think about this, and email me back what you think about my theory.

Because:

1) "Brain scans" (fMRI, DOT, EROS, MEG, PET, etc.) are not able to see very much detail, just which large sections of the brain are "lit up" during a given mental state.

2) "Brain scans" cannot stimulate neurons.
CraigD Posted December 14, 2016

Welcome to hypography, Kelton! :) Please feel free to start a topic in the introductions forum to tell us something about yourself.

Projecting the game into your head while your subconscious makes it seem like just a dream. Now, connecting every individual brain together...

I've long wondered whether a "shared dream" system like this is possible, and had some discussion of it a couple of years ago, beginning with this post:

Thinking about the similarities between dreaming and computer-generated VR leads me to wonder if a system like SAO's might be achieved not by generating a detailed, shared 3D graphical world in a computer and interfacing users with it, but by injecting "synchronizing" and communicating signals into people's dreams; that is, rather than a shared computer-simulated world, an artificially "shared dream" might be the model.

I find the idea attractive because, except for the rare people who don't have memorable dreams, we humans have been capable, for as long as there has been recorded history, of experiencing intensely immersive virtual reality in the form of dreams.

One key limitation is that, without credible reports of an exception, dreams appear to be innately "single-player games". Whether it's possible to read and interactively "nudge" dreaming brains to produce a shared, interactive experience is uncertain.

Another possible limitation is whether people would be satisfied with a dream rather than a waking VR experience. Dreams are usually more difficult to remember than waking life, so it follows that the VR game experiences of a dreaming brain would be more difficult to remember than those of an awake one.
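Whichever model wins out, a shared dream or a conventional shared simulation, the "connect everyone to one enormous server" part of Kelton's idea reduces to a familiar synchronization pattern: each client pushes its local events and pulls everything it has not yet seen. Below is a minimal in-memory sketch of that pattern only; all names are invented, there is no networking, and it makes no claim about how a brain interface would actually feed it.

```python
# Minimal in-memory sketch of the "one enormous server" idea: each player
# pushes its local events, and everyone pulls the merged stream so their
# experiences stay in sync. A real system would put sockets around this.

from collections import defaultdict

class SyncServer:
    def __init__(self):
        self.world_events = []            # ordered, shared event history
        self.cursors = defaultdict(int)   # how far each player has read

    def push(self, player: str, event: str) -> None:
        """A player reports something that happened on their side."""
        self.world_events.append((player, event))

    def pull(self, player: str):
        """Return every event this player hasn't seen yet."""
        start = self.cursors[player]
        self.cursors[player] = len(self.world_events)
        return self.world_events[start:]

if __name__ == "__main__":
    server = SyncServer()
    server.push("alice", "opened the gate")
    server.push("bob", "lit a torch")
    print(server.pull("alice"))   # alice sees both events, including her own
```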