From 'The Sims' to (Virtual) Reality
- teejaydub
- Jun 3, 2019
- 6 min read
Y'know what I think is really neat? The concept of living in a tiny home. There are some who don't share that sentiment (namely, my SO), but the debater in me feels the need to put forth my best effort to make a case for tiny homes. What better way to show someone what living in a small abode might be like than creating one in virtual reality to explore?
If you're distilling from the opening statement that I have seen this project through to the end in order to prove a point in an argument, you're not wrong.
Let's make a simulator for mobile 6DOF VR to allow us to walk around a fully realized tiny house of my own design.
It's not (strictly) playing dollhouse
The Sims 3, now a decade old, has always had a pretty comprehensive set of house-building and customisation tools. Don't get me wrong, I'm not about to say that you can realistically (or safely) build a house from a Sims 3 design, but it is robust enough to keep up with your imagination and let you explore unique designs with ease. This is in comparison to more clinical and decidedly less fun tools such as Google SketchUp.
After some final decorations in the Sims house, phase one was complete - it was time to take this design to the next level. Sweet Home 3D is a free piece of software to design a house by arranging spaces and furniture on a 2D blueprint, and offers the ability to export the house to a 3D model file (.obj). It fit the two keyword phrases I searched for in house designing software: "free" and "3d model export also free".

This was the point in the project where I had The Sims 3 taking up one half of the screen and Sweet Home 3D in the other, attempting to perfectly line up the designs. The forum consensus was that each block in the Sims is approximately 3x3 feet. So don't feel guilty about trapping your Sim in one tile - a nine-square-foot space is plenty of room.

Based on that tile size assumption, the dimensions of the house are approximately 21'x27'. Adding up the square footage of the first and second floor, we get almost exactly 1000sqft. Small house - not even tiny.
Imports and exports
This is about the time that the first roadblock came up. Exporting to an .obj file worked fine, but upon importing the file into the Unity game engine, I was unable to apply my own textures/materials to any of the objects in the house. Or, rather, only the color would transfer, with no pattern. Turns out, this has to do with a limitation of Sweet Home 3D. At this point I had to open Blender and import the .obj file, which triggered the generation of UV maps for the meshes. Upon exporting from Blender as .fbx, my texture woes were over (no they weren't, they were very far from over).
After fiddling with the import settings of the newly generated .fbx file, I was presented with some hideous-ass house that I didn't want to be associated with.

After gathering many textures from definitely-copyrighted sources on the internet, I attempted to improve the look of the scene. No amount of texturing can save an unlit scene, though. Cue me, firing up two hours of lighting workshops on YouTube.

To get the best chance at creating a convincing environment within the limits of mobile VR (remember, the final goal is to be able to walk around the house on the Oculus Quest), I had to download the latest version of Unity - 2019.1.4. This build features some new baked lighting tools that would come in handy - namely, the GPU lightmapper, which significantly decreases bake times.
Baking is a science
Seeing as this was my first attempt at creating a quasi-realistic interior in Unity, I'm gonna need you to set your expectations reaaaaal low. A li'l lower. Okay, that's good.
Rendering real-time lights in a scene is muy expensivo for processors/GPUs, so we've opted for the alternative - baking the lighting into the scene. My computer will pre-generate "lightmaps" for all of the objects in the scene given fixed light sources, so the shadows and lighting effects are "baked" into the objects. This is conventionally used when objects in the scene aren't moving. Fortunately for me, I don't have any plans for motorized furniture or walls in this house*, so things will stay put.
After setting up a few area lights, a directional light, and some point lights in the smaller rooms, I hit the magical "Generate Lighting" button and wait. And wait some more. The computer shoots out virtual rays of light that bounce around and report back on how each object should be shaded given the static light sources. It's not nearly as lengthy or in-depth as actual ray-tracing, but it can create some nice-looking environments.
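If you'd rather not babysit the Lighting window, the same bake can be kicked off from an editor script. Treat this as a minimal sketch against Unity 2019.1's editor API, not the exact settings from this project - the menu path and atlas size are arbitrary placeholder choices:

```csharp
using UnityEditor;
using UnityEngine;

// Editor-only helper: pick the GPU lightmapper, flag renderers as static, and bake.
public static class BakeHelper
{
    [MenuItem("Tools/Bake House Lighting")]
    public static void Bake()
    {
        // Use the Progressive GPU lightmapper (new-ish in 2019.1) instead of the CPU one.
        LightmapEditorSettings.lightmapper = LightmapEditorSettings.Lightmapper.ProgressiveGPU;

        // Keep lightmap atlases modest; mobile VR can't afford huge ones.
        LightmapEditorSettings.maxAtlasSize = 1024;

        // Anything that should receive baked light has to be flagged lightmap-static.
        foreach (var renderer in Object.FindObjectsOfType<MeshRenderer>())
        {
            var flags = GameObjectUtility.GetStaticEditorFlags(renderer.gameObject)
                        | StaticEditorFlags.LightmapStatic;
            GameObjectUtility.SetStaticEditorFlags(renderer.gameObject, flags);
        }

        // Same as pressing "Generate Lighting", just scriptable.
        Lightmapping.BakeAsync();
    }
}
```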

The output is hit or miss. I'm getting a tremendous amount of "artifacting" on walls and other lightly shaded materials, as well as warnings in the console about overlapping UV maps. This is supposedly solved by increasing the packing distance on the lightmap UVs, but I wasn't able to improve it much. I was also limited to a smaller lightmap size due to the limits of mobile VR processing power. Had I been able to use HDR-enabled skyboxes, higher-res lightmaps, and other features of the HDRP, I'd have been able to get better results. I'm still going to experiment with some low-cost post-processing effects to see if I can improve the look.
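For reference, the knob that warning is talking about can also be turned from an editor script, assuming you let Unity regenerate the lightmap UVs at import time. The asset path and margin value below are placeholders, not the project's real settings:

```csharp
using UnityEditor;

// A sketch of one fix for overlapping-UV warnings: regenerate the lightmap UVs
// at import time with extra padding between charts.
public static class LightmapUVFixer
{
    [MenuItem("Tools/Regenerate Lightmap UVs")]
    public static void Regenerate()
    {
        var importer = AssetImporter.GetAtPath("Assets/Models/house.fbx") as ModelImporter;
        if (importer == null) return;

        importer.generateSecondaryUV = true;   // build a dedicated UV2 channel for lightmaps
        importer.secondaryUVPackMargin = 8f;   // default is 4; more margin = less bleeding
        importer.SaveAndReimport();
    }
}
```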
Going on a Quest
The next step in this project is to get this home inside the VR headset! It was a stupidly long but not difficult process, detailed in bullet points below.
- Install this new "Unity Hub" program to add the Android SDK to Unity 2019
- Discover that Unity Hub will only let you add a feature to a version of Unity that was installed through the hub
- Change plans and download Android Studio
- Point Unity to the directory where the SDK should be, but find that it is not there
- Realize that downloading and installing Android Studio isn't enough - you must also open it and choose an SDK version to download
- Point Unity to the directory where the SDK is actually installed
- Register your company name in the player settings (teejaydubproductions whaddap)
- Download the Oculus Integration package from the Asset Store and import too many things
- Enable the mobile VR checkbox in the Android player settings
- Import the VR player controller prefab into the scene
- Hit play, watch the player fall through the floor infinitely into the abyss
- Add a collider to the floor of the house
- Test it out on the Quest and realize you are the height of an ant
- Scale down the house
- The TV was too far away in VR, so I coded a quick "motorized" projector screen which solves the problem for me! (a rough sketch follows this list)
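The screen script really doesn't need to be anything fancy - something along these lines gets the job done. This is a minimal sketch of the idea, not the exact script from the project; the field names and the idea of calling Toggle() from a button or trigger are placeholder assumptions:

```csharp
using UnityEngine;

// "Motorized" projector screen sketch: slide the screen between a raised and a
// lowered position at a fixed speed.
public class ProjectorScreen : MonoBehaviour
{
    public Vector3 loweredLocalPosition;  // where the screen sits when deployed
    public float speed = 0.5f;            // metres per second

    private Vector3 raisedLocalPosition;
    private bool lowered;

    void Start()
    {
        // Remember where the screen starts so it can roll back up.
        raisedLocalPosition = transform.localPosition;
    }

    public void Toggle()
    {
        lowered = !lowered;
    }

    void Update()
    {
        // Glide toward whichever endpoint is currently selected.
        Vector3 target = lowered ? loweredLocalPosition : raisedLocalPosition;
        transform.localPosition = Vector3.MoveTowards(
            transform.localPosition, target, speed * Time.deltaTime);
    }
}
```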
I left out some other li'l development tidbits but they weren't too interesting. There is a useful tool for simply copying the finished .apk file straight from the PC and installing it on the Quest. Just make sure you're a developer on Oculus' website.
This was a fun project that took a couple of days to complete. I can tell you sincerely that seeing a living space in VR is totally different from dragging a 3D model of a room around a screen. Things become so much more obviously wrong in VR, so it really does change the design process. I'm already remodeling, improving, and iterating on this design. Thanks for reading!
Bonus chapter: My woes with recording video from the Quest
Conveniently (or so I thought), you are able to record video of gameplay directly from the Oculus Quest. To share it, though, you must hook it up to a PC and copy it off of the internal storage.
Upon transferring the files, I discovered that I could not play the videos, which had a decent file size but a duration of zero seconds. Couldn't convert them, so I tried repairing them. I was led to a guide that had instructions to repair .mp4 files using ffmpeg.exe and another .exe. I was able to repair the files by using the command prompt and following along with the helpful instructions. For the first video.
I took two more videos, and the same symptoms occurred, but the solution was no longer, well, a solution. I could only get halfway through the repair process, which left me with a .h264 file and an .aac file. I couldn't upload the .h264 file to YouTube, so I found an online converter that gave me a soundless .mp4 file from the .h264 file. I could upload these newly converted .mp4 files to YouTube, but sharing them just led to a "video unavailable" message, and Google Photos wouldn't take them at all. Even though they were back in .mp4 form with a viable duration, they still weren't usable anywhere.
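In hindsight, the online-converter detour probably wasn't necessary: ffmpeg can mux a bare .h264 stream and an .aac track straight back into an .mp4, sound included. Something like the line below should do it - the filenames are placeholders, and the framerate needs to match whatever the Quest actually recorded at, which I won't swear to:

```
ffmpeg -framerate 60 -i video.h264 -i audio.aac -c copy -bsf:a aac_adtstoasc repaired.mp4
```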
The strangest savior revealed itself - Windows Movie Maker. It was able to import these converted .mp4 files and I could re-export them successfully. Made it to YouTube, too. So thank y- I mean fuck you microsoft you are still trash.