A strange thing is happening: there are people, groups of people even, walking the streets day and night staring wide-eyed at their mobile phones and laughing like manic children. What are these people doing? Are they taking pictures? Are they participating in some new social media craze? Is their activity an omen that the zombie apocalypse is upon us?
In a previous tutorial, we were able to measure vertical surfaces such as walls, books, and monitors using ARKit 1.5. With the advent of vertical plane anchors, we can now also attach objects onto these vertical walls.
In a previous tutorial, we were able to measure horizontal surfaces such as the ground, tables, etc., all using ARKit. With ARKit 1.5, we're now able to measure vertical surfaces like walls!
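To give a concrete sense of what changed in ARKit 1.5, here's a minimal sketch of turning on vertical plane detection and receiving a detected wall in the SceneKit delegate. It assumes a standard SceneKit-based ARKit project with an ARSCNView outlet named sceneView; the class name is just a placeholder.

```swift
import UIKit
import ARKit

class WallDetectionViewController: UIViewController, ARSCNViewDelegate {
    // Assumes an ARSCNView named sceneView is wired up in the storyboard.
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        // ARKit 1.5 (iOS 11.3) adds .vertical alongside the original .horizontal option.
        configuration.planeDetection = [.vertical]
        sceneView.delegate = self
        sceneView.session.run(configuration)
    }

    // Called whenever ARKit anchors a newly detected plane.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor,
              planeAnchor.alignment == .vertical else { return }
        // Any SCNNode added as a child of `node` here stays attached to the wall.
        print("Vertical plane detected with extent: \(planeAnchor.extent)")
    }
}
```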
Have you ever seen pictures or videos of balloons being let go into the sky and randomly floating away in all directions? It's something you often see in classic posters or movies. Well, guess what? Now you'll be able to do that without having to buy hundreds of balloons; all you'll need is ARKit!
In our last ARKit tutorial, we learned how to measure the sizes of horizontal planes. It was a helpful entryway into determining spatial relationships between real-world spaces and virtual objects and experiences.
Have you noticed the many utility ARKit apps on the App Store that allow you to measure the sizes of horizontal planes in the world? Guess what? After this tutorial, you'll be able to do this yourself!
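As a rough preview of how those measuring apps work, the sketch below reads the estimated size of a detected plane straight off its ARPlaneAnchor. The extent values are in meters and are refined as ARKit scans more of the surface; the helper name reportSize is purely for illustration.

```swift
import ARKit

// A minimal sketch: reading the estimated size of a detected plane.
// ARKit reports a plane's bounding extent (in meters) on its ARPlaneAnchor,
// typically delivered via the ARSCNViewDelegate didAdd/didUpdate callbacks.
func reportSize(of anchor: ARPlaneAnchor) {
    let widthInMeters = anchor.extent.x
    let lengthInMeters = anchor.extent.z
    print(String(format: "Detected plane: %.2f m x %.2f m", widthInMeters, lengthInMeters))
}
```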
Have you been noticing SpaceX and its launches lately? Ever imagined how it would feel to launch your own rocket into the sky? Well, imagine no longer!
Hello, budding augmented reality developers! My name is Ambuj, and I'll be introducing all of you Next Reality readers to the world of ARKit, as I'm developing an ARKit 101 series on using ARKit to create augmented reality apps for iPad and iPhone. My background is in software engineering, and I've been working on iOS apps for the past three years.
Ever notice how some augmented reality apps can pin specific 3D objects on the ground? Many AR games and apps can accurately plant various 3D characters and objects on the ground in such a way that, when we look down upon them, the objects appear to be entirely pinned to the ground in the real world. If we move our smartphone around and come back to those spots, they're still there.
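That pinning behavior comes from anchoring content to a plane ARKit has detected. The sketch below shows one illustrative way to do it, assuming an already-running ARSCNView named sceneView and a tap location passed in as screenPoint; it is not any particular app's implementation.

```swift
import UIKit
import ARKit

// A rough illustration of the "pinning" described above.
func placeObject(at screenPoint: CGPoint, in sceneView: ARSCNView) {
    // Hit-test the tap location against planes ARKit has already detected.
    let results = sceneView.hitTest(screenPoint, types: .existingPlaneUsingExtent)
    guard let hit = results.first else { return }

    // Anchoring at the hit transform is what keeps content fixed in the real
    // world even after the camera moves away and comes back to the same spot.
    let anchor = ARAnchor(transform: hit.worldTransform)
    sceneView.session.add(anchor: anchor)
}
```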
If you've contemplated what's possible with augmented reality on mobile devices, and your interest has been piqued enough to start building your own Android-based AR app, then this is a great place to acquire the basic beginner skills to complete it. Once we get everything installed, we'll create a simple project that allows us to detect surfaces and place custom objects on those surfaces.
Now that ARCore is out of its developer preview, it's time to get cracking on building augmented reality apps for the supported selection of Android phones available. Since Google's ARCore 1.0 is fairly new, there's not a lot of information out there for developers yet — but we're about to alleviate that.
There are hundreds, if not thousands, of programming languages and variations of those languages. Currently, in the augmented reality space, it seems the Microsoft-created C# has won out as the overall top language of choice. While there are other options like JavaScript and C++, to name a few, C# seems to be the most worthwhile place to invest one's time and effort.
One of the primary factors that separates an augmented reality device from a standard heads-up display such as Google Glass is dimensional depth perception. This can be created by RGB cameras, infrared depth cameras, or a combination of both, depending on the level of accuracy you're aiming for.
With the software installation out of the way, it's time to build the framework within which to work when building an augmented reality app for Android devices.
In this first part of my series on getting started with Windows Holographic, we are going to cover everything you need to get set up for developing HoloLens apps. There are many pieces that come together to make a single application, but once you get used to them all, you won't even notice. Now there are different approaches you can take to make applications for HoloLens, but this way is simply the fastest.
Continuing our series on building a dynamic user interface for the HoloLens, this guide will show how to rotate the objects that we created, moved, and scaled in previous lessons.
An incorrectly scaled object in your HoloLens app can make or break your project, so it's important to get scaling in Unity down, such as working with uniform and non-uniform factors, before moving on to other aspects of your app.
So after setting everything up, creating the system, working with focus and gaze, creating our bounding box and UI elements, unlocking the menu movement, as well as jumping through hoops refactoring a few parts of the system itself, we have finally made it to the point in our series on dynamic user interfaces for HoloLens where we get some real interaction.
The reveal of Apple's new ARKit extensions for iPhones and iPads, while not much of a shock, did bring with it one big surprise. By finding a solution to surface detection without the use of additional external sensors, Apple just took a big step over many — though not all — solutions and platforms currently available for mobile AR.
Many developers, myself included, use Unity for 3D application development as well as making games. There are many who mistakenly believe Unity to be only a game engine. And that, of course, is how it started. But we now live in a world where our applications have a new level of depth.
As a developer, before you can make augmented-reality robots that move around in the real world, controlled by a user's finger, you first need to learn how to harness the basics of designing AR software for a touchscreen interface.
Introduced along with the iPhone X, Animoji are animated characters, mostly animals, that mirror the user's facial expressions, using the device's TrueDepth camera system to track facial movements.
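Developers can tap into the same tracking through ARKit's face tracking APIs. The snippet below is a minimal sketch, not Apple's Animoji implementation: it shows how a TrueDepth-equipped device reports normalized expression values (blend shapes) that an animated character could be driven by. The class and function names are placeholders.

```swift
import ARKit

// A minimal sketch of the face tracking that powers Animoji-style effects,
// using ARKit's ARFaceTrackingConfiguration (TrueDepth devices only).
class FaceTrackingDelegate: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // Blend shapes are 0...1 values for individual expressions, e.g. how open the jaw is.
            if let jawOpen = faceAnchor.blendShapes[.jawOpen] as? Float {
                print("Jaw open: \(jawOpen)")
            }
        }
    }
}

func startFaceTracking(session: ARSession, delegate: FaceTrackingDelegate) {
    // Face tracking is only supported on devices with a TrueDepth camera.
    guard ARFaceTrackingConfiguration.isSupported else { return }
    session.delegate = delegate
    session.run(ARFaceTrackingConfiguration())
}
```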
Just in time for the holiday season, Lenovo has released its Mirage AR head-mounted display with the Star Wars: Jedi Challenges game and accessories. Unfortunately, while its price point is a fraction of most other AR headsets, at the moment, it does have a few issues with the setup process.
Now that we have installed the toolkit, set up our prefabs, and prepared Unity for export to HoloLens, we can proceed with the fun stuff involved in building a dynamic user interface. In this section, we will build the system manager.
So after being teased last Christmas with an email promising that the Meta 2 was shipping, nearly a year later, we finally have one of the units that we ordered. Without a moment's hesitation, I tore the package open, set the device up, and started working with it.
This is a very exciting time for mixed reality developers and fans alike. In 2017, we have seen a constant stream of new hardware and software releases hitting the virtual shelves. And while most of them have been in the form of developer kits, they bring with them hope and the potential promise of amazing things in the future.
The release of Unity 5.6 brought with it several great enhancements. One of those enhancements is the new Video Player component. This addition allows for adding videos to your scenes quickly and with plenty of flexibility. Whether you are looking to simply add a video to a plane, or get creative and build a world layered with videos on 3D objects, Unity 5.6 has your back.
Being part of the wild frontier is amazing. It doesn't take much to blow the minds of first-time mixed reality users — merely placing a canned hologram in the room is enough. However, once that childlike wonder fades, we need to add more substance to create lasting impressions.
One of the truly beautiful things about the HoloLens is its completely untethered, the-world-is-your-oyster freedom. This, paired with the ability to view your real surroundings while wearing the device, allows for some incredibly interesting uses. One particular use is triggering events when a user enters a specific location in a physical space. Think of it as a futuristic automatic door.
When making a convincing mixed reality experience, audio consideration is a must. Great audio can transport the HoloLens wearer to another place or time, help navigate 3D interfaces, or blur the lines of what is real and what is a hologram. Using a location-based trigger (hotspot), we will dial up a fun example of how well spatial sound works with the HoloLens.
Many new developers are diving right into the Microsoft HoloLens, but augmented and mixed reality are fairly big subjects in terms of learning. There's a lot to cover and, unfortunately, very few places for someone brand new to Windows Holographic to begin lessons.
Soon, users will no longer need an expensive headset or even a smartphone to experience mixed reality. The new Microsoft update will be bringing mixed reality applications to every Windows computer next month. This new upgrade to Windows 10 is named the Windows 10 Creators Update.
Welcome back to this series on making physical objects come to life on HoloLens with Vuforia. Now that we've set up Vuforia and readied our ImageTarget and camera system, we can see our work come to life. Because in the end, is that not one of the main driving forces when developing—that Frankenstein-like sensation of bringing something to life that was not there before?
Now that we've set up Vuforia in Unity, we can work on the more exciting aspects of making physical objects come to life on the HoloLens. In this guide, we will choose an image (something that you physically have in your home), build our ImageTarget database, and then set up our Unity camera to be able to recognize the chosen image so that it can overlay the 3D holographic effect on top of it.
With any continuously updated software, things can start to become fairly complex after a few years. New features and revisions get layered into a thick mesh of menu systems and controls that can bewilder even pro users. If you are new to a certain application after it has been around for many years, it can be downright intimidating to know where to begin.
The HoloToolkit offers a great many simple ways to add what seem like extremely complex features to the HoloLens, but it can be a bit tricky if you're new to Windows Holographic. So this will be the first in an ongoing series designed to help new developers understand what exactly we can do with the HoloLens, and we'll start with voice commands.
Now that we've got all of our software installed, we're going to proceed with the next step in our HoloLens Dev 101 series—starting a fresh project and building it into a Holographic application. Then we will output the application to the HoloLens Emulator so we can see it in action.
Don't let the lack of owning a HoloLens stop you from joining in on the fun of creating software in this exciting new space. The HoloLens Emulator offers a solution for everyone that wants to explore Windows Holographic development.
In recent weeks, Unity has made a few great leaps forward for HoloLens development. These new features will speed up iteration inside Unity and quickly increase the output of applications in the mixed reality space. Of these new features, let's take some time to talk about Holographic Emulation and why it will do so much for the development community.
Microsoft's HoloLens has two gestures: bloom and air tap. While the two might not seem like much to learn, some people struggle with the air tap because the headset can be a bit particular. The easiest way to learn the proper form is to look through someone else's eyes while they do it, so we've captured that for you.
Microsoft's HoloLens comes with helpful features for capturing video and photos, but sharing whatever you record isn't as straightforward as you might expect. So here are the many ways to get your media off the device to share with the world.
To become a tried-and-true Pokémon master in Pokémon GO, there's an incredibly important decision that needs your attention: Team Instinct, Team Mystic, or Team Valor?
While wandering around in Pokémon GO, you'll occasionally see what appears to be leaves fluttering around nearby. This is actually meant to be Pokémon "rustling in the grass," but whatever the intention, it means that there may be a wild Pokémon in that area.
In Pokémon GO, having an in-depth understanding of your Pokémon's stats and abilities is crucially important to becoming a better player. Not all Pokémon are created equal; as such, it's critical that you look at each of your Pokémon—even duplicates—with a keen eye.
Finding Pokémon in the wild isn't the only way to add to your collection in Pokémon GO—you're also able to hatch your own from eggs that you've gotten from PokéStops.
Like previous installments in the Pokémon series, as you progress through Pokémon GO you'll be able to evolve your Pokémon into more-powerful monsters with new and more-damaging attacks. However, unlike older entries in the series, your Pokémon won't simply evolve when they reach a certain level. Instead, you'll have to "feed" them a certain amount of character-specific candy to induce the transformation.
If you want free Poké Balls and eggs when playing Pokémon GO, you can find them at PokéStops in various locations around your city, which are marked with towering blue icons on your map. Once you're at Level 5, they'll also grant you Potions and Revives to help you in your battles against other trainers, so they're definitely something you should be visiting whenever you can.
The only way to know which Pokémon are in your area in Pokémon GO is the cryptic "nearby" list, which sometimes doesn't work—and also doesn't tell you which direction to head off to hunt that Pokémon you're looking for.
When Microsoft released an update to the HoloLens Development Edition at the end of May, there were a bunch of cool new features added in. Among them: new voice controls that make working in the HoloLens operating system much easier.
If you're standing in a foreign city, surrounded by signage in a language you don't understand, you won't suddenly be able to read it. But with a clever feature in Google's Translate app, your smartphone can.
If you're in the market for a new tattoo, the biggest hurdle to clear is imagining exactly how it's going to look. It's going to be part of your identity for the rest of your life, so you have to make sure it looks just right—or as your mom probably told you, "Think of what it's going to look like when you're 60."
Aaron Betsky, director of the Cincinnati Art Museum and previous director of the Netherlands Institute of Architecture, reports on the world's first postage stamp to employ augmented reality. Dutch advertising agency Gummo, the NIA, and the Dutch postal service teamed up to present five unbuilt models by different Dutch architecture studios in 3D form. When the stamp is held in front of a webcam, the illusion of a 3D building is projected in your hand. By slowly moving the stamp, you can experience the...