First of all, ARCore is one of Google's projects that can not only entertain users but also genuinely impress them. ARCore underpins many augmented reality applications such as Google Lens and Playground, as well as third-party apps like IKEA Place and games such as Pokémon Go, and it now powers a new navigation mode in Google Maps. At last night's Google I/O event, the company announced several improvements that will make interacting with ARCore more realistic.
Let's start with improvements to the so-called Augmented Images, which allow developers to create AR applications that interact with specific 2D objects in the real world, such as posters, stickers, and product packaging.
ARCore now detects moving objects better, so virtual objects can be attached to them and interact with them much more easily.
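The idea of attaching a virtual object to a tracked 2D image can be sketched in a few lines. This is a toy illustration, not the ARCore API: it assumes the tracker reports the image's pose each frame, and the `Pose` class and function names here are invented for the example.

```python
import math

class Pose:
    """A simplified 2D pose: position (x, y) plus a heading in radians."""
    def __init__(self, x, y, heading):
        self.x, self.y, self.heading = x, y, heading

    def transform_point(self, dx, dy):
        """Map a point from the pose's local frame into world coordinates."""
        c, s = math.cos(self.heading), math.sin(self.heading)
        return (self.x + c * dx - s * dy,
                self.y + s * dx + c * dy)

def anchor_world_position(image_pose, offset):
    """World position of a virtual object anchored at a fixed `offset`
    relative to the tracked image's center."""
    return image_pose.transform_point(*offset)

# The tracked image moves and rotates between frames; the anchored
# virtual object follows it automatically.
frame1 = Pose(0.0, 0.0, 0.0)
frame2 = Pose(1.0, 2.0, math.pi / 2)  # image moved and turned 90 degrees
offset = (0.5, 0.0)                   # object sits 0.5 m to the image's right

print(anchor_world_position(frame1, offset))  # (0.5, 0.0)
print(anchor_world_position(frame2, offset))
```

The point of the sketch is that the app never repositions the object itself: it only stores the offset, and re-deriving the world position from the latest image pose is what keeps the object "glued" to a moving poster or package.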
Another improvement, called Light Estimation, concerns the lighting of 3D objects in augmented reality. ARCore now evaluates real-world lighting better, which helps it cast shadows on virtual objects. The API has been updated with a new Environmental HDR mode, which works like regular HDR in cameras or displays but is intended for 3D objects. It uses available data about the lighting of the real scene and applies that light to digital objects, which should help create more accurate shadows, highlights, and reflections.
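The core idea behind this kind of lighting can be shown with a minimal shading sketch. This is not the ARCore API: it only assumes that the system estimates a main light direction and intensity from the camera feed plus an ambient term, and all names here are illustrative.

```python
def normalize(v):
    """Scale a 3D vector to unit length."""
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def shade(normal, light_dir, light_intensity, ambient):
    """Lambertian diffuse shading: ambient term plus the main light's
    contribution. `light_dir` points from the surface toward the light."""
    n, l = normalize(normal), normalize(light_dir)
    n_dot_l = max(0.0, sum(a * b for a, b in zip(n, l)))
    return ambient + light_intensity * n_dot_l

# A surface facing straight up, lit by an overhead "sun" estimated from
# the real scene: full contribution from the main light.
print(shade((0, 1, 0), (0, 1, 0), 0.8, 0.2))  # 1.0

# The same surface when the estimated light sits at the horizon: only
# the ambient term remains, so the virtual object falls into shadow.
print(shade((0, 1, 0), (1, 0, 0), 0.8, 0.2))  # 0.2
```

Feeding the shader per-frame estimates of the real scene's light, rather than a fixed virtual light, is what makes virtual shadows and highlights move consistently with the room around them.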
Google has also made it easier for web designers to embed AR objects in their sites. When searching for an object, for example a beige sofa, in addition to the standard links in the search results you'll also see the option to place a 3D model of the sofa right in your living room and see how it looks against your wallpaper. The feature will arrive later this year and won't require any additional app: it works natively.
Share your opinion in the comments under this article or in our Telegram chat.