Tuesday 23 December 2014

Interview: What happens when augmented reality meets social media?


Taggar describes itself as the "world's first social augmented reality platform", which allows users to create hidden tags to share with friends and followers. According to the app's makers, everyday physical objects can take on a completely new life when scanned with Taggar.


We spoke to Taggar co-founder and CMO Charlotte Golunski about how the app works and why augmented reality is in danger of becoming the next QR code.


TechRadar Pro: How does Taggar work?


Charlotte Golunski: The app recognizes the objects that it is looking at, revealing content that has been secretly linked to the object. You can add your own tag to pretty much anything, leaving secret messages in the form of video, pictures and stickers. All tags are instantly stored in Taggar's cloud, so your creations can be seen by your whole social network right away.
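To make that workflow concrete, here is a minimal Python sketch of the kind of data model it implies: a tag links a recognisable object to a piece of content and is published to a shared store. The Tag and CloudStore names, their fields and the in-memory storage are illustrative assumptions for this article, not Taggar's actual API.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class Tag:
        """A piece of content pinned to a recognisable real-world object (hypothetical model)."""
        object_fingerprint: str   # e.g. a hash of the visual features of the scanned object
        content_url: str          # the video, picture or sticker the user attaches
        author: str
        created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    class CloudStore:
        """In-memory stand-in for the cloud store that makes tags visible to a user's network."""
        def __init__(self) -> None:
            self._tags_by_object: dict[str, list[Tag]] = {}

        def publish(self, tag: Tag) -> None:
            # Storing the tag is what makes it instantly visible to the author's followers.
            self._tags_by_object.setdefault(tag.object_fingerprint, []).append(tag)

        def tags_for(self, object_fingerprint: str) -> list[Tag]:
            # A scan resolves the physical object to a fingerprint, then looks up linked content.
            return self._tags_by_object.get(object_fingerprint, [])

    # Example: leave a sticker on a poster, then a friend scans the same poster.
    store = CloudStore()
    store.publish(Tag("poster:abc123", "https://example.com/sticker.png", author="charlotte"))
    print([t.content_url for t in store.tags_for("poster:abc123")])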


TRP: What is so special about it? Isn't it just another Blippar?


CG: Much of the first generation of AR has only been used for marketing purposes. Whilst older AR apps have done a fantastic job of bringing brands on board, people want more than just additional content from advertisers – this was one of the key hurdles in the adoption of QR codes.


In pioneering the next generation of AR – social augmented reality – Taggar is unique in allowing and encouraging users to add their own tags to the images and objects around them, actually interacting with the things they scan. Users can leave messages for their friends, post reviews or leave hidden AR comments tagged to the objects around them, building a social network that is grounded in AR.


TRP: What's the ultimate goal of the app? What do you want to achieve?


CG: In the near term, Taggar's key aim is to continue building the number of social connections within the app – ramping up both the number of users and the number of tags each user leaves. If you look at an app like Snapchat, its utility stems from its value as a social connector – if Team Snapchat had started with brands first, it would never have been able to build such a strong user base.


Indeed, user engagement with AR has been low because the priorities have been wrong. Instead of focusing on brands, Taggar is starting with people: developing a network of users who leave millions of virtual tags on the real-world objects around them.


TRP: Why have you partnered with Warner Music and Capitol Records? What are the benefits for record companies?


CG: Since the launch of Taggar in December 2013, we have partnered with a number of musicians to give fans access to unique content directly from the artists they care most about. This fits in with our plan to build Taggar as the world's first social network powered by AR – fans are able to connect with their favourite musicians in a completely new way, as well as engaging with the wider community of fellow fans through the app.


In terms of the benefits for record companies, Taggar provides a way to bridge the traditional "online/offline divide", tying digital content to the real-world physical objects that fans own, such as albums, posters and t-shirts.


TRP: Has the app been designed with the music industry in mind?


CG: Taggar is a brand new platform that can be used in a variety of different ways. Whilst we felt there was a great fit for music labels in terms of fan engagement, we didn't design the product specifically for that industry. Our value as a social connector makes us relevant to film, food, travel – a whole range of industries.


Additionally, the development of Taggar on wearable devices – we were the first AR platform to collaborate with Google Glass – will create all kinds of interactive content that users can see just by looking at the objects around them. Looking at a router, users will see an overlay of which cables to plug where, while glancing at a recipe book could trigger a video of the recipe in their field of vision whilst they cook.
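As an aside, the "look at an object, get an overlay" behaviour described here can be sketched in a few lines of Python. The recogniser and the overlay table below are toy placeholders standing in for a real computer-vision pipeline, not Taggar's implementation.

    # Toy mapping from a recognised object label to the overlay a wearer would see.
    OVERLAYS = {
        "router": "Highlight which cable plugs into which port",
        "recipe_book": "Play the step-by-step cooking video in the field of vision",
    }

    def recognise_frame(frame_bytes: bytes) -> str | None:
        """Stand-in for a vision model that maps a camera frame to an object label."""
        signatures = {b"ROUTER": "router", b"RECIPE": "recipe_book"}  # fake visual signatures
        return next((label for sig, label in signatures.items() if sig in frame_bytes), None)

    def overlay_for(frame_bytes: bytes) -> str | None:
        """Recognise the object in view, then look up any content linked to it."""
        label = recognise_frame(frame_bytes)
        return OVERLAYS.get(label) if label else None

    print(overlay_for(b"...ROUTER..."))  # -> "Highlight which cable plugs into which port"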


TRP: Is Taggar the next step in AR? What do you see as the next important developments in the field?


CG: One development that looks set to propel AR into the mainstream is the growth in wearable devices. Smartphones may be fantastic, but they introduce an unnatural division between us and the real world – pulling your phone out of your pocket every time you want to look something up, or entering text-based searches and receiving text-based answers: this just isn't how we're going to be interacting with information in the future.


As hardware becomes increasingly refined and computer vision algorithms grow more and more powerful, we'll soon see a world where wearables like smart watches and glasses become widespread. When that time comes we'll see an AR revolution which knits technology into our everyday lives – information will be overlaid directly onto our field of vision automatically, without the restrictions that a hand-held screen places on us.


TRP: What is Mike Lynch's involvement with the platform?


CG: Taggar was fortunate enough to receive investment from Mike Lynch's fund, Invoke Capital, which shares Taggar's vision for the future of AR. Mike acts as both investor and advisor to Taggar, whilst the rest of the management team at Invoke Capital play a very hands-on role in supporting our development. The Invoke team is made up of the people who took Autonomy from a small Cambridge start-up to a world-class software company – to have their constant advice and management support is every start-up's dream!


TRP: What's next for Taggar?


CG: We'll be seeing some exciting collaborations with music industry giants, as well as expanding Taggar's cultural reach into the art scene. In fact, we're going to be taking some inspiration from the art world, which Taggar users will see appearing in the app very soon!


On the hardware front, we're scaling up our work on wearables, combining multiple recognition algorithms to make wearables incredibly powerful, and providing devices with the ability to access our cloud so that they'll be able to recognize over a million objects in the world around them.
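For illustration, the "multiple recognition algorithms plus a cloud lookup" approach described here could look something like the Python sketch below: try cheap on-device recognisers first, then fall back to a cloud service that knows about far more objects. All of the recogniser functions are hypothetical stand-ins, not Taggar's real pipeline.

    from typing import Callable, Optional

    Recogniser = Callable[[bytes], Optional[str]]  # a camera frame in, an object ID (or None) out

    def marker_recogniser(frame: bytes) -> Optional[str]:
        # Fast on-device check for explicit markers.
        return "poster:abc123" if b"MARKER" in frame else None

    def feature_recogniser(frame: bytes) -> Optional[str]:
        # On-device matching against a small local set of known objects.
        return "album:xyz789" if b"ALBUM" in frame else None

    def cloud_recogniser(frame: bytes) -> Optional[str]:
        # Last resort: query a cloud index covering a million-plus objects (placeholder result).
        return "unknown-object"

    def recognise(frame: bytes, recognisers: list[Recogniser]) -> Optional[str]:
        """Run recognisers in order of cost and return the first confident match."""
        for fn in recognisers:
            result = fn(frame)
            if result is not None:
                return result
        return None

    pipeline = [marker_recogniser, feature_recogniser, cloud_recogniser]
    print(recognise(b"...ALBUM...", pipeline))  # -> "album:xyz789"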


http://ift.tt/1AXgbim
