Monday, August 4, 2014

Man Drives An OutRun Cabinet On Roads With Augmented Reality

With everyone obsessing over the upcoming Oculus Rift, it's easy to forget that not so long ago companies like Nintendo and Sony were just as obsessed with augmented reality.
Garnet Hertz, a research scientist in Informatics, has gone one better than the cards typically used for the Vita and 3DS, delivering his project in the form of an amazing customized OutRun cabinet.
It's been designed to accurately read roads and transfer the data to the game on Hertz's screen. The end results are pretty astonishing.

Monday, March 3, 2014

Mobile app "walkthrough": yes or no?

Lately, it seems that many mobile apps, upon first launch, step users through multi-screen “walkthroughs” where all the features of the app are demonstrated, painstakingly, one by one. Worse still are those overlays of help screens that appear at startup, with the scrawled handwriting, circles and arrows best left on whiteboards – not in user education material.
That’s not to say that apps don’t need to explain their functions to users. But some are taking things too far. Long, step-by-step tutorials are not what users want to see when launching apps. And to be clear, I’m talking about apps here, not games. Mobile games are a different beast, with a slightly different set of rules.
The question, simply put: if a mobile app includes a “walkthrough,” has it already failed? That’s the premise of an article trending on Hacker News, written by Visual and User Interface Designer Max Rudberg. And like many posts with catchy, linkbaiting titles (“If you see a UI walkthrough, they blew it”) people are reacting viscerally to the headline without fully digesting the meat of the article’s content. For what it’s worth, I think Rudberg has a point, even if he’s over-generalized the situation.
Currently the top comment on the Hacker News post, as per usual, is contrary. “This is utter nonsense,” the commenter rebuts. But I don’t think that it is. Sure, you can always pull out examples of apps that actually do require extensive walkthroughs, or you can hedge and say, well, people learn things differently – some people appreciate and need walkthroughs! Or you can talk about how your mom or dad wouldn’t know how to do anything on a mobile phone without a tutorial. Or how there are different types of apps and different types of walkthroughs. And sure, that’s all true. It’s like everything else in life: the issue is not black or white. There’s nuance here.
But can we talk about walkthroughs possibly heading into overuse territory?
The problem, in a nutshell, is that in too many cases with new apps, the existence of the walkthrough speaks to core issues with the design. It can come across as lazy. Anything more than a couple of basic pointers sends a message to users that this app is complex, it’s complicated, and now I need to hold your hand while you learn to use it.
That feels wrong on some level. This is mobile. It’s meant to be simple. Accessible.
Apps are not enterprise software with hundreds of functions. They’re not bloated Office suites with overstuffed ribbons filled with a dizzying array of choices. They’re not even desktop software, requiring complicated installations and configuration procedures that once had users phoning I.T. support for help. They are apps, and users generally already know what an app is for before they install it.
Maybe the user found it featured in a list, but they didn’t install it blindly. They saw the list, read the title and/or description, looked at screenshots and thought to themselves, “hmm, a camera that lets you add filters to your photos! I’m going to install that!”
And if they didn’t come across the app serendipitously, then they actually performed a specific keyword-based search and the app appeared in the search results, where they selected it from among dozens of others. Or maybe they installed it after reading about it on the web, or hearing a friend talk about it.
The point is, the user to some extent already knows what the app is for. They’ve got the basics down already. Where do developers think they’re losing them, exactly? It’s not intuitive which button to tap first? Where specific content is found? How to perform a specific task? Does the app have a cold start problem (it’s social, but you have to add friends first)? For many of these challenges, the solution doesn’t have to be a long walkthrough or messy overlay. It can just be a better design. A better layout. A better sign-up flow.
Look, we grew up on a web where we had to click “Next” and “I agree” and where we had to learn to click an “X” to close all those pop-up ads. Walkthroughs often leave users with similar feelings – they’re the cruft that has to be scraped away before you actually get to use the app.
If the developer thinks a walkthrough is still in order, then maybe it can be a screen or two and a few words. That’s still a walkthrough, yes, but a far more tolerable one. Or it can be just some intuitive hints that appear during the app’s use.
At the end of the day, walkthroughs, while sometimes unavoidable, are not the best part about using an app. Using the app is the best part about using the app, and getting there as quickly as possible should be developers’ goal.

Originally posted by Sarah Perez, TechCrunch
Screenshots via Archer Group; mobile tuts

Tuesday, February 18, 2014

Should You Build a Mobile App or a Mobile Website?

Monday, December 16, 2013

Keeping Memory of Horrible Earthquake through Augmented Reality


On the 17th of January 1995, the Great Hanshin Earthquake devastated Kobe and took around 6,000 lives. Nevertheless, Japan always rises again, and everything was eventually reconstructed and restored. To preserve the memory of what the city looked like in the aftermath of the disaster, and to show how it was restored and what has changed, the government has developed a location-based smartphone app.
Visitors to the city can access the map archive and images by downloading the free mobile app junaio for either iOS or Android, and then going to certain places around the city. There are around 98 spots in the Shin-Nagata Minami and Shin-Nagata Kita districts where the photos were taken at the time. When users arrive at a location where image data is available, they just point their phones in the direction where the photo was taken; they can then see what the scene looked like after the earthquake and compare it to what it looks like now.
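The core of such a location-based app is a proximity check: given the user's GPS position, find which archive photo spots are in range. A minimal sketch of that lookup, assuming a haversine great-circle distance; the spot IDs and coordinates below are invented for illustration and are not the app's real data:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * asin(sqrt(a))  # Earth radius ~6371 km

# Illustrative photo spots only -- (spot id, latitude, longitude)
PHOTO_SPOTS = [
    ("shin-nagata-01", 34.6577, 135.1442),
    ("shin-nagata-02", 34.6601, 135.1490),
]

def spots_in_range(lat, lon, radius_m=50):
    """Return IDs of photo spots within radius_m of the user's position."""
    return [sid for sid, slat, slon in PHOTO_SPOTS
            if haversine_m(lat, lon, slat, slon) <= radius_m]
```

Standing at the first spot, `spots_in_range(34.6577, 135.1442)` returns only that spot's ID, since the second is a few hundred meters away.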
The app will be especially interesting for people who want to learn more about the earthquake and the lessons of disaster recovery.

Sunday, December 8, 2013

Structure 3D sensor for mobiles and Augmented reality games

Structure sensor 3D scanner

Nowadays our smartphone is the first computer we switch on and the last one we switch off every day.
There are lots of nifty things you can use a smartphone for, and soon you’ll be able to use it as a 3D object scanner. Structure's 3D sensor camera for mobiles gives the device capabilities similar to those of Microsoft’s Kinect gaming controller: scanning objects and rooms in 3D, playing augmented reality games, and developing mobile apps with 3D vision.

Occipital, the San Francisco company behind the device, says it will start shipping its product in February 2014. Occipital has developed apps that allow people to scan objects in 3D by walking around them, and to scan entire rooms. One shows how the sensor can enable augmented reality, where virtual imagery is overlaid onto the real world when seen through a viewfinder. In that app, a person plays fetch with a virtual cat by throwing a virtual ball that bounces realistically off real-world objects.

If you’re a developer, Structure gives you the ability to build mobile applications that interact with the three dimensional geometry of the real world for the very first time. You can even launch your app on the App Store!

According to Richard Darell on bitrebels, "The Structure Sensor can be used on anything and you don’t have to stand still or place an object into a 3D scanner machine. This device is added on to your iPad, and you can literally walk around and scan entire objects or environments down to the smallest detail. Not only does it scan, meaning creating a wireframe mesh out of the things it sees, but it textures it naturally so it looks like a photorealistic copy of the real thing at the same time. It is a mind-boggling thing to see and to think that it is all possible with a little device added onto your iPad is just an unearthly feeling."

via bitrebels
Augmented reality games, Augmented reality, Structure 3D sensor camera for mobile

Thursday, December 5, 2013

The awesome world of Augmented reality and Artificial Intelligence

It is about time augmented reality moved beyond its old concept of playing a video over a newspaper or a dancing dwarf in a user's hand. Infinity AR is working to build cross-platform tools, including on Google Glass, Optinvent’s Ora glasses, and Meta’s SpaceGlasses, to create augmented reality that fully integrates with the user’s life.

Recently Infinity AR, an augmented reality company, announced a partnership with Beyond Verbal, an Israeli company that decodes and measures human emotions in the voice. This is a concept where augmented reality, with the help of artificial intelligence, enriches our lifestyle.

Founded in 2012, the Tel Aviv-based Beyond Verbal raised $3.8 million in venture capital prior to its launch in May. It licensed its technology to New York City-based Infinity AR, which is building a combined artificial intelligence-augmented reality hub that senses a user’s needs.

Check out this Infinity Augmented Reality concept video:

Did someone say Google Glass? You may want to reconsider after watching this video.
Gone are the days when, the moment you heard someone say augmented reality, the picture of someone holding a marker with a dancing dwarf, or of a video playing over a newspaper, came to mind. Things have changed a lot lately in augmented reality and gesture recognition with the emergence of AR smart glasses and other wearables like wristbands.

Combining AR with AI, and with the help of smart glasses like Google Glass, SpaceGlasses, and Icis, I am pretty sure AR is heading in the right direction.

Awesome world of Augmented reality, Augmented reality, Artificial Intelligence

Sunday, December 1, 2013

Icis Smart glasses : an alternative for Google glass

A new player in the smart-glasses world has arrived in the form of Icis. This new Google Glass alternative is said to be much more immersive than Glass — and much less geeky. I wrote about other smart glasses like Vuzix, Google Glass, and SpaceGlasses in my previous post.

ICIS Smart Glasses
The product is a set of smart glasses — “smartspecks” — that can give you heads-up directions, take pictures, record videos, display Facebook, Twitter, and Instagram updates from your friends, and more. Just like Google Glass, connectivity to the cloud is via your phone. Unlike Google Glass, however, all the intelligence resides in an app on that phone, and it doesn't have an augmented reality or object-tracking feature.

Icis is prescription eyewear that you can use to display notifications from your social networks and turn-by-turn directions in your field of vision. The eyewear will come in a variety of styles and colors and is meant for people who wish to have real-time data relayed to them in a way that is less distracting than having to monitor a smartphone. Icis also features a camera, microphone, and speaker for you to use in whatever way you like.

ICIS Smart Glasses
Icis is a Bluetooth accessory controlled by a smartphone app. The app will be available for iOS, Android, and Windows Phone. When you download the app and pair Icis with your phone, all you have to do is select the apps on the phone that you would like to get real-time updates from, and which navigation app you would like to receive turn-by-turn notifications from.

LaForge’s product, unlike Google Glass, works as a conventional pair of prescription glasses and has a fully immersive user experience — it doesn’t require you to look up and to the right to see a tiny LCD screen, for instance. But it’s the ordinary, average, everyday look of a pair of glasses that is one of the key differences.

To interface with apps on Icis, you use the touchpad and shortcut buttons on the arms of the glasses to do a variety of things, from taking pictures and recording messages to changing display modes.

ICIS Smart Glasses
How does it work?
These "smartspecks" use a custom frame that has the electronic components, such as the battery, display, and Bluetooth antenna, embedded within it. The custom lens can be cut to nearly any shape and lets the wearer see the information on the display in front of them. The app relays the information and data between the glasses and the smartphone.
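The phone-side architecture described above is essentially a notification relay: the app filters updates from the apps the user selected and forwards only those over Bluetooth to the glasses. A minimal sketch of that pattern; every class and method name here is invented for illustration, and the Bluetooth link is mocked rather than using a real stack:

```python
class BluetoothLink:
    """Stand-in for the Bluetooth connection to the glasses (mocked)."""
    def __init__(self):
        self.sent = []

    def send(self, payload):
        # A real implementation would write to the paired device here.
        self.sent.append(payload)


class RelayApp:
    """Phone-side app: filters notifications and relays the selected ones."""
    def __init__(self, link, selected_apps):
        self.link = link
        self.selected = set(selected_apps)

    def on_notification(self, app_name, message):
        if app_name in self.selected:  # user opted in to this app
            self.link.send({"app": app_name, "text": message})


link = BluetoothLink()
relay = RelayApp(link, selected_apps=["Twitter", "Maps"])
relay.on_notification("Twitter", "New mention")
relay.on_notification("Email", "You have mail")  # not selected: dropped
print(link.sent)  # only the Twitter notification reaches the glasses
```

The design choice worth noting is that all filtering happens on the phone, matching the post's point that the intelligence lives in the app rather than in the glasses themselves.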

Original post by John Koetsier, VentureBeat




© 2013 Concept Oman. All rights reserved. Designed by Templateism
