As soon as you open the Blippar app, you will be looking at an object through your device's camera. This immediately activates a digital search, and the app draws information directly from the world around you.
Blippar will launch this new feature by first making English-language album covers, fiction books, DVD covers and movie posters blippable. When you blipp one of these items or images, you will immediately be presented with contextual, snackable information. For instance, if you blipp an album cover, you will be given access to videos of the band, reviews about them on Twitter, a link to buy tickets to an upcoming concert and photos of the band.

This is just the first step in Blippar's mission to make the physical world visually searchable using mobile and wearable devices. In the future, Blippar will gradually add a wide catalogue of blippable objects. The objective is to make almost anything blippable, whether it is an apple, a chocolate bar, a dog on the street or the Eiffel Tower. Web-based search has dominated the digital experience, but that search behavior is limited by vocabulary and literacy. Blippar's image recognition capabilities take search beyond the limitations of language, empowering consumers to instantly pull relevant information from the environment around them.

Blippar's search engine is net neutral and uses the most accurate sources of information within the application. The fluid user interface takes design and color cues from the blipped item. The speed and accuracy of the Blippar platform let users access information faster than conventional web searches, with no added latency. The location-based predictive computing technology uses deep learning and artificial intelligence to improve and personalize visual search results for each user. The updated Blippar application will be available on Android and iOS from April this year.
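To make the described flow concrete, here is a minimal, hypothetical sketch of a visual search pipeline: recognize the object in a camera frame, then attach contextual content for it. The names used here (recognize, blipp, CATALOGUE) and the sample data are illustrative assumptions, not Blippar's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class BlippResult:
    label: str                                    # what the recognizer thinks the object is
    content: dict = field(default_factory=dict)   # contextual, "snackable" information

# Stand-in content catalogue keyed by recognized label (hypothetical data).
CATALOGUE = {
    "album_cover": {
        "videos": ["https://example.com/band-video"],
        "twitter_reviews": ["Loving the new record!"],
        "tickets": "https://example.com/tickets",
        "photos": ["https://example.com/band-photo"],
    },
}

def recognize(image_bytes: bytes) -> str:
    """Stub recognizer; a real system would run a deep-learning image model here."""
    return "album_cover"

def blipp(image_bytes: bytes) -> BlippResult:
    """Recognize the object in a camera frame and bundle contextual content for it."""
    label = recognize(image_bytes)
    return BlippResult(label=label, content=CATALOGUE.get(label, {}))

if __name__ == "__main__":
    result = blipp(b"fake-camera-frame")
    print(result.label, list(result.content))
```

The point of the sketch is simply that the image itself acts as the query, so the lookup needs no typed vocabulary at all, which is the limitation of web search the article describes.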