Touch is not the only medium of interactivity. We offer a wide variety of interactive solutions.
Cameras capture hand movement in front of a display, enabling interaction with onscreen content without any physical contact between hand and screen.
Computer vision technology can identify age, gender, group size, views, impressions, and more in real time. Use Intuiface both to capture this information for analysis – such as an ad’s audience metrics – and to trigger onscreen content personalized for predefined demographic categories.
Similar in concept to beacon technology, RFID/NFC systems uniquely identify tagged items. Our solution can communicate with any RFID/NFC reader, such as those from Nexmosphere, using the captured identity to show individualized information or trigger any of 200+ possible actions.
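As a sketch of how a captured tag identity can drive individualized content, the snippet below maps tag IDs reported by a reader to content to display. The `TAG=` line format and the content mapping are illustrative assumptions, not the actual Nexmosphere protocol.

```python
from typing import Optional

# Hypothetical mapping from tag IDs to content identifiers.
CONTENT_BY_TAG = {
    "04A1B2C3": "sneaker-product-page",
    "04D4E5F6": "jacket-product-page",
}

def handle_reader_line(line: str) -> Optional[str]:
    """Return the content ID to show for a tag-read line, if any."""
    line = line.strip()
    if not line.startswith("TAG="):
        return None  # ignore heartbeat or status lines
    tag_id = line[len("TAG="):]
    return CONTENT_BY_TAG.get(tag_id)
```

In practice, a reader integration would feed each line arriving over the serial or USB connection through a dispatcher like this one.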
Our solution supports the spoken word, capturing information or reacting to commands issued through Amazon Alexa or Google Home with any of 200+ possible on-screen or behind-the-scenes actions. This is crucial for creating accessible experiences for those who can’t effectively touch the display.
The Internet of Things is the universe of network-accessible devices that can send and receive information, everything from room lights and thermostats to your refrigerator. Our solution can communicate with and direct all of these devices in real time.
Tangible objects are items whose presence and orientation can be detected by a display. Intuiface can react to these objects – aka tangible object recognition – on any display/middleware combination supporting the TUIO protocol, treating them as unique identifiers as well as points of interest for displaying interactive content.
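To illustrate what a TUIO-based detection looks like, here is a minimal sketch of interpreting the argument list of a TUIO 1.1 `/tuio/2Dobj` "set" message, which carries a fiducial (class) ID plus normalized position and rotation. The OSC-over-UDP transport (typically port 3333) is omitted; this only shows the interpretation step.

```python
def parse_2dobj_set(args):
    """Interpret the arguments of a /tuio/2Dobj message.

    Returns a dict for "set" messages (one tracked-object update), or
    None for "alive", "fseq", and "source" bookkeeping messages.
    """
    if not args or args[0] != "set":
        return None
    # session id, fiducial (class) id, normalized x/y, angle in radians
    s, i, x, y, a = args[1:6]
    return {"session": s, "fiducial": i, "x": x, "y": y, "angle": a}
```

The fiducial ID acts as the unique identifier mentioned above, while x, y, and angle locate the object so content can be rendered around it.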
With beacon support, your Intuiface experiences can uniquely identify select items, be notified when approaching an item, or broadcast URLs in response to user choice. Create lift-and-learn scenarios for your store or personalized browsing experiences for your museum.
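As one concrete example of URL broadcasting, the sketch below decodes an Eddystone-URL advertising frame, a public beacon format for compressing a URL into a Bluetooth advertisement. The frame layout (frame type 0x10, TX power byte, scheme prefix byte, compressed URL) follows the published Eddystone specification.

```python
_SCHEMES = ["http://www.", "https://www.", "http://", "https://"]
_EXPANSIONS = [".com/", ".org/", ".edu/", ".net/", ".info/", ".biz/",
               ".gov/", ".com", ".org", ".edu", ".net", ".info",
               ".biz", ".gov"]

def decode_eddystone_url(frame: bytes) -> str:
    """Expand the compressed URL carried in an Eddystone-URL frame."""
    if frame[0] != 0x10:
        raise ValueError("not an Eddystone-URL frame")
    url = _SCHEMES[frame[2]]          # frame[1] is TX power, skipped here
    for b in frame[3:]:
        # Byte codes below 14 are shorthand for common URL fragments.
        url += _EXPANSIONS[b] if b < len(_EXPANSIONS) else chr(b)
    return url
```

A museum beacon broadcasting such a frame lets a visitor's phone open exhibit content without any app installed.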
Intuiface experiences can communicate with one another across any network. Triggers in one experience launch any of 200+ actions in another. And a remote action API enables third-party apps to control Intuiface experiences from afar, or be controlled by them.
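To give a feel for remote control from a third-party app, here is an illustrative sketch of building an HTTP request that triggers a named action in a running experience. The hostname, port, endpoint path, and JSON shape are assumptions for illustration only, not Intuiface's actual remote action API.

```python
import json
import urllib.request

def build_action_request(host: str, action: str,
                         params: dict) -> urllib.request.Request:
    """Build (but do not send) an HTTP request triggering a named action."""
    body = json.dumps({"action": action, "parameters": params}).encode()
    return urllib.request.Request(
        f"http://{host}:8000/remote-action",  # hypothetical endpoint
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

Sending the request with `urllib.request.urlopen(req)` would then deliver the command over the local network.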
With its support for web triggers, Intuiface can receive navigation and selection commands from web content running on mobile devices. Alternatively, third-party software can send mouse commands from a personal device to a display, simulating tap and drag gestures.
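The kind of mapping behind a simulated tap or drag can be sketched as translating a normalized touch point sent from a phone into display pixel coordinates. The function name and the 0..1 coordinate convention are assumptions for illustration.

```python
def to_display_pixels(nx: float, ny: float,
                      width: int, height: int) -> tuple:
    """Map normalized (0..1) remote-device coordinates to display pixels."""
    nx = min(max(nx, 0.0), 1.0)  # clamp so off-screen drags stay on screen
    ny = min(max(ny, 0.0), 1.0)
    return round(nx * (width - 1)), round(ny * (height - 1))
```

A drag gesture is then just a stream of such points, each mapped and injected as a mouse-move event on the display.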
Copyright © 2023 click-ix.com