by Roaz » Thu Feb 28, 2013 1:40 pm
"Augmented reality worlds" is a broad term that can apply to many situations. We can imagine looking at a ball floating in the water and, through image processing, detecting it and turning it into a sea monster; we can turn an empty swimming pool into a tropical aquarium; or we can play war games against virtual enemies. The main question about augmented reality concerns the reference points used in the real world. Through image processing we can identify objects and use them as those anchors. Alternatively, we can use Ziphius's relative position and add virtual elements around it without external anchors.
In our drone, image processing capabilities are also the basis for autonomous behaviors: it can detect colours or shapes and follow them (or perform any other action).
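To give an idea of how colour-based detection can work, here is a rough Python/OpenCV sketch (illustrative only, not our actual firmware code): it thresholds a camera frame in HSV space and returns the centre and size of the largest region matching a target colour.

import cv2
import numpy as np

def find_colour_blob(frame_bgr, hsv_low, hsv_high):
    """Return (centroid_x, centroid_y, area) of the largest region whose
    colour falls inside [hsv_low, hsv_high], or None if nothing matches."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_low), np.array(hsv_high))
    # Remove small speckles so reflections on the water are less of a problem
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    # [-2] keeps this working across OpenCV versions with different return values
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
    if not contours:
        return None
    blob = max(contours, key=cv2.contourArea)
    m = cv2.moments(blob)
    if m["m00"] == 0:
        return None
    return m["m10"] / m["m00"], m["m01"] / m["m00"], cv2.contourArea(blob)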
Right now, our creative team is working out a set of games and entertainment activities that will use these capabilities to create interesting apps, but we haven't implemented them yet.
Because detecting colours is one of the easiest ways to get anchors and objects to follow, we are considering letting users set the colours the drone should follow or identify. This way, the games can be played with objects the user chooses. Alternatively, for some specific games, we are thinking of selling the required objects.
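For example, a colour the user picks in the app could be turned into a detection range for the sketch above roughly like this (the tolerance values here are made up for illustration):

import cv2
import numpy as np

def colour_to_hsv_range(rgb, hue_tol=10, sat_tol=70, val_tol=70):
    """Convert a user-picked RGB colour into a loose HSV range."""
    r, g, b = rgb
    h, s, v = cv2.cvtColor(np.uint8([[[b, g, r]]]), cv2.COLOR_BGR2HSV)[0, 0]
    low  = (max(int(h) - hue_tol, 0),   max(int(s) - sat_tol, 0),   max(int(v) - val_tol, 0))
    high = (min(int(h) + hue_tol, 179), min(int(s) + sat_tol, 255), min(int(v) + val_tol, 255))
    return low, high

# e.g. follow a bright orange ball the user tapped on screen:
low, high = colour_to_hsv_range((255, 120, 0))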
Because we can detect the relative position of the colour we are following within the captured image, and because we have individual control of each propeller, we can drive Ziphius autonomously in the object's direction. The amount of the target colour visible gives us an indication of the object's proximity, especially for objects whose size we already know.
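As an illustration of that steering idea (with made-up gains and a simple two-propeller model, not our real control code): the horizontal offset of the blob from the image centre turns the drone, and a large blob area means the object is close enough to stop.

def steer_towards_blob(centroid_x, blob_area, frame_width,
                       base_speed=0.5, turn_gain=0.8, stop_area=15000):
    """Return (left, right) propeller commands in [0, 1]."""
    if blob_area > stop_area:                 # big blob => object is close, stop
        return 0.0, 0.0
    # Offset of the target from the image centre, normalised to [-1, 1]
    offset = (centroid_x - frame_width / 2.0) / (frame_width / 2.0)
    left  = base_speed + turn_gain * offset   # target on the right => speed up left prop
    right = base_speed - turn_gain * offset   # and slow down the right one (and vice versa)
    clamp = lambda x: max(0.0, min(1.0, x))
    return clamp(left), clamp(right)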
We don't have videos of these behaviors yet because we are still finalizing the image processing algorithms, but we hope to share some soon!