After a few months of prototyping and testing with the Google Mirror API, we changed direction and built a native Google Glass app with the GDK that interfaces with the Indigo RESTful API.
I don't know if anyone else has Google Glass, but if you want to test this for us, let me know.
Here is how it works:
1) Sideload APK onto your device
2) Start it by saying: Ok Glass, run connected home
3) A screen appears, and you tap it once to issue a voice command.
4) We followed the same patterns that were used in the Siri Proxy proof of concept:
a) Turn on <Device Name>
b) Turn off <Device Name>
c) Toggle <Device Name>
d) Set Brightness of <Device Name> to ## percent
e) Execute <Action Group Name>
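For illustration, the command patterns above could be matched against the transcribed text roughly like this. This is just a Python sketch of the matching logic, not the actual GDK code; the regexes, function name, and example phrases are my own:

```python
import re

# One (pattern, action) pair per supported phrase. IGNORECASE because the
# speech-to-text output is unpredictable about casing.
PATTERNS = [
    (re.compile(r"^turn on (.+)$", re.IGNORECASE), "on"),
    (re.compile(r"^turn off (.+)$", re.IGNORECASE), "off"),
    (re.compile(r"^toggle (.+)$", re.IGNORECASE), "toggle"),
    (re.compile(r"^set brightness of (.+) to (\d+) percent$", re.IGNORECASE), "brightness"),
    (re.compile(r"^execute (.+)$", re.IGNORECASE), "execute"),
]

def parse_command(text):
    """Map a transcribed phrase to (action, target, value-or-None)."""
    for pattern, action in PATTERNS:
        m = pattern.match(text.strip())
        if m:
            value = m.group(2) if action == "brightness" else None
            return action, m.group(1), value
    return None  # unrecognized phrase

parse_command("turn on kitchen lights")  # → ("on", "kitchen lights", None)
```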
Response time is nearly instant. Our previous Mirror API code took up to 10 seconds; this version responds in 1 second or less.
One issue we have is with capitalization. Because we use the Google speech API to turn the voice command into text, device names come back in lower case. Since the RESTful API is case sensitive, "Kitchen Lights" is different from "kitchen lights", so I had to rename all of my devices and actions to lower case to make it work.
We will try to address this in a future build.
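One possible fix, sketched here in Python (the device names are made up, and in practice the list would come from Indigo's device listing rather than being hard-coded), is to map the lowercase transcription back to the exact name the server expects:

```python
def build_lookup(device_names):
    """Index the exact (case-sensitive) names by their lowercase form."""
    return {name.lower(): name for name in device_names}

def resolve(spoken, lookup):
    """Return the canonical name for a transcribed phrase, or None."""
    return lookup.get(spoken.strip().lower())

# Hypothetical device list for illustration.
devices = build_lookup(["Kitchen Lights", "Den Lamp"])
resolve("kitchen lights", devices)  # → "Kitchen Lights"
```

That would let devices keep their normal names on the server while the Glass side stays case-insensitive.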
We have some additional clean-up to do, including moving the server IP, username, and password out of the application config and into an external file, or adding a config screen on Glass itself that would let you set the server host name, user ID, and password.
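As a sketch of what an external settings file could look like, here is a minimal example using an INI-style format. The section and key names are my assumptions, not what the actual build will use:

```python
import configparser

# Example contents of a hypothetical external settings file.
SAMPLE = """\
[server]
host = indigo.local
port = 8176
username = glassuser
password = secret
"""

def load_settings(text):
    """Parse the [server] section into a plain dict."""
    cfg = configparser.ConfigParser()
    cfg.read_string(text)
    s = cfg["server"]
    return {"host": s["host"], "port": s.getint("port"),
            "username": s["username"], "password": s["password"]}

settings = load_settings(SAMPLE)
```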
Still a work in progress, but it demonstrates next-generation control over your Indigo connected home and works great.
I realize that some of you think this is pointless (per the comments on a previous thread), but the implications of this are pretty big, especially for use cases like a security guard, disabled person, etc.
constructive comments appreciated.
dave