As far as your code using NLP goes - it is, in the strict sense: it's English, and you're parsing it.
Is it "state of the art"? No, but it's clever, and I applaud you for trying it out - it's hard to do NLP at any level.
Your real issues (and I'm guessing here) stem from #3, which is the base problem that NLP solves - "turn on the lights" should be treated the same as "turn the lights on". Sounds simple, but the permutations, even at a simple level, are mind-boggling to account for.
The code I have working uses a professional NLP engine for parsing the grammar into base commands, so that "turn on the kitchen lights" translates to
Action: Turn On
Target: Lights
Location: Kitchen
So you don't need to worry about any of the NLP processing (trust me - let someone else deal with that headache). Once you have the English parsed into base commands, the actions are pretty easy.
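To make the Action/Target/Location idea concrete, here's a toy sketch of that kind of parse. Everything in it (the function name, the vocabulary tables) is made up for illustration - a real NLP engine does vastly more than this keyword matching:

```python
# Toy illustration only - a real NLP engine handles far more grammar.
# All names here (parse_command, ACTIONS, etc.) are invented for the sketch.

ACTIONS = {"turn on": "Turn On", "turn off": "Turn Off", "dim": "Dim"}
TARGETS = {"lights", "thermostat", "fan"}
LOCATIONS = {"kitchen", "bedroom", "living room"}

def parse_command(text):
    """Reduce an English command to Action/Target/Location fields."""
    text = text.lower()
    result = {"Action": None, "Target": None, "Location": None}
    for phrase, action in ACTIONS.items():
        if phrase in text:
            result["Action"] = action
            break
    for target in TARGETS:
        if target in text:
            result["Target"] = target.title()
            break
    for location in LOCATIONS:
        if location in text:
            result["Location"] = location.title()
            break
    return result

print(parse_command("turn on the kitchen lights"))
# -> {'Action': 'Turn On', 'Target': 'Lights', 'Location': 'Kitchen'}

print(parse_command("turn the lights on"))
# -> {'Action': None, 'Target': 'Lights', 'Location': None}
```

Note the second example: "turn the lights on" fails because the words are reordered - exactly the permutation problem above, and exactly why handing the grammar off to a real engine is worth it.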
Now, that all being said, I'll tell you what stalled my project (other than lack of time to work on it) - verbal dialog.
The first problem is pretty easy for me. After a command is issued like "turn on the lights", I can see that the location is missing, send back text asking "Which lights do you mean?", and then parse the reply. Pretty standard when you are using a keyboard, and thus text only.
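That follow-up-question loop is simple enough to sketch. The parser below is a stand-in stub (a real engine fills these fields), and the function names are my own invention:

```python
# Sketch of the clarifying-question loop ("Which lights do you mean?").
# parse_command here is a stand-in stub; a real parser fills these fields.

def parse_command(text):
    # Stub parser: only recognizes one location, for illustration.
    loc = "Kitchen" if "kitchen" in text.lower() else None
    return {"Action": "Turn On", "Target": "Lights", "Location": loc}

def handle(command, ask_user):
    """If a required slot is missing, ask a follow-up and fill it in."""
    parsed = parse_command(command)
    if parsed["Location"] is None:
        reply = ask_user("Which %s do you mean?" % parsed["Target"].lower())
        parsed["Location"] = parse_command(reply)["Location"]
    return parsed

# Simulated text dialog: the user answers "the kitchen ones"
result = handle("turn on the lights", ask_user=lambda q: "the kitchen ones")
print(result)
# -> {'Action': 'Turn On', 'Target': 'Lights', 'Location': 'Kitchen'}
```

With a keyboard, `ask_user` is just printing a prompt and reading a line - which is exactly why the text-only version of this is the easy half.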
But the second problem is making this a verbal dialog. Your solution/hack of using Siri with the Notes feature is really clever (applause for figuring that out), but it lacks the next step: having Siri come back and ask "Which lights do you mean?"
I think the solution needs to be a phone-based app that can take dictation and send it up to the server to be parsed (I have hacked this together), plus something that converts the server's text reply - "Which lights do you mean?" - into spoken (verbal) dialog. After all, that's what we want, right - interactive dialog?
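For what it's worth, the text-to-spoken-dialog half is cheap on the server side if the server is the Mac running Indigo, since macOS ships with a built-in `say` command for text-to-speech. A minimal sketch (the function names are mine, not an Indigo API):

```python
# Sketch of the server's text -> speech half, assuming the server is the
# Mac running Indigo. macOS's built-in `say` command does text-to-speech.
import subprocess

def say_argv(text, voice=None):
    """Build the argv for macOS's `say`; -v selects an optional voice."""
    argv = ["say"]
    if voice:
        argv += ["-v", voice]
    return argv + [text]

def speak(text, voice=None):
    subprocess.call(say_argv(text, voice))

# speak("Which lights do you mean?")  # audible on a Mac
```

That still leaves the harder half - getting the audio out of the phone's speaker instead of the server's - which is why I keep coming back to a dedicated phone app.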
So my long post here is to say that unless we attack this with a bi-directional phone app, I think we will be disappointed in the end result - it just won't do enough.
I don't know how to code iPhone apps (but under enough pressure I could learn), but there's something interesting on the horizon: rumors are floating around that Apple will announce some Home Automation hooks into Siri at WWDC on June 2nd.
Now I hate waiting for anything like this, but since it's very soon, I think we should wait to see what they have in store. If it's good, I can re-up my developer's license and dive in. If not, we can see if we can get someone (or me) to develop a phone app as the way you interact with Indigo.
Either way, I think a solution is possible, and it could be very impressive.
Comments?