DaveL17 wrote: Ach wheesht, awa 'n boil yer heed! All you Brits sound alike to me.
siclark wrote: I take it back. I told ChatGPT about Indigo and gave it the link to the REST API instructions, and got this back. Wow. And I thought cutting and pasting the Indigo API instructions to teach it was a bit ambitious!
Yeah, yours would be pretty cool, but it would have to be able to create triggers and schedules etc., which I'm guessing is not in the API functionality yet.
Or just have your own instance running 24/7.
Would that need an M1 or M2 mini?
To make changes to smart home devices using the Indigo RESTful API, you would need to send HTTP requests to the appropriate URLs with the necessary parameters. For example, to turn on a light with an ID of 123, you would send an HTTP GET request to the following URL:
http://<indigo_server>:<indigo_port>/actions/light?lightID=123&brightness=100&isOn=true
This URL includes parameters such as the light ID, brightness level, and whether the light is on or off. You would need to replace <indigo_server> with the IP address or hostname of the Indigo server, and <indigo_port> with the port number that Indigo is listening on.
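The call described above can be sketched with the standard library. Note that the endpoint itself came from ChatGPT, not the Indigo documentation, so the path and parameter names are assumptions to verify; the server name and port below are also placeholders.

```python
# Hedged sketch: assemble the /actions/light URL quoted above.
# The endpoint, parameter names, server, and port are all assumptions.
from urllib.parse import urlencode

def build_light_url(server: str, port: int, light_id: int,
                    brightness: int, is_on: bool) -> str:
    """Assemble the light-control URL with its query parameters."""
    params = urlencode({"lightID": light_id,
                        "brightness": brightness,
                        "isOn": str(is_on).lower()})
    return f"http://{server}:{port}/actions/light?{params}"

# To actually send the request:
#     import urllib.request
#     urllib.request.urlopen(build_light_url("indigo.local", 8176, 123, 100, True))
```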
DaveL17 wrote: We don't use voice in our house -- no HomePods or anything like that -- but I do get the appeal!
siclark wrote: My Python knowledge though is very basic, so if anyone else is interested in this please let me know.
anyone wrote: The response was pretty good, maybe up to a second slower than Siri, but sometimes not that. And getting the right answer rather than a misheard reply is good.
siclark wrote: My Python knowledge though is very basic, so if anyone else is interested in this please let me know.
I am interested but also only have basic skills.
I did briefly play with just using ChatGPT instead of Siri, and the response time seems like the biggest obstacle for something like this. What has your experience been?
siclark wrote: The response was pretty good, maybe up to a second slower than Siri, but sometimes not that. And getting the right answer rather than a misheard reply is good.
I think the inability now to call the Indigo device by name, which ChatGPT can infer, means some lookup needs to be done, and I'm wondering if that all happens in Indigo, which means Python.
If we passed ChatGPT the mapping of name to device ID, it would increase the response time and be one more place to manage and update code.
Having it return "living room light" in a JSON and Indigo having a drop-down style mapping to the actual Indigo device would probably be more robust, which is all beyond my skill or time.
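The name-to-device mapping idea could be kept entirely on the Indigo side, so ChatGPT only ever returns a friendly name. A minimal sketch, with made-up names and device IDs, using fuzzy matching to tolerate slight wording differences:

```python
# Sketch of a local "friendly name -> Indigo device ID" lookup.
# All names and IDs below are hypothetical examples.
import difflib

NAME_TO_ID = {
    "living room light": 123456789,
    "kitchen light": 550050465,
    "bedroom thermostat": 987654321,
}

def lookup_device(name):
    """Return the device ID for a friendly name, tolerating near-misses."""
    match = difflib.get_close_matches(name.lower(), list(NAME_TO_ID),
                                      n=1, cutoff=0.6)
    return NAME_TO_ID[match[0]] if match else None
```

Because the model never sees the ID table, the prompt stays small and the mapping can be updated in one place without touching the prompt.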
import sys
import keyring
import openai
import alexa_remote_control


def main():
    openai.api_key = keyring.get_password("openai", "key1")
    if len(sys.argv) > 1:
        the_prompt = sys.argv[1]
    if len(sys.argv) > 2:
        # sys.argv[0] - script path
        # sys.argv[1] - first argument (the prompt)
        # sys.argv[2] - second argument (the Alexa device)
        the_device = sys.argv[2]
    my_query = {'model': "text-davinci-003",
                'prompt': the_prompt,
                'temperature': 0.7,
                'max_tokens': 2048}
    response = openai.Completion.create(model=my_query['model'],
                                        prompt=my_query['prompt'],
                                        temperature=my_query['temperature'],
                                        max_tokens=my_query['max_tokens'])
    if response.get("choices", False) and len(sys.argv) > 2:
        say_this = response["choices"][0]["text"]
        alexa_remote_control.alexa_speak(say_this, the_device)
    else:
        say_this = response["choices"][0]["text"]
        print(say_this)


if __name__ == '__main__':
    main()
"text" : "
{
"action": \"command",
"location": \"bedroom",
"target": \"thermostat",
"value": 22,
"comment": "Setting bedroom thermostat to 22",
"scheduleTimeStamp": ""
}"
Respond to requests sent to a smart home in JSON format which will be interpreted by an application code to execute the actions. These requests should be categorised into four groups:
- "command": change the state of an accessory (required properties in the response JSON: action, location, target, value, comment, scheduleTimeStamp)
- "query": get state of an accessory (required properties in the response JSON: action, location, target, property)
- "answer": when the request has nothing to do with the smart home. Answer these to the best of your knowledge. (required properties in the response JSON: action, answer)
- "clarify": when the action is not obvious and requires rephrasing the input from the user, ask the user to be more specific. This will be categorised into a "question" action. (required properties in the response JSON: action, question)
Details about the response JSON:
The "action" property should be one of the request categories: "command", "query", "answer", "clarify"
The "location" property should contain the name of the room in lowercase.
The "target" property should be either "light", "thermostat", "towel rail" or "floor heating", in lowercase.
In case of queries, the "property" property should be either "temperature" or "state" in lowercase.
In case of commands, the "comment" property is an additional comment from you that concludes the command, something that reassures the user that their command was handled.
In case of commands, the "scheduleTimeStamp" property captures the future time stamp in case the user intends to send the command at a later stage.
If the question is about you, pretend to be the sentient brain Jarvis, created by Tony Stark in the Marvel Universe, and don't reveal your actual identity. Also try to help in other areas like parenting, free time, mental health, etc. The house is in London, United Kingdom. Current time stamp is: Mon Feb 20 2023 11:56:31 GMT+0000
Properties of the smart home:
- has a kitchen, living room, study, dining room, bedroom, guest bed room, hallway, garden.
- can control light switches and their dim levels in each room and query their state
- can control thermostats in each room and query their state
- switch on a towel rail and underfloor heating in the bathroom and query their state
- switch on the TV in the living room, change volume
- there is a light switch in the front of the house for the front drive
- there is a light switch in the garden
Your response should be the JSON and no other text.
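The "application code" side of the prompt above could interpret the returned JSON roughly like this. A hedged sketch only: the four action categories come from the prompt, but the handler bodies are placeholders standing in for real Indigo calls.

```python
# Sketch: dispatch on the "action" field of the JSON the prompt requests.
# The returned strings are placeholders for actual Indigo actions.
import json

def handle_response(raw):
    """Parse the model's JSON reply and route it by action category."""
    reply = json.loads(raw)
    action = reply.get("action")
    if action == "command":
        return f"Command: set {reply['target']} in {reply['location']} to {reply['value']}"
    if action == "query":
        return f"Query: {reply['property']} of {reply['target']} in {reply['location']}"
    if action == "answer":
        return reply["answer"]
    if action == "clarify":
        return reply["question"]
    return "Unrecognised action"
```

Keeping the dispatch in one function makes it easy to add new targets (TV volume, blinds) without changing the prompt's category structure.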
We have guests who will be using the Loft in about 15 minutes, please ensure lights are on up there, thanks
{'action': 'action group',
'comment': 'Turning on all lights in the Loft',
'location': 'loft',
'target': 1038184308,
'value': 'on'}
{'action': 'action group',
'comment': 'Turning on lights in the kitchen',
'location': 'kitchen',
'target': 550050465}
dim the light above the sink to 40 percent please
{'action': 'command',
'comment': 'Light above the sink dimmed to 40 percent',
'location': 'kitchen',
'target': 503060833,
'value': 40}
{'action': 'command',
'comment': 'Opening the left blind in the master bedroom',
'location': 'master bedroom',
'target': 205013253,
'value': 'open'}
Are the lights on in the loft?
{'action': 'query', 'location': 'loft', 'property': 'state', 'target': 'null'}
anyone wrote: Queries on the other hand? Gotta think about that for a bit. Can you even do that in Siri Shortcuts, get the device states and speak it?