Hi all,
I wonder how everyone who is trying this is getting on?
I spent a bit of time supporting ChatGPT within the iMessage plugin: a relatively simple drop-in replacement for the current wit.ai natural-language-processing model, which predates it and is still supported. I'll do my best to fine-tune this and release it; like the wit.ai support, it sits to one side unless enabled.
Working well: all the ChatGPT chatting bits
iMsg to ‘Indigo’: “What’s the capital of Turkey?”
iMsg to ‘Indigo’: “Tell me a joke”
iMsg to ‘Indigo’: “Write a Year 10 essay on modernism in 400 words.”
Working about 90% of the time:
iMsg to ‘Indigo’: “Turn on kitchen main lights”
Feedback
- The ChatGPT API has a new model, gpt-3.5-turbo, with a different setup mechanism (via user/system roles) that is about 10x cheaper than text-davinci. It seems similar, although not quite the same in my testing.
- ChatGPT doesn’t remember any prior info, so you need to send the whole setup each time, including deviceIDs and aliases. This leads to big requests and quite a lot of token usage (still only a few cents from a day of heavy testing, though).
- ChatGPT often, and I mean often (somewhere around 1 in 5 requests), returns malformed JSON which can’t be parsed without a crash. Typically it seems to want to add something before the opening {. Catching this and trimming it out fixes things about half the time, but an awful lot of the time the JSON is still unreadable. Or it forgets entirely and sends plain text.
eg. ask it to “turn on light” without specifying a location: it should ask for clarification, but more often than not it just makes up a location, and it is even more likely to malform the JSON in these cases.
- The new gpt-3.5-turbo model seems slightly worse at creating JSON in my testing. It does seem more targeted at language (not surprising for a language model). If this trend continues, I’d guess JSON output won’t really be usable any time soon.
- It doesn’t use correct JSON quoting: single or double quotes in the given text will typically produce invalid JSON.
- For device control, in my limited testing, wit.ai is probably more accurate for standard commands. Annoyingly, wit.ai now needs a Facebook login to create an API key; thankfully my older ones still work. The setup for wit.ai is far more complicated (though the plugin does it all): devices, entities, and intents have to be created within an online app, which the plugin handles (it just needs the API key), and at startup the plugin sends hundreds of examples based on devices marked as per the plugin’s documentation.
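Because the API is stateless, the second bullet above boils down to rebuilding and resending the full device context on every single request. A rough sketch of what that can look like (the device names, IDs, and system-prompt wording here are my own invented examples, not the plugin's actual ones):

```python
import json

# Hypothetical device table; in the real plugin this would come from
# Indigo's device list and the aliases the user has configured.
DEVICES = {
    "kitchen_main": {"deviceID": 123456, "aliases": ["kitchen main lights"]},
    "hall_lamp": {"deviceID": 234567, "aliases": ["hall lamp", "hallway light"]},
}

def build_chat_request(user_text: str) -> dict:
    """Build the JSON body for POST /v1/chat/completions.

    The model remembers nothing between calls, so the whole device
    table rides along in the system message of every request -- which
    is exactly where the token usage (and cost) comes from.
    """
    system_prompt = (
        "You control a home-automation system. Respond ONLY with JSON of "
        'the form {"deviceID": <int>, "action": "<on|off>"}. '
        "Known devices:\n" + json.dumps(DEVICES)
    )
    return {
        "model": "gpt-3.5-turbo",
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_text},
        ],
    }
```

The system/user role split is what differs from the older text-davinci completion endpoint, which took a single flat prompt string instead of a messages list.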
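The malformed-JSON workarounds mentioned above (trimming the chatter the model adds before the opening brace, and coping with wrong quote characters) might look roughly like this. This is a sketch of the general salvage technique, not the plugin's actual code:

```python
import json

def extract_json(reply: str):
    """Try to salvage a JSON object from a model reply.

    Handles two common failure modes: extra text before/after the
    braces, and single quotes used where JSON requires double quotes.
    Returns the parsed dict, or None if nothing parseable survives
    (e.g. the model forgot JSON entirely and sent plain text).
    """
    start, end = reply.find("{"), reply.rfind("}")
    if start == -1 or end < start:
        return None  # no braces at all: plain-text reply
    candidate = reply[start:end + 1]
    try:
        return json.loads(candidate)
    except json.JSONDecodeError:
        pass
    # Second attempt: naively swap single quotes for double quotes.
    # Crude (it would corrupt values containing apostrophes), but it
    # rescues a fair share of otherwise-unparseable replies.
    try:
        return json.loads(candidate.replace("'", '"'))
    except json.JSONDecodeError:
        return None
```

Wrapping every parse attempt like this at least turns the "1 in 5" malformed replies into a graceful "please rephrase" path instead of a crash.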
I suspect the AI future is probably still HomeKit, and that Siri/Apple will advance in leaps and bounds based on this, the Google competitor, and Apple’s own in-house AI projects…
But I’m interested: how accurate are others finding it?
I’ll see if I can tidy up the plugin, make sure it supports both davinci and gpt-3.5-turbo, and see if I can add support for a /message/ URL link to send messages in and out for use in Shortcuts…
Glenn
Sent from my iPad using Tapatalk