Integration with InfluxDB and Grafana for graphing

Posted on
Fri Nov 06, 2015 4:56 pm
jtodd offline
Posts: 76
Joined: Apr 15, 2014

Integration with InfluxDB and Grafana for graphing

I need to graph things out of Indigo. Electric use, temperatures, solar power production, etc. etc. etc. The fact that there's no basic graphing tool included with Indigo is kind of a bummer - after all, the next thing that comes after "control" is "analysis".

So I found Karl Wachs' IndigoPlotD (viewtopic.php?f=165&t=11983) which is AMAZINGLY useful. I quickly created lots of different graphs, and his tool is super-flexible for getting things done. However... the interface is pretty awful, and I found that I was spending HOURS (actually, days) trying to frenzy-click my way through all of the pages of stuff and finding subtle bugs. This isn't his fault (I think) - it's a limitation of the way Indigo treats plugins. Having to add all the values I wanted to graph, and then defining what kind of values they were, and then defining the graphs, and then defining each line - it was super-tedious, and I found several bugs that caused me to have to start from scratch a few times.

I wrote a script to pull data from my SMA Solar (Sunnyboy) panels via SMASpot (https://www.indigodomo.com/library/369/), and that was the last straw: I had many variables I wanted to track, and every menu now had dozens and dozens of options, which made working with IndigoPlotD extremely difficult.

So I wanted something that would "just work" with Indigo, and store and plot all of my values for EVERYTHING. Disk is cheap, CPUs are fast - just log it all and sort it out later. Personally, I like the GNUPlot look from IndigoPlotD better, but I just burned myself out trying to get it working with all the stuff I had.

There are a bunch of newish graphing tools in use by the DevOps community, and the data in Indigo seemed like a good match for some of them. I looked at InfluxDB with Grafana on top of it, which seemed to work well for operational tasks. There are many other solutions for graphing and time-series data (Graphite, Postgres, OpenTSDB, etc.), but I picked InfluxDB and Grafana because they seemed to be the quickest route for a RESTful import of Indigo data.

So I wrote this script to pull everything out of Indigo's REST API and stuff it into InfluxDB.

There is a shortcoming to this script: not all possible values from Devices are captured. See viewtopic.php?f=109&t=14358#p105051 for a conversation about this problem, but it's not major, and there is a workaround of setting variables to the device data that you really want.

I am currently running it on a demo instance hosted at InfluxDB.com, but I'll migrate to a VM running InfluxDB and Grafana at home shortly. Note that InfluxDB and Grafana are NOT proprietary services; they are open-source program suites. I just chose to get started on a free cloud instance. Converting to your own instance should be super-trivial: just change the hostname/port/authentication data. Make sure you've created a database, then give the username you have created the correct permissions (e.g. 'grant ALL on "indigo-data" to indigowrite').
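For reference, the one-time setup on the InfluxDB side looks something like this (0.9-era InfluxQL via the "influx" CLI; the hostname and admin credentials below are placeholders for whatever your instance gives you):

Code: Select all
# create the database, a write-capable user, and grant it access
influx -host your-influxdb-host -port 8086 -username admin -password 'adminpass' \
  -execute 'CREATE DATABASE "indigo-data"'
influx -host your-influxdb-host -port 8086 -username admin -password 'adminpass' \
  -execute "CREATE USER indigowrite WITH PASSWORD 'blahblahDEblah'"
influx -host your-influxdb-host -port 8086 -username admin -password 'adminpass' \
  -execute 'GRANT ALL ON "indigo-data" TO "indigowrite"'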

Once I had the data flowing out, it only took me about 10 seconds to create the graph I show as an attachment. Adding lines to graphs, or whole graphs to dashboards is incredibly easy, and there are enough options to make it quite useful.

One of the things I couldn't figure out was the timestamps in the API data. They were kind of wacky and didn't really look like UNIX Epoch timestamps; converted as such they landed sometime in the 1980s. I even tried subtracting them from the current time, which led to dates around Jan 1, 2000. They were too long to be seconds-since-last-change - they were 500,000,000 or more in many cases, and I've rebooted/restarted/updated devices much more recently than that. So I just left timestamps out of the data sent to InfluxDB; it uses "now" as the default timestamp, which is accurate enough for current data.

If someone wanted to write an importer that takes historic SQL Logger data from Indigo and pushes it to InfluxDB, that would be amazingly useful to add here. Also, this whole thing should be rewritten in Python, but I don't know Python well enough, so this hack will have to do.
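If anyone wants a starting point, something along these lines might work for a one-off historical import. This is completely untested, and the SQL Logger database path, table name, and column names below are guesses that you'd have to adjust to the real schema:

Code: Select all
#!/bin/bash
# Hypothetical sketch: pull historic rows out of the Indigo SQL Logger (SQLite)
# and push them into InfluxDB with their original timestamps attached.
# ASSUMED (verify before use): the database path, a history table named
# "variable_history_12345", and columns "ts" (datetime) and "value".
db="/Library/Application Support/Perceptive Automation/Indigo 6/Logs/indigo_history.sqlite"
measurement="Sensor1temp"

# emit InfluxDB line protocol with an explicit epoch-seconds timestamp per row
sqlite3 -separator ' ' "$db" \
  "SELECT strftime('%s', ts), value FROM variable_history_12345;" |
while read epoch value; do
  echo "$measurement,creator=indigo6.1.4,type=var value=$value $epoch"
done > /tmp/influx-history

# bulk-post the file; precision=s tells InfluxDB the timestamps are in seconds
curl -sS -XPOST 'http://your-influxdb-host:8086/write?db=indigo-data&precision=s&u=indigowrite&p=yourpassword' \
  --data-binary @/tmp/influx-history

Anyway, here's the polling script: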



Code: Select all
#!/bin/bash
#
# (c) 2015 John Todd jtodd@loligo.com
#
# Indigo to InfluxDB API converter
#  v0.1 2015/11/06
#
# This script connects to the Indigo (www.indigodomo.com) home automation
# system and scrapes all the device and variable information and deposits
# it into an InfluxDB using the RESTful APIs of both systems.
#
# You can use your own InfluxDB system, or something like https://customers.influxdb.com/
#
# Once the data is in InfluxDB, you can graph with tools like Grafana
# which will automatically look through InfluxDB tables and have a very
# nice interface.
#
# This script would need to be set up to execute in a crontab of some sort,
# so that it runs every minute.  Something like this:
#   * * * * * /Users/johndoe/indigoinflux.sh
#
# Lots of things are not well-handled here. For one, many of the values
# in devices won't be captured since they're not visible via the API.
# You might have to create variables within Indigo and set the variables
# with the "hidden" device values so that they can be imported.
#
# This imports ALL device values and variables.  If you have a very large
# number, this might be problematic. However, InfluxDB is pretty good at
# handling lots of variables and disk is cheap these days.
#
# If you have device values or variables that are strings, they won't be
# pushed into InfluxDB. If you wish, there are sections in both the device
# and variable section that let you convert an alphanumeric into a number for
# well-known values like "Off" or "activated".
#
# Note: this is an awful hack. I really am not a programmer, and this is bash
# scripting at its most brutal and inelegant.  It was written in about two hours
# after giving up in frustration with other tools, so there has been little
# checking or serious thought given to streamlining.
# Please, for the love of all things good and holy, re-write this into something
# better and re-submit.
#
# What would really be useful would be a script that pulls all the data out
# of the SQL logger and pushes it into InfluxDB with timestamps included...
#
#

# Indigo data below here
user=demo
pass=myindigopasswordhere
site="127.0.0.1"
port="8176"

# InfluxDB data below here
iuser=indigowrite
ipass=blahblahDEblah
idb=indigo-data
isite="the-hostname-I-got-from-influxdb.c.influxdb.com"
iport="8086"


# Misc below here
wgetbin=/usr/local/bin/wget
wgethost="http://$site:$port"
wgetuser="--user=$user --password=$pass "
wgetcmd1="$wgetbin -q $wgetuser $wgethost"
tmpdir=/tmp/$site-$port
curl=/usr/bin/curl
oldIFS=$IFS

# housekeeping
# make sure the directories exist, delete the old cruft
rm -R $tmpdir
mkdir $tmpdir
mkdir $tmpdir/devices
mkdir $tmpdir/variables



# Start with fetching the devices from Indigo.
#
function getdevices {
IFS=$oldIFS
# get the devices file
cd $tmpdir;$wgetcmd1/devices.txt

# first, fetch all the device values from the Indigo server, ensuring
# that no stupid spaces screw things up
IFS=$'\n'

for device in `cat $tmpdir/devices.txt`
  do
     devicename=`echo $device | cut -f 3 -d "/"`
     IFS=$oldIFS
   cd $tmpdir/devices;`$wgetbin -q $wgetuser "$wgethost$device"`
   done

# Next iterate through all those files and search for stuff
#
IFS=$'\n'
for device in `ls $tmpdir/devices`
   do
      rawstate=`grep displayRawState $tmpdir/devices/$device|tr -d " "|cut -f 2 -d :`
      lastchanged=`grep "lastChanged :" $tmpdir/devices/$device|tr -d " "|cut -f 2 -d :`

      if ! [ -z "$rawstate" ];
       then
          # clear up weird characters, spaces in measurement name
          cleandevice=${device//[^a-zA-Z0-9_-]/}
          cleandevice=`echo $cleandevice | rev | cut -c 4- | rev`

                # now, clear up values to be float.
                # If you have special strings that you want to match and
                # set to 0 or 1 or whatever, this is the place to do it.
                # Note that booleans are not supported. Feel free to patch.
          if [ $rawstate == "off" ]; then rawstate="0"; fi
          if [ $rawstate == "inactive" ]; then rawstate="0"; fi
          if [ $rawstate == "disconnected" ]; then rawstate="0"; fi
          if [ $rawstate == "unavailable" ]; then rawstate="0"; fi

          if [ $rawstate == "active" ]; then rawstate="1"; fi
          if [ $rawstate == "on" ]; then rawstate="1"; fi
          if [ $rawstate == "connected" ]; then rawstate="1"; fi
          if [ $rawstate == "Receiving" ]; then rawstate="1"; fi
          if [ $rawstate == "ready" ]; then rawstate="1"; fi


                # now, last-ditch cleanup - make sure that there is at least one number or "." in
                # the value
                # Note, this will fail on values that contain two dots and will blow up the import.
                # like a value that is "3.2.4.2" will make it through to this point
                # Stuff like dates (10/10/2016) or serial numbers will get passed through here
                # and any non-numeric, non "." will be stripped out.  Ugly, very ugly.
                #
                value=`echo $rawstate | grep [0-9.]`
                # OK, strip out anything that isn't 0-9 or "."
                value=${value//[^0-9.]/}

                # only emit the line if a numeric value survived the cleanup
                if ! [ -z "$value" ];
                   then
                    echo $cleandevice",host=$site,creator=indigo6.1.4,type=dev value="$value  >> $tmpdir/influx-out
                   fi
       fi
   done
IFS=$oldIFS
}


# Fetch the variables from Indigo
#
function getvariables {
IFS=$oldIFS
# get the variables file
cd $tmpdir;$wgetcmd1/variables.txt

# first, fetch all the variable values from the Indigo server, ensuring
# that no stupid spaces screw things up
oldIFS=$IFS
IFS=$'\n'

for variable in `cat $tmpdir/variables.txt`
  do
        variablename=`echo $variable | cut -f 3 -d "/"`
        IFS=$oldIFS
        cd $tmpdir/variables;`$wgetbin -q $wgetuser "$wgethost$variable"`
   done

# Next iterate through all those files and search for stuff
#
IFS=$'\n'
for variable in `ls $tmpdir/variables`
   do
        value=`grep value $tmpdir/variables/$variable|tr -d " "|cut -f 2 -d :`

        if ! [ -z "$value" ];
         then
                # clear up weird characters, spaces in measurement name
                cleanvariable=${variable//[^a-zA-Z0-9_-]/}
                cleanvariable=`echo $cleanvariable | rev | cut -c 4- | rev`

                # now, clear up values to be float.
                # If you have special strings that you want to match and
                # set to 0 or 1 or whatever, this is the place to do it.
                # Note that booleans are not supported. Feel free to patch.
                if [ $value == "off" ]; then value="0"; fi
                if [ $value == "on" ]; then value="1"; fi


                # now, last-ditch cleanup - make sure that there is at least one number or "." in
                # the value
                # Note, this will fail on values that contain two dots and will blow up the import.
                # like a value that is "3.2.4.2" will make it through to this point
                # Stuff like dates (10/10/2016) or serial numbers will get passed through here
                # and any non-numeric, non "." will be stripped out.  Ugly, very ugly.
                #
                value=`echo $value | grep [0-9.]`
                # OK, strip out anything that isn't 0-9 or "."
                value=${value//[^0-9.]/}

                if ! [ -z "$value" ];
                   then
                    echo $cleanvariable",host=$site,creator=indigo6.1.4,type=var value="$value  >> $tmpdir/influx-out
                   fi
         fi
   done
 IFS=$oldIFS
 }



# now, run the main routines that call back to our functions, and post to influxdb
getvariables
getdevices

# Post to InfluxDB with the final result file
$curl -v -i -XPOST 'https://'$isite':'$iport'/write?db='$idb'&precision=s&u='$iuser'&p='$ipass'' --data-binary @$tmpdir/influx-out
Attachment: Screen Shot 2015-11-06 at 2.50.11 PM.png

Posted on
Fri Nov 06, 2015 5:28 pm
jay (support) offline
Site Admin
Posts: 18199
Joined: Mar 19, 2008
Location: Austin, Texas

Re: Integration with InfluxDB and Grafana for graphing

You could avoid all those variables by writing your script in Python using Indigo's IOM (which has full access to all device information) - just post the data from the Python script to the InfluxDB. Or make it a plugin.

Just a thought...

Jay (Indigo Support)
Twitter | Facebook | LinkedIn

Posted on
Fri Nov 06, 2015 11:22 pm
kw123 offline
Posts: 8333
Joined: May 12, 2013
Location: Dallas, TX

Re: Integration with InfluxDB and Grafana for graphing

John,

This looks really cool! I would support Jay's suggestion to port this to Indigo/Python; no need for other scripting to extract data from Indigo...

Karl

Posted on
Sat Nov 07, 2015 3:30 am
jtodd offline
Posts: 76
Joined: Apr 15, 2014

Re: Integration with InfluxDB and Grafana for graphing

I suppose we use what we have - I know enough shell scripting to be dangerous, but Python is still not where I'm comfortable. So it's a shell script for now.

If someone is interested, it's pretty trivial to upload data to InfluxDB via the REST API. The payload is just a file full of lines that look like this:

Sensor1temp,host=127.0.0.1,creator=indigo6.1.4,type=dev value=72.0

I added the "host", "creator" and "type" tags just for convenience, to allow separation from other things in that particular InfluxDB database. There can also be a timestamp in UNIX Epoch seconds at the end (separated by a space); if that's not present, InfluxDB uses the timestamp of the import.
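To make that concrete, here's the same kind of line with an explicit timestamp appended, plus the matching POST (hostname and credentials are placeholders):

Code: Select all
Sensor1temp,host=127.0.0.1,creator=indigo6.1.4,type=dev value=71.5 1446854400

curl -XPOST 'http://your-influxdb-host:8086/write?db=indigo-data&precision=s&u=indigowrite&p=yourpassword' --data-binary @influx-out

With precision=s in the URL, the trailing number is read as UNIX Epoch seconds; leave it off and InfluxDB stamps the point with the time of the write.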

JT

Posted on
Sat Nov 07, 2015 5:22 am
DaveL17 offline
Posts: 6741
Joined: Aug 20, 2013
Location: Chicago, IL, USA

Re: Integration with InfluxDB and Grafana for graphing

Nice looking chart! I use Gnuplot outside of Karl's plugin (not a dig against IndigoPlotD) because I want more granular control than Karl can possibly provide. Anyway, my method is to have a schedule that writes selected Indigo data to a flat file on the server, generally once every 15 minutes (I have a few other frequencies too). Then the schedule runs a series of Gnuplot scripts to update the charts. I've been extremely satisfied. Here's a sample:

Attachment: temperaturesOutdoor.png
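The Gnuplot side is less work than it sounds. A stripped-down example (not one of my actual scripts - the paths are made up, and it assumes the flat file has epoch seconds in column 1 and a temperature in column 2) looks something like this:

Code: Select all
#!/bin/bash
# hypothetical example: render one chart from a two-column flat file
gnuplot <<'EOF'
set terminal png size 800,300
set output '/Users/me/controlpages/temperaturesOutdoor.png'
set xdata time
set timefmt "%s"          # column 1 is UNIX epoch seconds
set format x "%H:%M"
set title "Outdoor Temperatures"
plot '/Users/me/data/temperaturesOutdoor.dat' using 1:2 with lines title "Temp"
EOF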

I do the rest of my labeling in Indigo's control page editor. I have some scripts over in my forums, which I'm about to update with a little bit more functionality.

Good luck and nice job finding a solution that fits your needs.
Dave

I came here to drink milk and kick ass....and I've just finished my milk.

[My Plugins] - [My Forums]

Posted on
Thu Mar 03, 2016 4:21 pm
jtodd offline
Posts: 76
Joined: Apr 15, 2014

Re: Integration with InfluxDB and Grafana for graphing

I've been using this now for a few months, and it's quite successful... when it works.

I did have to edit the script a bit more to look for text values and change them into numeric values, modifying this block of the script and adding a few additional elements:

if [ $rawstate == "off" ]; then rawstate="0"; fi
if [ $rawstate == "inactive" ]; then rawstate="0"; fi
if [ $rawstate == "disconnected" ]; then rawstate="0"; fi
if [ $rawstate == "Connecting" ]; then rawstate="0"; fi
if [ $rawstate == "unavailable" ]; then rawstate="0"; fi
if [ $rawstate == "passive" ]; then rawstate="0"; fi
if [ $rawstate == "false" ]; then rawstate="0"; fi

if [ $rawstate == "active" ]; then rawstate="1"; fi
if [ $rawstate == "on" ]; then rawstate="1"; fi
if [ $rawstate == "connected" ]; then rawstate="1"; fi
if [ $rawstate == "Receiving" ]; then rawstate="1"; fi
if [ $rawstate == "ready" ]; then rawstate="1"; fi
if [ $rawstate == "true" ]; then rawstate="1"; fi


I find that Python crashes quite frequently (every two or three days?) and then the REST API totally fails until I restart Indigo. I get a modal dialog saying "Python has unexpectedly quit {Ignore blah blah}", and once that shows up the script doesn't work. This seems to be a bug in Indigo that I'm tickling with the quantity of results I'm requesting via the HTTP interface. Has anyone built a Python requestor for this yet that I can drop in place of my shell scripts that use wget?

JT

Posted on
Thu Mar 03, 2016 6:29 pm
jay (support) offline
Site Admin
Posts: 18199
Joined: Mar 19, 2008
Location: Austin, Texas

Re: Integration with InfluxDB and Grafana for graphing

What is the rate of your requests?

Jay (Indigo Support)
Twitter | Facebook | LinkedIn

Posted on
Thu Mar 03, 2016 7:06 pm
jtodd offline
Posts: 76
Joined: Apr 15, 2014

Re: Integration with InfluxDB and Grafana for graphing

I execute the script above once every minute. It works correctly for a semi-random period of time, between 2 and 30 days, but then crashes with the "Python has quit unexpectedly" error.

JT

Posted on
Fri Mar 04, 2016 10:33 am
jay (support) offline
Site Admin
Posts: 18199
Joined: Mar 19, 2008
Location: Austin, Texas

Re: Integration with InfluxDB and Grafana for graphing

How many API requests are made in the script?

Jay (Indigo Support)
Twitter | Facebook | LinkedIn

Posted on
Fri Mar 04, 2016 10:37 am
jay (support) offline
Site Admin
Posts: 18199
Joined: Mar 19, 2008
Location: Austin, Texas

Re: Integration with InfluxDB and Grafana for graphing

And, do you have other similar scripts hitting the RESTful API at the same time - maybe only periodically hitting at the same time?

Jay (Indigo Support)
Twitter | Facebook | LinkedIn

Posted on
Fri Mar 04, 2016 11:52 am
jtodd offline
Posts: 76
Joined: Apr 15, 2014

Re: Integration with InfluxDB and Grafana for graphing

No, that's the only thing I have hitting the RESTful API. I do, however, have a fairly large number of variables (93) being updated through the Python API every 5 minutes.

I call that routine like this:
/Library/Application\ Support/Perceptive\ Automation/Indigo\ 6/IndigoPluginHost.app/Contents/MacOS/IndigoPluginHost -x $outfile

JT

Posted on
Mon Mar 07, 2016 1:59 am
jtodd offline
Posts: 76
Joined: Apr 15, 2014

Re: Integration with InfluxDB and Grafana for graphing

Are the Python API and the RESTful API colliding in some way?

Posted on
Mon Mar 07, 2016 12:14 pm
jay (support) offline
Site Admin
Posts: 18199
Joined: Mar 19, 2008
Location: Austin, Texas

Re: Integration with InfluxDB and Grafana for graphing

Not sure what you mean...

Jay (Indigo Support)
Twitter | Facebook | LinkedIn

Posted on
Mon Mar 07, 2016 12:40 pm
jtodd offline
Posts: 76
Joined: Apr 15, 2014

Re: Integration with InfluxDB and Grafana for graphing

Is there a problem with me using the APIs (REST and Python command line) at the same time? Would you expect a Python crash under any of those circumstances?

Is there anything I can provide during a crash event that would help you diagnose?

JT

Posted on
Tue Sep 06, 2016 12:31 pm
jtodd offline
Posts: 76
Joined: Apr 15, 2014

Re: Integration with InfluxDB and Grafana for graphing

Updates to this shell script: changed from wget to curl, fixed some bugs, and put in comments about ramdisks.

I would have added this as an attachment, but annoyingly this site doesn't allow attachments. So here it is in text form. I'm sure I've missed some punctuation that has been mangled by the editor...

JT

Code: Select all
#!/bin/bash
#
# (c) 2015-2016 John Todd jtodd@loligo.com
#
# Indigo to InfluxDB API converter
#  v0.1 2015/11/06 - first release
#  v0.2 2016/09/06 - changed from wget to curl for portability
#
# This script connects to the Indigo (www.indigodomo.com) home automation
# system and scrapes all the device and variable information and deposits
# it into an InfluxDB using the RESTful APIs of both systems.
#
# You can use your own InfluxDB system, or something like https://customers.influxdb.com/
#
# Once the data is in InfluxDB, you can graph with tools like Grafana
# which will automatically look through InfluxDB tables and have a very
# nice interface.
#
# This script would need to be set up to execute in a crontab of some sort,
# so that it runs every minute.  Something like this:
#   * * * * * /Users/johndoe/indigoinflux.sh
#
# Alternately, I just use Indigo's scheduling feature to execute once a minute,
# since if Indigo isn't running then it's kind of meaningless to run this
# script, right? Choose whatever method you like.
#
# Lots of things are not well-handled here. For one, many of the values
# in devices won't be captured since they're not visible via the API.
# You might have to create variables within Indigo and set the variables
# with the "hidden" device values so that they can be imported.
#
# This imports ALL device values and variables.  If you have a very large
# number, this might be problematic. However, InfluxDB is pretty good at
# handling lots of variables and disk is cheap these days.
#
# If you have device values or variables that are strings, they won't be
# pushed into InfluxDB. If you wish, there are sections in both the device
# and variable section that let you convert an alphanumeric into a number for
# well-known values like "Off" or "activated".
#
# I had some problems with Python crashing on older versions of MacOS (10.5)
# but it would be infrequent, and a restart of Indigo would fix it. We'll see
# how this works with 10.11 (El Capitan)
#
# Note: this is an awful hack. I really am not a programmer, and this is bash
# scripting at its most brutal and inelegant.  It was written in about two hours
# after giving up in frustration with other tools, so there has been little
# checking or serious thought given to streamlining.
# Please, for the love of all things good and holy, re-write this into something
# better and re-submit.
#
# What would really be useful would be a script that pulls all the data out
# of the SQL logger and pushes it into InfluxDB with timestamps included...
#
#

# Indigo data below here
user=demo
pass=notmyrealindigopasswordhere
site="127.0.0.1"
port="8176"

# InfluxDB data below here
iuser=indigodatauser
ipass=anotherboguspassword
idb=indigodata
isite="192.168.1.149"
iport="8086"


# Misc below here
curlbin=/usr/bin/curl
curlhost="http://$site:$port"
curluser="-u $user:$pass "
curlcmd1="$curlbin --digest -sS $curluser $curlhost"
#
# I create a ramdisk at boot, so I'm not paving my disks with temp
#  data all the time. Also, much faster for access. To do this,
#  make a file:
#   /Library/LaunchAgents/com.user.makeramdisk.plist
#  and put the following XML in it. Make sure it's owned by root/wheel.
#
# <?xml version="1.0" encoding="UTF-8"?>
# <!DOCTYPE plist PUBLIC "-//Apple Computer//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
# <plist version="1.0">
# <dict>
#   <key>Label</key>
#   <string>com.user.makeramdisk</string>
#   <key>Program</key>
#   <string>/usr/local/bin/makeramdisk.sh</string>
#   <key>RunAtLoad</key>
#   <true/>
# </dict>
# </plist>
#
# Then create a script in /usr/local/bin/makeramdisk.sh that contains this:
#
# #!/bin/sh
# # make a 32m ramdisk for scratch storage
# diskutil erasevolume HFS+ 'ramdisk' `hdiutil attach -nomount ram://65430`
#
# ...and you're good to go!
#
tmpdir=/Volumes/ramdisk/$site-$port
oldIFS=$IFS

# housekeeping
# make sure the directories exist, delete the old cruft
rm -R $tmpdir
mkdir $tmpdir
mkdir $tmpdir/devices
mkdir $tmpdir/variables



# Start with fetching the devices from Indigo. 
#
function getdevices {
IFS=$oldIFS
# get the devices file
cd $tmpdir;$curlcmd1/devices.txt > devices.txt

# first, fetch all the device values from the Indigo server, ensuring
# that no stupid spaces screw things up
IFS=$'\n'

for device in `cat $tmpdir/devices.txt`
  do   
     devicename=`echo $device | cut -f 3 -d "/"`
     IFS=$oldIFS
   cd $tmpdir/devices; $curlbin --digest -sS $curluser $curlhost"$device" > $tmpdir/"$device"
  done

# Next iterate through all those files and search for stuff
#
IFS=$'\n'
for device in `ls $tmpdir/devices`
   do   
      rawstate=`grep displayRawState $tmpdir/devices/$device|tr -d " "|cut -f 2 -d :`
      lastchanged=`grep "lastChanged :" $tmpdir/devices/$device|tr -d " "|cut -f 2 -d :`
      
      if ! [ -z "$rawstate" ];
       then
          # clear up weird characters, spaces in measurement name
          cleandevice=${device//[^a-zA-Z0-9_-]/}
          cleandevice=`echo $cleandevice | rev | cut -c 4- | rev`
          
                # now, clear up values to be float.
                # If you have special strings that you want to match and
                # set to 0 or 1 or whatever, this is the place to do it.
                # Note that booleans are not supported. Feel free to patch.
          if [ $rawstate == "off" ]; then rawstate="0"; fi
          if [ $rawstate == "inactive" ]; then rawstate="0"; fi
          if [ $rawstate == "disconnected" ]; then rawstate="0"; fi
          if [ $rawstate == "Connecting" ]; then rawstate="0"; fi
          if [ $rawstate == "unavailable" ]; then rawstate="0"; fi
          if [ $rawstate == "passive" ]; then rawstate="0"; fi
          if [ $rawstate == "false" ]; then rawstate="0"; fi
          if [ $rawstate == "Error" ]; then rawstate="0"; fi
          if [ $rawstate == "error" ]; then rawstate="0"; fi

          if [ $rawstate == "active" ]; then rawstate="1"; fi
          if [ $rawstate == "on" ]; then rawstate="1"; fi
          if [ $rawstate == "connected" ]; then rawstate="1"; fi
          if [ $rawstate == "Receiving" ]; then rawstate="1"; fi
          if [ $rawstate == "ready" ]; then rawstate="1"; fi
          if [ $rawstate == "true" ]; then rawstate="1"; fi
          
                # now, last-ditch cleanup - make sure that there is at least one number or "." in
                # the value
                # Note, this will fail on values that contain two dots and will blow up the import.
                # like a value that is "3.2.4.2" will make it through to this point
                # Stuff like dates (10/10/2016) or serial numbers will get passed through here
                # and any non-numeric, non "." will be stripped out.  Ugly, very ugly.
                #
                value=`echo $rawstate | grep [0-9.]`
                # OK, strip out anything that isn't 0-9 or "."
                value=${value//[^0-9.]/}

                # only emit the line if a numeric value survived the cleanup
                if ! [ -z "$value" ];
                   then
                    echo $cleandevice",host=$site,creator=indigo6.1.10,type=dev value="$value  >> $tmpdir/influx-out
                   fi
       fi   
   done
IFS=$oldIFS
}


# Fetch the variables from Indigo
#
function getvariables {
IFS=$oldIFS
# get the variables file
cd $tmpdir;$curlcmd1/variables.txt > variables.txt

# first, fetch all the variable values from the Indigo server, ensuring
# that no stupid spaces screw things up
oldIFS=$IFS
IFS=$'\n'

for variable in `cat $tmpdir/variables.txt`
  do
        variablename=`echo $variable | cut -f 3 -d "/"`
        IFS=$oldIFS
        cd $tmpdir/variables; $curlbin --digest -sS $curluser $curlhost"$variable" > $tmpdir/"$variable"
   done

# Next iterate through all those files and search for stuff
#
IFS=$'\n'
for variable in `ls $tmpdir/variables`
   do
        value=`grep value $tmpdir/variables/$variable|tr -d " "|cut -f 2 -d :`

        if ! [ -z "$value" ];
         then
                # clear up weird characters, spaces in measurement name
                cleanvariable=${variable//[^a-zA-Z0-9_-]/}
                cleanvariable=`echo $cleanvariable | rev | cut -c 4- | rev`

                # now, clear up values to be float.
                # If you have special strings that you want to match and
                # set to 0 or 1 or whatever, this is the place to do it.
                # Note that booleans are not supported. Feel free to patch.
                if [ $value == "off" ]; then value="0"; fi
                if [ $value == "on" ]; then value="1"; fi
               
                if [ $value == "true" ]; then value="1"; fi
                if [ $value == "false" ]; then value="0"; fi


                # now, last-ditch cleanup - make sure that there is at least one number or "." in
                # the value
                # Note, this will fail on values that contain two dots and will blow up the import.
                # like a value that is "3.2.4.2" will make it through to this point
                # Stuff like dates (10/10/2016) or serial numbers will get passed through here
                # and any non-numeric, non "." will be stripped out.  Ugly, very ugly.
                #
                value=`echo $value | grep [0-9.]`
                # OK, strip out anything that isn't 0-9 or "."
                value=${value//[^0-9.]/}

                if ! [ -z "$value" ];
                   then
                    echo $cleanvariable",host=$site,creator=indigo6.1.10,type=var value="$value  >> $tmpdir/influx-out
                   fi
         fi
   done 
 IFS=$oldIFS
 } 
   
          
   
# now, run the main routines that call back to our functions, and post to influxdb
getvariables
getdevices

# Post to InfluxDB with the final result file

#echo "This is the path: $tmpdir/influx-out"

$curlbin -Ss -v -i -XPOST 'http://'$isite':'$iport'/write?db='$idb'&precision=s&u='$iuser'&p='$ipass'' --data-binary @$tmpdir/influx-out
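
If you want to sanity-check that points are landing, you can query them back over the same HTTP API (the measurement name here is just an example from earlier in this thread):

Code: Select all
curl -G 'http://192.168.1.149:8086/query' \
  --data-urlencode "db=indigodata" \
  --data-urlencode "u=indigodatauser" --data-urlencode "p=anotherboguspassword" \
  --data-urlencode "q=SELECT * FROM \"Sensor1temp\" LIMIT 5"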

