Migrate SQLite to PostgreSQL

Posted on
Tue Mar 24, 2015 8:40 pm
kw123 offline
Posts: 6817
Joined: May 12, 2013
Location: Dallas, TX

Migrate SQLite to PostgreSQL

SQLite to PostgreSQL migration of a larger database. Assume you have a large SQL database with > 1 year of data and would not like to lose it, as happens when you simply switch from SQLite to PostgreSQL in INDIGO.

Some comments and observations:

In INDIGO,
1. the type definitions of the columns in SQLite and PostgreSQL are NOT the same
2. the column sequences in SQLite and PostgreSQL are NOT the same

As a consequence, one cannot simply extract the SQLite database and upload it into the PostgreSQL database.

Here is the path I am taking:

0. install PostgreSQL
1. in INDIGO, in the SQL Logger config, enable PostgreSQL for a short period of time, until INDIGO has defined all existing devices, variables, keys etc. Then switch back to SQLite.
2. copy the SQLite database to a temp directory
3. create a .dump file from the SQLite file
4. extract the column definitions and sequence of both database structures (SQLite and PostgreSQL)
5. reformat the SQL dump output to a format that PostgreSQL understands, using the proper definitions and mappings
6. import the data into PostgreSQL: first delete all entries (structure still there, but empty tables), then insert the data, then commit.
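Point 1 above (differing type definitions) can be handled with an explicit mapping table. A minimal sketch, assuming a type set of text/int/real/bool/timestamp; the exact set of types INDIGO emits is an assumption:

```python
# Hypothetical SQLite -> PostgreSQL type map; extend as needed.
SQLITE_TO_POSTGRES = {
    "INTEGER": "integer",
    "REAL": "real",
    "TEXT": "text",
    "BOOL": "boolean",
    "TIMESTAMP": "timestamp without time zone",
}

def map_type(sqlite_type):
    # normalize case, fall back to text for anything unrecognized
    return SQLITE_TO_POSTGRES.get(sqlite_type.strip().upper(), "text")
```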

for a 2.6 GByte, ~13 months database this process takes ~45 minutes on a Mac Pro (2008)

After that, all data is in the new PostgreSQL database and can be queried, delivering the SAME results as the SQLite database.

That's the good news.

Now the problem: when switching to PostgreSQL in the INDIGO logger, I thought INDIGO would just pick it up and run with the database. It was set up by INDIGO, populated with the right field formats ..

But for each device entry we get the following error message:
Code:
  SQL Logger Error                INSERT INTO device_history_273552650 ("onoffstate") VALUES (%s);
  SQL Logger Error                exception trying to insert row into table device_history_1043783476 ("backWindowSensor2") -- unable to modify table
  SQL Logger Error                ERROR:  duplicate key value violates unique constraint "device_history_1043783476_pkey"

What is missing, a declaration of some keys? Are the SQL statements correct?
I thought that by letting INDIGO create the database it would be set up correctly.

any pointer is appreciated.

Karl

here is the code that does it all:
Code:
# Karl Wachs
# 2015/3/23
#  migrate SQLite to Postgres
#

import StringIO
import csv
import time
import os, pwd, subprocess

timeStart = time.time()
userName   =   pwd.getpwuid( os.getuid() )[ 0 ]


POSTGRESuid      = "postgres"
TEMPPATH      = "/Users/"+userName+"/Documents/postgres/"
POSTGRESPATH   = "/Library/PostgreSQL/bin/"
SQLITEPATH      = "/usr/bin/"
INDIGOPATHlog   = "/Library/Application Support/Perceptive Automation/Indigo 6/Logs/"
DUMPFILE      = "indigo_history.dump"
HISTfile      = "indigo_history.sqlite"
POSTGRESdb      = "indigo_history"


step0 =True      # make copy
step1 =True      # make dump file
step2 =True      # format from dump to postgres import
step3 =False   # make postgres indigo db
step4 =True      # import into postgres


# static commands
cmd0   ="cp '"+INDIGOPATHlog+HISTfile+"'  '"+ TEMPPATH+HISTfile+"'"            ## make copy
cmd1   =SQLITEPATH+"sqlite3 "+ TEMPPATH+HISTfile+"  .dump  > "+ TEMPPATH+DUMPFILE   ## create dump file from sqlite
cmd3   =POSTGRESPATH+"createdb -U "+POSTGRESuid+" "+POSTGRESdb                  ## create POSTGRES indigo DB


try:
   ret = os.makedirs(TEMPPATH )  # make the data dir if it does not exist yet
except:
   pass

if step0:
   print "cmd: ",cmd0
   os.system(cmd0)
   print " copy SQLite file ... done"
   print "seconds  since start:", time.time() - timeStart

if step1:
   print "cmd: ",cmd1
   os.system(cmd1)
   print "SQLITE step 1 done"
   print "seconds  since start:", time.time() - timeStart


if step2:
   print " formatting commands from SQLite DUMP to postgres IMPORT "
   f=open(TEMPPATH+DUMPFILE,"r")
   nLine=0
   for line in f:
      nLine+=1
      line=line.strip("\n")
      if len(line) < 3: continue  # no junk lines
      if line.find("BEGIN TRANSACTION;")> -1:
         continue
      if line.find("CREATE")> -1:
         try:
            g.write("COMMIT;\n")
            g.close()
         except:
            pass
         

         fname= line.split(" (")[0].split("TABLE ")[1].strip()
         print "formatting table: ", fname
         g=open(TEMPPATH+"postgresimport-"+fname,"w")
         g.write("BEGIN TRANSACTION;\n")                # first line in sql postgre file
         g.write("DELETE FROM "+fname+" WHERE ID >0;\n")    # reset id's remove all previous entries
         g.write("COMMIT;\n")
         g.write("BEGIN TRANSACTION;\n")                # start new block
         
         
         # now replace create command strings from sqlite to postgre syntax

         fieldD1 = line.strip(");").split("(")[1].split(",")
         fieldDefs1=[[0,"id", "integer not null default nextval('"+fname+"_id_seq'::regclass)"]]
         fieldDefs1.append([1,"ts", "timestamp without time zone default now()"])
         nFields1= len(fieldD1)
         for n in range(2,nFields1):
            b = fieldD1[n].split()
            b.insert(0,n)
            fieldDefs1.append(b)
      
         fieldDefs2=[[0,"id", "integer not null default nextval('"+fname+"_id_seq'::regclass)"]]
         fieldDefs2.append([1,"ts", "timestamp without time zone default now()"])
         cmd4=POSTGRESPATH+"psql -U "+POSTGRESuid+" "+POSTGRESdb+" -t -A -F ' ' -c  \"\\d "+ fname+";\""
         p=subprocess.Popen(cmd4,shell=True,stdout=subprocess.PIPE,stderr=subprocess.PIPE)
         defs =p.communicate()[0]
         defs=unicode(defs).split("\n")
         n=-1
         for def2 in defs:
            n+=1
            if n<2: continue
            def2 = def2.strip("\n")
            if def2.find("id ") ==0: continue
            if def2.find("ts ") ==0: continue
            if def2.find("None")==0: continue
            fields2= def2.split()
            if len(fields2) < 2: continue
            fieldDefs2.append([n,fields2[0],fields2[1].upper()])
         nFields2= len(fieldDefs2)
         
         # now combine, SQLite and POSTGRES columns dont have to be in the same sequence, fix with index pointer [0]
         for k in range(2,nFields2):
            for l in range(2,nFields1):
               if fieldDefs1[l][1].strip('"').strip("'") == fieldDefs2[k][1].strip('"').strip("'"):      # same quantity?, remove extra ' or " for comparison
                  fieldDefs2[k][0] = l                                                   # set index pointer
                  break
         

         continue
         
      if line.find("INSERT")==0:
         if nFields1 >0 :
            xxx = line.strip(");").split(" VALUES(")
            if len(xxx)<2: continue
            if len(xxx[1])<2: continue
            sio = StringIO.StringIO( xxx[1] )      ### parse the VALUES(...) part as CSV; don't rebind f, it is the dump file handle
            r = csv.reader( sio , delimiter=',', quotechar="'")
            fieldsinLine = r.next()
            nFields3 = len(fieldsinLine)
            out = fieldsinLine[0] +",'"+fieldsinLine[1]+"'"
            for k in range(2,nFields2):
               l = fieldDefs2[k][0]
               aaa = fieldsinLine[l].strip("'")
               if fieldDefs2[k][2] =="TEXT":
                  aaa="'"+aaa+"'"
               elif aaa =="": aaa="NULL"
               out +=","+aaa
            line=xxx[0]+" VALUES("+out+");"  # compose the line again

         
         g.write(line+"\n")
         if nLine%100000 ==0: print "finished line: ",nLine, "; seconds since start:", time.time() - timeStart
         continue

   print nLine

   f.close()
   try:
      g.write("COMMIT;\n")
      g.close()
   except:
      pass
   print "seconds  since start:", time.time() - timeStart

if step3:
   print "cmd: ", cmd3
   os.system(cmd3)
   print "seconds  since start:", time.time() - timeStart


if step4:
   postgresFiles=[]  ## make import file list
   dList = os.listdir(TEMPPATH)
   for ld in range(len(dList)):
      if dList[ld].find("postgresimport-")>-1: postgresFiles.append(dList[ld])
   
   for ll in range(len(postgresFiles)):   # now import each device / variable table one by one
      cmd4=POSTGRESPATH+"psql -U "+POSTGRESuid+" "+POSTGRESdb+" < "+TEMPPATH+postgresFiles[ll]+" >"+ TEMPPATH+postgresFiles[ll]+".log  2>" + TEMPPATH+postgresFiles[ll]+"err.log"
      print ll,"out of ",len(postgresFiles), "; seconds since start:", time.time() - timeStart,";  .. now working on ",postgresFiles[ll]
      retcode= os.system(cmd4)
      if retcode >0:
         print "return code from sql > 0, stop check file: ",postgresFiles[ll]+"err.log"
#         exit()
      else:
         try:
            os.remove(TEMPPATH+postgresFiles[ll]+".log")
         except:
            pass
         try:
            os.remove(TEMPPATH+postgresFiles[ll]+"err.log")
         except:
            pass

print "seconds  since start:", time.time() - timeStart

print "done"


and here a SHORT example SQL upload file:
Code:
BEGIN TRANSACTION;
DELETE FROM device_history_39256487 WHERE ID >0;
COMMIT;
BEGIN TRANSACTION;
INSERT INTO "device_history_39256487" VALUES(1,'2014-05-14 21:24:54',0,'NULL',0,'NULL',0,'NULL');
INSERT INTO "device_history_39256487" VALUES(2,'2014-05-14 21:24:55',0,'0 seconds',0.807,'0.807 kWh',0,'NULL');
INSERT INTO "device_history_39256487" VALUES(3,'2014-05-14 21:24:55',0,'0 seconds',0.807,'0.807 kWh',852.458,'852.458 W');
INSERT INTO "device_history_39256487" VALUES(4,'2014-05-14 21:27:47',0,'0 seconds',0.848,'0.848 kWh',852.458,'852.458 W');
INSERT INTO "device_history_39256487" VALUES(5,'2014-05-14 21:27:47',0,'0 seconds',0.848,'0.848 kWh',1.888,'1.888 W');
INSERT INTO "device_history_39256487" VALUES(6,'2014-05-14 21:28:08',0,'0 seconds',0.86,'0.860 kWh',1.888,'1.888 W');
COMMIT;

Posted on
Tue Mar 24, 2015 9:20 pm
RogueProeliator offline
Posts: 2303
Joined: Nov 13, 2012
Location: Baton Rouge, LA

Re: Migrate SQLite to PostGreSQL

What is missing, declaration of some keys, are the sql statements correct?

It looks to me like when you were inserting the values into the Postgres database, you inserted the ID column as well; these should be auto-numbering sequences, so essentially you have used up the numbers that the sequence expects to use next. Try resetting the sequence as such:

Code:
SELECT pg_catalog.setval(pg_get_serial_sequence('your_history_table_name', 'id'), (SELECT MAX(id) FROM your_history_table_name)+1);


I think that is the right syntax... note that the MAX(id) query may fail for empty tables; not positive, but I think it would return NULL.
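Building on Adam's suggestion, the empty-table case can be guarded with COALESCE; a sketch that just assembles the statement (the table name is a placeholder, and `setval(..., n, false)` makes the next `nextval()` return n):

```python
def reset_sequence_sql(table):
    # COALESCE covers the empty-table case where MAX(id) is NULL;
    # with is_called=false the sequence's next value is exactly max+1
    return ("SELECT pg_catalog.setval(pg_get_serial_sequence('%s', 'id'), "
            "COALESCE((SELECT MAX(id) FROM %s), 0) + 1, false);" % (table, table))
```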

Adam

Posted on
Wed Mar 25, 2015 10:46 am
kw123 offline
Posts: 6817
Joined: May 12, 2013
Location: Dallas, TX

Re: Migrate SQLite to PostGreSQL

Adam,
thanks for the help.

Just tried this:
Code:
INSERT INTO device_history_39256487 (ts,accumenergytotal,curenergylevel, ...) VALUES (timestamp '2015-01-02 15:22:00',55,33, ...);

that inserts a new record with a new ID (max+1) and the old timestamp, as requested
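The pattern, naming every column except id so the sequence assigns it, can be sketched as a statement builder (table and column names are just examples):

```python
def insert_without_id(table, columns):
    # list every column except id; the Postgres sequence then assigns the id
    placeholders = ", ".join(["%s"] * len(columns))
    return "INSERT INTO %s (%s) VALUES (%s);" % (table, ", ".join(columns), placeholders)
```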

That seems to work! INDIGO is running happily with it!! :D :D :D

just one last warning from indigo when starting:
Code:
  SQL Logger                      connected to indigo_history as postgres on 127.0.0.1
  SQL Logger                      searching for unused device and variable tables
  SQL Logger                      search complete (none found)
  SQL Logger                      updating all device tables
  SQL Logger                      modifying column "maxtemp7thday" type of table device_history_120648462 ("NoaPlusLoveField") from real to TEXT
  SQL Logger                      modifying column "mintemp6thnight" type of table device_history_120648462 ("NoaPlusLoveField") from real to TEXT
  SQL Logger                      modifying column "probofprecip5thnight" type of table device_history_120648462 ("NoaPlusLoveField") from integer to REAL
  SQL Logger                      modifying column "probofprecip6thnight" type of table device_history_120648462 ("NoaPlusLoveField") from real to TEXT
  SQL Logger                      modifying column "probofprecip7thday" type of table device_history_120648462 ("NoaPlusLoveField") from real to TEXT
  SQL Logger                      updating all variable tables
  SQL Logger                      update complete

It looks as if the type definitions are not well defined. The definitions were initially created by INDIGO; now it is changing them.
And the probofprecip5thnight column should have the same data definition as column probofprecip6thnight in the NOAA data.
But it is self correcting. :)

will fine-tune the script and then publish it, and will do the reverse too (POSTGRES --> SQLITE)


Karl

Posted on
Thu Mar 26, 2015 7:35 pm
kw123 offline
Posts: 6817
Joined: May 12, 2013
Location: Dallas, TX

Re: Migrate SQLite to PostGreSQL

there seems to be a difference between the SQLite and POSTGRES data structures

Both were set up by INDIGO in the SQL Logger:
SQLite has more columns for the same device than POSTGRESQL

Code:
psql ...  "\d device_history_1427925925"
[
[0, u'id', u"integer not null default nextval('device_history_1427925925_id_seq'::regclass)"],
[1, u'ts', u'timestamp without time zone default now()'],
[2, u'display', u'TEXT'],
[3, u'humidity', u'INTEGER'],
[4, u'lastupdated', u'TEXT'],
[5, u'maxhumidity', u'INTEGER'],
[6, u'maxtemperature', u'TEXT'],
7 not there
[8, u'minhumidity', u'INTEGER'],
[9, u'mintemperature', u'TEXT'],
10-12 not there
[13, u'batterylevel', u'INTEGER'],
[14, u'humiditystatus', u'TEXT'],
[15, u'signalstrength', u'INTEGER'],
[11, u'temperature', u'TEXT'],
[16, u'type', u'INTEGER']
17-18 not there
]


SQLITE3 ...  .schema   device_history_1427925925
[
[0, u'id', u'INTEGER PRIMARY KEY'],
[1, u'ts', u'TIMESTAMP DEFAULT CURRENT_TIMESTAMP'],
[2, u'display', u'TEXT'],
[3, u'humidity', u'INTEGER'],
[4, u'lastupdated', u'TEXT'],
[5, u'maxhumidity', u'INTEGER'],
[6, u'maxtemperature', u'INTEGER'],
[7, u'message', u'TEXT'],
[8, u'minhumidity', u'INTEGER'],
[9, u'mintemperature', u'INTEGER'],
[10, u'resetdayvalue', u'INTEGER'],
[11, u'temperature', u'INTEGER'],
[12, u'voltage', u'INTEGER'],
[13, u'batterylevel', u'INTEGER'],
[14, u'humiditystatus', u'TEXT'],
[15, u'signalstrength', u'INTEGER'],
[16, u'type', u'INTEGER'],
[17, u'brightnesslevel', u'INTEGER'],
[18, u'onoffstate', u'BOOL']]   


any idea how to handle this when migrating from SQLite to POSTGRES .. drop the SQLite-only fields or add them to POSTGRES?
The fields are all empty in SQLite. This is an RFXCOM thermometer and should not have "brightness" anyway, see screen shot.
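One way to see exactly which columns differ is a set comparison on the column names; a sketch using the two listings above:

```python
postgres_cols = {"id", "ts", "display", "humidity", "lastupdated", "maxhumidity",
                 "maxtemperature", "minhumidity", "mintemperature", "batterylevel",
                 "humiditystatus", "signalstrength", "temperature", "type"}
sqlite_cols = {"id", "ts", "display", "humidity", "lastupdated", "maxhumidity",
               "maxtemperature", "message", "minhumidity", "mintemperature",
               "resetdayvalue", "temperature", "voltage", "batterylevel",
               "humiditystatus", "signalstrength", "type", "brightnesslevel",
               "onoffstate"}
# columns that exist only in the SQLite table (old, now-unused device states)
sqlite_only = sorted(sqlite_cols - postgres_cols)
```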

the same result is also visible with interactive apps.

Karl
Attachments
Screen Shot 2015-03-26 at 8.31.17 PM.png

Posted on
Fri Mar 27, 2015 5:08 pm
matt (support) offline
Site Admin
Posts: 19724
Joined: Jan 27, 2003
Location: Texas

Re: Migrate SQLite to PostGreSQL

I'd drop those fields since they shouldn't be there. I think you can get into this state if the device type is changed. I suspect your SQLite database was created, then the device type (or something?) changed, which caused the extra columns; later you created the PostgreSQL database and it only used the active/current states. That is, if my memory is correct, once a state (column) is added to a table, Indigo never tries to remove it since that would drop all the data for that column. You can manually drop those columns, though, if you want.
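In a migration script, this advice amounts to carrying over only the columns that exist on both sides; a sketch (the list contents are examples from the schemas above):

```python
def migratable_columns(sqlite_cols, postgres_cols):
    # keep SQLite columns in their original order, but drop any
    # leftover columns (old device states) the Postgres table lacks
    pg = set(postgres_cols)
    return [c for c in sqlite_cols if c in pg]
```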


Posted on
Sat Mar 28, 2015 12:44 am
kw123 offline
Posts: 6817
Joined: May 12, 2013
Location: Dallas, TX

Re: Migrate SQLite to PostGreSQL

Matt,
yes, SQLITE is ~1 year of data; all kinds of things happened during the year

one more question:
can I switch between SQLITE and POSTGRES from within a plugin or applescript?


Current status: 13+ months, 2.6 GByte SQLite --> POSTGRES conversion takes ~1 hour. 9x% is fully automatic, with a data gap of < 5 minutes between the SQLite and POSTGRES databases.


here are the steps:

1. manually switch from SQLITE to POSTGRES, let INDIGO create the database
2. switch back to SQLite after POSTGRES is initialized (~ 10 seconds)
3. copy indigo_history.sqlite to a temp directory
4. extract the table definitions from the SQLite copy and the Postgres database
5. run a select on SQLite and format it for import into Postgres (with the new POSTGRES table definitions; that's why we need #1)
6. drop the Postgres INDIGO db
7. create the Postgres INDIGO DB (for a clean start)
8. import the formatted Postgres file (~ 55 minutes up to here)
9. select data from the production SQLite with ID > the last one in the SQLite copy DB, to catch entries from the last 50 minutes (~ 1 minute)
10. reformat to Postgres (~ 1 minute)
11. import into Postgres (~ 1 minute)
12. in INDIGO switch from SQLite to Postgres.
done
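Step 9, the catch-up pass, boils down to a single query against the live database using the highest id already migrated. A sketch with an in-memory stand-in (table name and data are hypothetical):

```python
import sqlite3

live = sqlite3.connect(":memory:")  # stand-in for the production indigo_history.sqlite
live.execute("CREATE TABLE device_history_demo (id INTEGER PRIMARY KEY, ts TEXT, val REAL)")
live.executemany("INSERT INTO device_history_demo (ts, val) VALUES (?, ?)",
                 [("2015-03-28 00:%02d:00" % i, float(i)) for i in range(10)])
live.commit()

last_migrated_id = 7  # highest id present in the copy that was already converted
catchup = live.execute("SELECT id, ts, val FROM device_history_demo WHERE id > ?",
                       (last_migrated_id,)).fetchall()
# catchup now holds only the rows written after the copy was taken
```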

I would like to automate #1, 2, 12 ==> how to switch to/from POSTGRES/SQLITE from within a plugin.

The rest works automatically

Karl

Posted on
Sun Mar 29, 2015 9:05 pm
matt (support) offline
Site Admin
Posts: 19724
Joined: Jan 27, 2003
Location: Texas

Re: Migrate SQLite to PostGreSQL

Hmmm... not sure this is going to be possible. You would have to manually change the SQL Logger's plugin to switch that particular pref/setting, then have the SQL Logger plugin restart. The problem is that when you request the restart, I believe Indigo will overwrite your manual change to the settings file. So you would need to shutdown the plugin first, then tweak the settings file, then restart the plugin. The issue with that is there is only an API to restart a running plugin, not APIs to stop and start them. The answer to that would be to have the entire Indigo Server shutdown, then tweak the file, then restart the Indigo Server. That could work, I think, if cycling the entire Indigo Server is acceptable for what you are trying to accomplish.


Posted on
Sun Mar 29, 2015 9:26 pm
RogueProeliator offline
Posts: 2303
Joined: Nov 13, 2012
Location: Baton Rouge, LA

Re: Migrate SQLite to PostGreSQL

I like to automate # 1, 2, 12 ==> how to switch to/from POSTGRES/SQLITE from within a plugin.

Why do you want to do this from within a plugin? Just curious, but it doesn't seem like the effort would be worth the benefit -- I don't see someone going back and forth between SQLite and Postgres continuously, so a script, even with manual instructions, should be enough.

Just a thought... I've had similar thoughts (well, not about database, but I mean on tasks) where I decided after some thought that the effort wasn't worth the reward of generating a plugin...

Adam

Posted on
Sun Mar 29, 2015 11:02 pm
kw123 offline
Posts: 6817
Joined: May 12, 2013
Location: Dallas, TX

Re: Migrate SQLite to PostGreSQL

because we can .. or "just do it" :D

would like to add some more functions:

my wind sensor creates 4 entries every 15 seconds; one of them is sufficient. It takes four updates to the db to create the final state (i.e. wind, direction, text .. each is updated one after the other). After 7 months it now has 3 million records, ~30% of all the database records. I would like to squeeze out the first 3 records and only leave the last one. This already works as part of the conversion from SQLite to Postgres.
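Squeezing out all but the last record per timestamp can be done in one pass when rows arrive in ascending id order; a sketch:

```python
def keep_last_per_timestamp(rows):
    # rows are (id, ts, ...) tuples in ascending id order; for each
    # distinct ts, later rows overwrite earlier ones in the dict
    last = {}
    for row in rows:
        last[row[1]] = row
    return sorted(last.values())  # tuples sort by id, restoring order
```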

The next issue I would like to tackle is the NOAA device records. They make up ~30% of the database from a size point of view. I would like to be able to purge INDIVIDUAL devices down to ~ x months.

just my hobby.

Karl

the next project will be integration of a http://www.synapse-wireless.com setup ... met the CEO and he gave me a starter kit, so now I have to do something with it.

Posted on
Fri Apr 10, 2015 12:22 pm
kw123 offline
Posts: 6817
Joined: May 12, 2013
Location: Dallas, TX

Re: Migrate SQLite to PostGreSQL

I am almost done making it go from SQLITE --> POSTGRES without a hiccup. One last issue:

old device in SQLITE:
- stateNAME: 'accumenergytimedelta1', TYPE: 'INTEGER'
- stateNAME: 'accumenergytotal1', TYPE: 'INTEGER'
...
device newly recreated in Postgres
- stateNAME: 'accumenergytimedelta', TYPE: 'INTEGER'
- stateNAME: 'accumenergytotal', TYPE: 'REAL'
...
some of the states in ONE device are defined differently in SQLITE than in POSTGRES.
I can handle the difference in TYPE, but I am not sure what to do about the difference in stateNAME. SQLITE has an additional "1" at the end of some of the stateNames.

Was the convention changed in the last year?

Any advice? Thanks

Karl

Posted on
Fri Apr 10, 2015 12:35 pm
matt (support) offline
Site Admin
Posts: 19724
Joined: Jan 27, 2003
Location: Texas

Re: Migrate SQLite to PostGreSQL

The 2 states with the 1 suffix are actually older, pre-6.0 beta 12, versions. Upon opening older databases in Indigo 6 beta 12 and beyond, the state is changed to the version without the 1 suffix, although as you have noticed the SQL database table is not updated (the old column is not removed).


Posted on
Fri Apr 10, 2015 2:56 pm
kw123 offline
Posts: 6817
Joined: May 12, 2013
Location: Dallas, TX

Re: Migrate SQLite to PostGreSQL

Matt,
thanks,

is this the only one? can I just do:
if stateSourceName.strip("1") == stateDestName.strip("1"):
   foundMatch = True
...


Karl

Posted on
Fri Apr 10, 2015 3:47 pm
matt (support) offline
Site Admin
Posts: 19724
Joined: Jan 27, 2003
Location: Texas

Re: Migrate SQLite to PostGreSQL

I would only do it for those 2 states. I think there are some other states (native or plugin) that might have legitimate trailing 1 characters.
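Following Matt's advice, the rename can be restricted to exactly those two legacy states instead of stripping every trailing 1; a sketch:

```python
LEGACY_RENAMES = {
    # pre-6.0 beta 12 state names -> current names (per Matt's reply)
    "accumenergytimedelta1": "accumenergytimedelta",
    "accumenergytotal1": "accumenergytotal",
}

def states_match(source_name, dest_name):
    # exact match, or one of the two known legacy renames; nothing else,
    # so states with legitimate trailing 1 characters are left alone
    return source_name == dest_name or LEGACY_RENAMES.get(source_name) == dest_name
```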


Posted on
Sat Apr 11, 2015 2:24 pm
kw123 offline
Posts: 6817
Joined: May 12, 2013
Location: Dallas, TX

Re: Migrate SQLite to PostGreSQL

https://www.dropbox.com/s/ha00t4tj2id04xa/sql2postgre.py?dl=0

This python program can be used to convert an existing populated INDIGO SQLITE database to POSTGRES, or the other way around.
Simply switching from SQLITE to POSTGRES in INDIGO will not migrate ANY data; you will start from scratch.

- ALL or selected devices can be migrated, and a date purge can be included for specific devices, e.g. do not convert NOAA data older than January 1, 2015.

- It CAN (if selected) purge data from (e.g. RFXCOM) devices that have the same date-time stamp; it will keep the last entry. Some devices/plugins create multiple entries for one real change, i.e. wind, wind gust, direction, .. each creates its own change in the same second. In total 4 changes where only one is needed.
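The per-device date purge mentioned above reduces to a timestamp comparison; since the dump stores timestamps as 'YYYY-MM-DD HH:MM:SS', plain string comparison matches chronological order. A sketch (the cutoff value is just an example):

```python
def keep_row(ts, cutoff="2015-01-01 00:00:00"):
    # 'YYYY-MM-DD HH:MM:SS' strings sort chronologically, so a plain
    # string comparison suffices; rows at/after the cutoff are kept
    return ts >= cutoff
```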

These and other parameters can be changed directly in the program. SOME basic python knowledge is needed.

I have tested it in both directions several times and used the new databases to e.g. create plots with data > 12 months. And the resulting plots look identical.

In my case, after purging double/triple timestamp entries and reducing the NOAA data to 3 months, the size of the SQLITE database went from 2.5 GByte down to ~0.5 GByte! My wind sensor data went down from ~3.5 million entries to ~1 million entries with NO information lost.

It performs several checks and issues error messages or warnings when it finds a problem. Naturally I could only test it in my environment. The database used to develop and test it contains devices/variables from INSTEON, ZWAVE, variables created by ACTIONs, other plugins etc.

Here are the detailed directions for SQLITE to Postgres
########## SQLITE ==> POSTGRES
# 1. save the linked file to a directory (eg ~/documents/convert/)
# 2. install POSTGRES on MAC, see installation link in INDIGO ( http://wiki.indigodomo.com/doku.php?id= ... sql_logger )
# 3. ==>> STRONGLY RECOMMENDED for best results:
# in INDIGO SQLOGGER switch to postgres just for ~10 seconds (until all tables are defined)
# then switch back to SQLITE
# if not done, sometimes INDIGO times out switching to POSTGRES at the end of this conversion process, as it is changing too many table definitions. SQLITE and POSTGRES have different format definitions in INDIGO (text / int / real / bool)
# in the PYTHON file uncomment: CONVERSIONDirection = "SQLITE2POSTGRES" (should already be done)
# 4. open terminal; cd into directory where this file is stored (eg ~/documents/convert)
# 5. python sql2postgres.py <== type in terminal
# 6. have a coffee or 2
# 7. when finished successfully, in INDIGO switch to POSTGRES in SQLLOGGER --- in my case 25 minutes for 2.5 GBytes = 1+ year of data
#
# anything that is in the POSTGRES INDIGO database will be overwritten with the information from SQLITE
# you might want to run it several times, first with debug level =1 to see any error messages
# you can run it as many times as you like
# no information in the SQLITE DB file is changed if you move from SQLITE to POSTGRES (the first step is to make a copy of the file)

########## POSTGRES ==> SQLITE
# 1. save this file to a directory (eg ~/documents/convert/)
# 2. ==>> STRONGLY RECOMMENDED for best results:
# in INDIGO SQLOGGER switch to SQLITE just for ~10 seconds (until all tables are defined)
# then switch back to POSTGRES
# if not done, sometimes INDIGO times out switching to SQLITE at the end of this conversion process, as it is changing too many table definitions. SQLITE and POSTGRES have different format definitions in INDIGO (text / int / real / bool)
# in the PYTHON file uncomment : CONVERSIONDirection = "POSTGRES2SQLITE"
# 3. open terminal; cd into directory where this file is stored (eg ~/documents/convert)
# 4. python sql2postgres.py <== type in terminal
# 5. have a coffee or 2
# 6. the newly created SQLite database is moved into place by this program
# 7. when finished successfully, in INDIGO switch to SQLITE in SQLLOGGER
# process time: ~ 20 minutes for 2.6GBytes = 1+year of data
#


Karl

Posted on
Mon Jun 27, 2016 4:20 pm
kw123 offline
Posts: 6817
Joined: May 12, 2013
Location: Dallas, TX

Re: Migrate SQLite to PostGreSQL

after finally migrating to Postgres from SQLite, I have to say it is worth it.
The response time is better, the cpu utilization is down, and it can handle my 40+ GByte of data, while SQLite was choking (could not open the configuration anymore).
But the migration was not trivial (I wanted to keep all my data -- starting from scratch is easy).
Some of the devices have changed states over the years (added / removed some), and some definitions of the fields are different in POSTGRES vs SQLITE.
A simple dump/import does not work.

If anyone needs advice on what to avoid, traps, .. let me know.

Karl
