A couple questions:
1) Is there rate limiting? I have variables that are updated every 15 seconds, yet I only see a total of 1440 entries per day (1 per minute). BTW, I don't "need" historical data every 15 seconds, but I would like to make sure the behavior I'm seeing is correct.
If you are updating variables every 15 seconds, that leads me to wonder whether an event-driven solution might serve you better, i.e. only taking action when the thing of interest becomes interesting. [edit: never mind, for a data-logging application polling might be appropriate. However, you might want to check out rrdtool for this sort of thing; it includes both a database designed for time-series data and a graphing engine for viewing it]
2) Has anyone compared the performance of sqlite vs postgres? A query like "select * from variable_history where var_name='kwH_month' and date(ts)='2010-03-19' order by ts desc limit 1;" takes 12 seconds to execute. That seems slow to me, but I don't work with databases much. This is on a 1.5 GHz G4 PPC. There are 3,254,601 records in this table.
That means you need to create an index. An index makes the database maintain a sorted view of the column(s) you're querying on, so retrieval can seek directly to the matching rows instead of scanning the whole table. Try:
CREATE INDEX idx_var_name_ts ON variable_history(var_name, ts);
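One more thing to watch: wrapping the column in a function, as in date(ts), prevents SQLite from using the ts part of that index, so only the var_name prefix helps. Rewriting the date filter as a half-open range on ts itself lets the index do all the work. A minimal sketch using Python's built-in sqlite3 module (the table/column names follow your post; the ISO-8601 text timestamp format is my assumption about your schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE variable_history (var_name TEXT, value REAL, ts TEXT)")
conn.execute("CREATE INDEX idx_var_name_ts ON variable_history(var_name, ts)")
conn.executemany(
    "INSERT INTO variable_history VALUES (?, ?, ?)",
    [
        ("kwH_month", 1.0, "2010-03-19 08:00:00"),
        ("kwH_month", 2.0, "2010-03-19 23:59:45"),
        ("kwH_month", 3.0, "2010-03-20 00:00:15"),  # next day, must be excluded
    ],
)

# Half-open range on ts instead of date(ts)='2010-03-19': with ISO-8601
# text timestamps, plain string comparison sorts chronologically, and
# SQLite can walk idx_var_name_ts straight to the newest matching row.
latest = conn.execute(
    "SELECT * FROM variable_history "
    "WHERE var_name = 'kwH_month' "
    "  AND ts >= '2010-03-19' AND ts < '2010-03-20' "
    "ORDER BY ts DESC LIMIT 1"
).fetchone()
print(latest)  # the 23:59:45 row
```

You can confirm the difference yourself with EXPLAIN QUERY PLAN in front of each version of the query.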