Import Ecobee Data in InfluxDB
By chimo
This post is a note-to-self about how I imported my ecobee data into InfluxDB.
Getting Ecobee Data
Ecobee used to provide accounts to access their API but, from what I’ve read, they stopped offering this a while back. You can, however, download a snapshot of your data in CSV format from your ecobee account.
So I went as far back as I could and downloaded CSV files for every month. I’ve also set a reminder for myself on the first of every month to go and download the data from the month prior. I’ve been doing this for a while with the intention of importing the data into a database to then visualize it. Today was that day.
My file structure is as follows:
~/ecobee/
|-- 2023/
| |-- dec.csv
| |-- nov.csv
|-- 2024/
| |-- apr.csv
| |-- ...
|-- 2025/
...
Preparing InfluxDB
Since I already had InfluxDB set up for some other things, I already had an “Operator” token. This is how I decided to create the necessary InfluxDB user/org/bucket/token (you can set this up some other way, of course, as long as you have a token with write access to a bucket):
# Run as operations user
chimo@influxdb:~$ export INFLUX_TOKEN=OPS_TOKEN

# Create organization
chimo@influxdb:~$ influx org create --name ecobee
ID			Name
f031e2bf5d087cca	ecobee

# Create user
chimo@influxdb:~$ influx user create --org ecobee --name ecobee
ID			Name
0e93ba6510965000	ecobee
WARN: initial password not set for user, use `influx user password` to set it

# Set password
chimo@influxdb:~$ influx user password --name ecobee
? Please type new password for "ecobee" *********
? Please type new password for "ecobee" again *********
Successfully updated password for user "ecobee"

# Create token
chimo@influxdb:~$ influx auth create --org ecobee --all-access -u ecobee
ID			Description	Token	User Name	[...]
0e93bb6ebed65000			[token]	ecobee		[...]

# Use ecobee token from now on
chimo@influxdb:~$ export INFLUX_TOKEN=ECOBEE_USER_TOKEN

# Create bucket
chimo@influxdb:~$ influx bucket create --name ecobee --org ecobee
ID			Name	Retention	Shard group duration	Organization ID		Schema Type
0921b92574ec4d5a	ecobee	infinite	168h0m0s		f031e2bf5d087cca	implicit
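Before throwing a year of CSVs at it, it’s worth confirming the new token can actually write to the bucket. A quick sketch (the “smoke_test” measurement is just a throwaway name I made up for this check):

# Write a single throwaway point with the ecobee token
chimo@influxdb:~$ influx write --bucket ecobee --org ecobee "smoke_test value=1"

# Confirm it landed
chimo@influxdb:~$ influx query --org ecobee 'from(bucket: "ecobee") |> range(start: -1h) |> filter(fn: (r) => r._measurement == "smoke_test")'

# Clean it up
chimo@influxdb:~$ influx delete --bucket ecobee --org ecobee --start 1970-01-01T00:00:00Z --stop 2100-01-01T00:00:00Z --predicate '_measurement="smoke_test"'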
Reformatting and Importing Data
For a given file, the following script:
- Prints the necessary annotation headers.
  Note: you could probably use the “--header” flag of “influx write” instead (a rough sketch of that follows the “find” command below).
- Skips the original headers included in the ecobee files.
- Combines the separate “Date” and “Time” fields into one field using the “YYYY-mm-ddTHH:MM:SSZ” datetime format (illustrated right after the script).
- Rewrites the original file with the new output.
- Imports the data into InfluxDB.
#!/bin/sh -eu
filename="${1}"
output=$(
# New headers
echo "#constant measurement,living_room
#constant tag,source,csv
#datatype dateTime:RFC3339,string,string,string,string,double,double,double,long,long,double,double,long,long,long,long,long,double,double,long,long,long,long,long,long,long
Date,System Setting,System Mode,Calendar Event,Program Mode,Cool Set Temp (C),Heat Set Temp (C),Current Temp (C),Humidity Set Point (%RH),Current Humidity (%RH),Outdoor Temp (C),Wind Speed (km/h),Cool Stage 1 (sec),Heat Stage 1 (sec),Heat Stage 2 (sec),Fan (sec),Humidifier (sec),DM Offset,Thermostat Temperature (C),Thermostat Humidity (%RH),Thermostat Motion,Thermostat AirQualityAccuracy,Thermostat AirQuality,Thermostat VOCppm,Thermostat CO2ppm,Thermostat AirPressure"
# Skip original headers, combine date and time into a single datetime field.
tail -n +7 "${filename}" | sed -r 's/^([0-9]{4}-[0-9]{2}-[0-9]{2}),([0-9]{2}:[0-9]{2}:[0-9]{2})/\1T\2Z/g'
)
# Update file
echo "${output}" > "${filename}"
# Write data from file into InfluxDB
influx write -b ecobee --org ecobee -f "${filename}"
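To make the date/time rewrite concrete, here is the sed expression run against a made-up sample row (only the first two columns matter; the rest is hypothetical filler):

chimo@influxdb:~$ echo '2024-04-01,00:05:00,heat,heatOff' | sed -r 's/^([0-9]{4}-[0-9]{2}-[0-9]{2}),([0-9]{2}:[0-9]{2}:[0-9]{2})/\1T\2Z/g'
2024-04-01T00:05:00Z,heat,heatOff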
In order to process all the files in my ~/ecobee folder, I simply called the script via the “find” command:
# Format and write the data to InfluxDB
chimo@influxdb:~$ find $(pwd -P)/ecobee/ -type f -exec scripts/ingest.sh {} \;
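As for the “--header” note above: I believe “influx write” can prepend the annotation lines itself, which would avoid rewriting the files in place. An untested sketch of that variant, using the same annotations and column names as in the script:

tail -n +7 "${filename}" \
  | sed -r 's/^([0-9]{4}-[0-9]{2}-[0-9]{2}),([0-9]{2}:[0-9]{2}:[0-9]{2})/\1T\2Z/g' \
  | influx write -b ecobee --org ecobee --format csv \
      --header "#constant measurement,living_room" \
      --header "#constant tag,source,csv" \
      --header "#datatype dateTime:RFC3339,string,string,string,string,double,double,double,long,long,double,double,long,long,long,long,long,double,double,long,long,long,long,long,long,long" \
      --header "Date,System Setting,System Mode,Calendar Event,Program Mode,Cool Set Temp (C),Heat Set Temp (C),Current Temp (C),Humidity Set Point (%RH),Current Humidity (%RH),Outdoor Temp (C),Wind Speed (km/h),Cool Stage 1 (sec),Heat Stage 1 (sec),Heat Stage 2 (sec),Fan (sec),Humidifier (sec),DM Offset,Thermostat Temperature (C),Thermostat Humidity (%RH),Thermostat Motion,Thermostat AirQualityAccuracy,Thermostat AirQuality,Thermostat VOCppm,Thermostat CO2ppm,Thermostat AirPressure"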
Visualizing Data
You can then set up an InfluxDB dashboard to visualize the data:

You could also use Grafana to pull the data from InfluxDB and present dashboards instead.
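For reference, this is the kind of Flux query a dashboard cell (or Grafana panel) would run. A minimal sketch, assuming the field names from the column headers above (the 30-day range and hourly mean are arbitrary choices):

chimo@influxdb:~$ influx query --org ecobee 'from(bucket: "ecobee")
  |> range(start: -30d)
  |> filter(fn: (r) => r._measurement == "living_room" and r._field == "Current Temp (C)")
  |> aggregateWindow(every: 1h, fn: mean)'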