Update (2019-03-14): Netatmo changed their login process on Mar 7th 2019 – the download link below refers to the updated script on my GitHub page.
I’ve been a proud owner of a NetAtmo weather station for a couple of months now, and I think it’s the best weather station with a great user experience you can get:
A few days ago I noticed an option in the web dashboard to export all my weather data into a comma-separated values (CSV) or Microsoft Excel file. That’s a great feature for archiving your measured data… but wait – can this be done in an automated way?
The short answer is: yes – but unfortunately not out of the box. NetAtmo offers a great RESTful API for accessing the weather station data, so I started coding against their web API. But I had no luck: there is no method for exporting the data. So I did a manual download and captured some HTTP/HTTPS traces to see what’s going on. I then put the required HTTP calls together in a shell script that does the job for me:
#!/bin/bash

getmeasurecsv() {
    # ------------------------------------------------------
    # Help
    # ------------------------------------------------------
    # usage: getmeasurecsv <USER> <PASSWORD> <DEVICE_ID> <MODULE_ID> <TYPE> <STARTDATE> <ENDDATE> <FORMAT>
    #
    # USER + PASSWORD -> your NetAtmo Website login
    # DEVICE_ID       -> Base Station ID
    # MODULE_ID       -> Module ID
    # TYPE            -> Comma-separated list of sensors (Temperature,Humidity,etc.)
    # STARTDATE       -> Begin export date in format YYYY-mm-dd HH:MM
    # ENDDATE         -> End export date in format YYYY-mm-dd HH:MM
    # FORMAT          -> csv or xls

    # ------------------------------------------------------
    # Parsing Arguments
    # ------------------------------------------------------
    USER=$1
    PASS=$2
    DEVICE_ID=$3
    MODULE_ID=$4
    TYPE=$5
    DATETIMEBEGIN=$6
    DATETIMEEND=$7
    FORMAT=$8

    # ------------------------------------------------------
    # Define some constants
    # ------------------------------------------------------
    URL_LOGIN="https://auth.netatmo.com/en-us/access/login"
    URL_POSTLOGIN="https://auth.netatmo.com/access/postlogin"
    API_GETMEASURECSV="https://api.netatmo.com/api/getmeasurecsv"
    SESSION_COOKIE="cookie_sess.txt"

    # ------------------------------------------------------
    # Convert start and end date to timestamp
    # ------------------------------------------------------
    DATEBEGIN="$(date --date="$DATETIMEBEGIN" "+%d.%m.%Y")"
    TIMEBEGIN="$(date --date="$DATETIMEBEGIN" "+%H:%M")"
    DATE_BEGIN="$(date --date="$DATETIMEBEGIN" "+%s")"
    DATEEND="$(date --date="$DATETIMEEND" "+%d.%m.%Y")"
    TIMEEND="$(date --date="$DATETIMEEND" "+%H:%M")"
    DATE_END="$(date --date="$DATETIMEEND" "+%s")"

    # ------------------------------------------------------
    # URL encode the user entered parameters
    # ------------------------------------------------------
    USER="$(urlencode $USER)"
    PASS="$(urlencode $PASS)"
    DEVICE_ID="$(urlencode $DEVICE_ID)"
    MODULE_ID="$(urlencode $MODULE_ID)"
    TYPE="$(urlencode $TYPE)"
    DATEBEGIN="$(urlencode $DATEBEGIN)"
    TIMEBEGIN="$(urlencode $TIMEBEGIN)"
    DATEEND="$(urlencode $DATEEND)"
    TIMEEND="$(urlencode $TIMEEND)"
    FORMAT="$(urlencode $FORMAT)"

    # ------------------------------------------------------
    # Now let's fetch the data
    # ------------------------------------------------------
    # get token from hidden <input> field
    TOKEN="$(curl --silent -c $SESSION_COOKIE $URL_LOGIN | sed -n '/token/s/.*name="_token"\s\+value="\([^"]\+\).*/\1/p')"

    # and now we can login using cookie, token, user and password
    curl --silent -d "_token=$TOKEN&email=$USER&password=$PASS" -b $SESSION_COOKIE -c $SESSION_COOKIE $URL_POSTLOGIN > /dev/null

    # next we extract the access_token from the session cookie
    ACCESS_TOKEN="$(cat $SESSION_COOKIE | grep netatmocomaccess_token | cut -f7)"

    # build the POST data
    PARAM="access_token=$ACCESS_TOKEN&device_id=$DEVICE_ID&type=$TYPE&module_id=$MODULE_ID&scale=max&format=$FORMAT&datebegin=$DATEBEGIN&timebegin=$TIMEBEGIN&dateend=$DATEEND&timeend=$TIMEEND&date_begin=$DATE_BEGIN&date_end=$DATE_END"

    # now download data as csv
    curl -d $PARAM $API_GETMEASURECSV

    # clean up
    rm $SESSION_COOKIE
}

#____________________________________________________________________________________________________________________________________

urlencode() {
    # ------------------------------------------------------
    # urlencode function from mrubin
    # https://gist.github.com/mrubin
    #
    # usage: urlencode <string>
    # ------------------------------------------------------
    local length="${#1}"
    for (( i = 0; i < length; i++ )); do
        local c="${1:i:1}"
        case $c in
            [a-zA-Z0-9.~_-]) printf "$c" ;;
            *) printf '%%%02X' "'$c"
        esac
    done
}

#____________________________________________________________________________________________________________________________________

getmeasurecsv "user@email.com" "mySecretPassword" "12:23:45:56:78:33" "02:00:00:12:23:45" "Temperature,Humidity" "2015-05-17 10:00:00" "2015-05-18 12:00:00" "csv"
You just need to modify the last line to fit your environment. Here is some detailed information regarding the DEVICE_ID and the MODULE_ID. Both IDs can be found in your NetAtmo dashboard settings. The DEVICE_ID is the MAC address of your NetAtmo base station. The MODULE_ID is based on the serial number of your additional NetAtmo modules:
When querying the base station, the DEVICE_ID and MODULE_ID values must be the same: the MAC address of the base station. The output in the CSV file will look like this:
I duplicated the last line for each of my NetAtmo modules and configured the script to run on my Synology DiskStation NAS for an automatic backup.
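A minimal sketch of how that can look (placeholder credentials, MAC addresses, and paths rather than my real values; the scheduling itself is done via the Synology Task Scheduler or cron and is not shown here): the example call at the end of the script is replaced by one redirected call per module, covering a recent time window.

START="$(date --date="7 days ago" "+%Y-%m-%d %H:%M:%S")"
END="$(date "+%Y-%m-%d %H:%M:%S")"

# base station: DEVICE_ID and MODULE_ID are both the base station MAC address
getmeasurecsv "user@email.com" "mySecretPassword" "12:23:45:56:78:33" "12:23:45:56:78:33" "Temperature,Humidity,CO2,Pressure,Noise" "$START" "$END" "csv" > /volume1/backup/netatmo_base_$(date +%Y%m%d).csv

# additional outdoor module
getmeasurecsv "user@email.com" "mySecretPassword" "12:23:45:56:78:33" "02:00:00:12:23:45" "Temperature,Humidity" "$START" "$END" "csv" > /volume1/backup/netatmo_outdoor_$(date +%Y%m%d).csv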
- UPDATE 2017:
See my recent blog post Netatmo Weather Station Shell Script
Hello again!
I have a new problem. Since 05.09.2019 at 22:00 I can’t get any data.
I get this error message:
cat: cookie_sess.txt: No such file or directory
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0curl: (7) Failed to connect to api.netatmo.com port 443: Connection refused
rm: cannot remove 'cookie_sess.txt': No such file or directory
### ### ### Retrieved file too small. Expected 140, got 0 bytes. Retrying. ### ### ###
Can anybody help me?
No one?
What does it mean:
cat: cookie_sess.txt: No such file or directory?
Finally I solved my problem. It was not caused by the script. A network adapter on my NAS was defective, so the internet connection was lost. After I fixed that, the script started working again.
Hi there,
since yesterday evening (13.06.2019) I only get temperature data until the 4th of June and rain data until the 9th of June when I use the script with the monthly parameter -M. But I didn’t change anything and the station is working fine. Does anyone else have this problem too? What can I do?
I see the same problem and can’t understand why. It also happens if I use -s and -e to specify a full month.
Hello,
I checked the issue, and the Netatmo API only returns a CSV or XLS file with at most 1024 measurements. I don’t know why this is happening. The best solution would be to download the data every 24 hours. There is no official documentation for the getmeasurecsv API, and Netatmo does not provide support for it.
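A rough sketch of such a chunked download (not part of the original script; credentials, MAC addresses, and the output file are placeholders, and GNU date is assumed): the requested range is split into 24-hour slices so that each request stays below the 1024-measurement cap.

# split a longer range into 24-hour slices and append each slice to one file
FROM="2019-06-01 00:00:00"
TO="2019-06-30 00:00:00"
CUR_TS="$(date --date="$FROM" +%s)"
END_TS="$(date --date="$TO" +%s)"
while [ "$CUR_TS" -lt "$END_TS" ]; do
    NEXT_TS=$((CUR_TS + 86400))
    getmeasurecsv "user@email.com" "mySecretPassword" "12:23:45:56:78:33" "02:00:00:12:23:45" "Temperature,Humidity" "$(date --date="@$CUR_TS" "+%Y-%m-%d %H:%M:%S")" "$(date --date="@$NEXT_TS" "+%Y-%m-%d %H:%M:%S")" "csv" >> outdoor_june.csv
    CUR_TS=$NEXT_TS
done

Note that each slice may contain its own CSV header lines again, which would have to be stripped when merging the results.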
Okay, the comment form on this site is mangling the code. You will have to check the TOKEN line in the GitHub code; it’s slightly different from the code on this page. Then it will work.
Thank you for the response – I just fixed it.
I just updated the code – now it should work with the new login process from netatmo.
Still not working for me (with just copy & paste and my Netatmo credentials):
{"error":{"code":2,"message":"Invalid access token"}}
It worked,
thank you very much! Great work.
How did you solve the problem? I am getting the same error.
I think the sed statement parsing the session cookie doesn’t work properly
Hello,
I replaced the TOKEN line with this one:
TOKEN="$(curl --silent -c $SESSION_COOKIE $URL_LOGIN | sed -n '/token/s/.*name="_token"\s\+value="\([^"]\+\).*/\1/p')"
Since 07.03.2019 the login URL has apparently been changed from …/de-DE/… to …/de-de/…, which means the script can no longer log in. The only error message you get is that the cookie file does not exist. 😀
Change line:
URL_LOGIN="https://auth.netatmo.com/de-DE/access/login"
to:
URL_LOGIN="https://auth.netatmo.com/de-de/access/login"
Not really… you already made another post with a much different solution than just changing DE to de.
I get a big error page with CSS, and at the end:
"An Error Was Encountered"
"The action you have requested is not allowed."
So now that I have the data, is there any way to use it to make graphical comparisons between selected dates?
Thanks!
You need to replace API_GETMEASURECSV="https://my.netatmo.com/api/devicelist"
with
API_GETMEASURECSV="https://api.netatmo.com/api/devicelist"
Many thanks for the script, which I have been using for more than a year now. It seems that yesterday NetAtmo changed the paths to their scripts and the download did not work anymore. I then ran an HTTP trace and found out that they apparently changed the path to getmeasurecsv from "https://my.netatmo.com/api/getmeasurecsv" to "https://app.netatmo.net/api/getmeasurecsv"…
At least for me it seems to work again.
Thank you for the update 🙂 Now the script is working like a charm again :-D. I owe you a beer 😉
Hello, and thank you for a great script! I’ve used it to fetch data from my newly bought Netatmo, and it’s working great.
I’m trying to expand a little on your work here, but I don’t have much experience with bash. What I’m planning to do is to have a perl script execute your bash script, and have the bash script store the exported data in csv files. Then, the perl script will parse those csv files, and put the data into a mysql database. Then I’ll make a separate web interface that outputs the data from mysql.
That might sound silly, given that Netatmo already provides a web interface. I do like to keep my own backup of the data, though. Also, I miss several features on the Netatmo interface – for instance being able to compare data (e.g. indoor and outdoor temperature in the same chart, or the outdoor temperature of this year compared to last year on the same chart).
So, my question is: is there any way to set the parameters of the bash script via the command line? That way I would be able to have the perl script pass the date range when running the script.
Thanks again for the great work in making the script!
Sorry to bother you – I figured it out.
In case you or anyone else would like to do the same, here’s what I’ve done:
After the end of getmeasurecsv() {}, I’ve added this:
while [[ $# -gt 1 ]]
do
    key="$1"
    case $key in
        -s|--startdate)
            STARTDATE="$2"
            DATETIMEBEGIN="$2"
            shift # past argument
            ;;
        -e|--enddate)
            ENDDATE="$2"
            DATETIMEEND="$2"
            shift # past argument
            ;;
        --default)
            DEFAULT=YES
            ;;
        *)
            # unknown option
            ;;
    esac
    shift # past argument or value
done
echo START DATE = "${STARTDATE}"
echo END DATE = "${ENDDATE}"
Then I’ve added several lines calling getmeasurecsv(), specifying the different modules and the date range:
#Indoors
getmeasurecsv "my@email.com" "my_password" "70:ee:50:00:00:00" "70:ee:50:00:00:04" "Temperature,Humidity,CO2,Pressure,Noise" "${STARTDATE}" "${ENDDATE}" "csv" > indoors.csv
#Rain
getmeasurecsv "my@email.com" "my_password" "70:ee:50:00:00:00" "70:ee:50:00:00:04" "Rain" "${STARTDATE}" "${ENDDATE}" "csv" > rain.csv
Then I call the script by command line like this:
bash getmeasurecsv.sh -s "2017-04-25 00:00:00" -e "2017-04-29 10:15:00"
Next up is writing the perl script to parse the data and add it to mysql. Thanks again for the great script! 🙂
Hello,
I would like to retrieve the public data from Netatmo. I only know Python. Could you help me retrieve it to a CSV file? Thank you.
Hello,
I’m sorry for my stupid question (I’m a beginner), but how can I run this script? Can I run it on a Windows computer, or how can I run it on a Synology, as you wrote…
Thank you for your help.
Hi Jiri,
this is a bash script (bash = Bourne Again Shell). It will not work on Windows; bash runs on Unix-like systems such as Ubuntu. A Unix-based Raspberry Pi OS will also work.
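For example, on a Linux system it could be run roughly like this (assuming the script was saved under the placeholder name getmeasurecsv.sh, with the last line adjusted to your own credentials and IDs; curl and GNU date must be available):

# make the script executable and run it, redirecting the output to a file
chmod +x getmeasurecsv.sh
./getmeasurecsv.sh > export.csv

# or run it through bash directly
bash getmeasurecsv.sh > export.csv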
Great job !!!!
[…] two really cool bash scripts using curl to query base information about one’s devices and download the weather data. I modified them, especially for postprocessing (=grouping) the rain data, so I attach them to this […]
Thanks a lot for this script and the script to get the device data. With it getting my netatmo data was a piece of cake.
Hello Michael,
Are you also familiar with VBA?
Since I don’t have Linux, I would like to do the query directly from Excel.
Regards,
Sandro
Hello,
Only a little with VBA, but more with VB. The whole thing is called a REST API; there are many examples on the internet showing how to query data from a REST interface via VBA. I suspect it could get a bit cumbersome with VBA, since VBA is somewhat dated, but it is certainly doable.
Regards,
Michael
Hi Michael,
This looks exactly like what I need, but I can’t get it to work. First, I get an email from Netatmo every time I run the script, and second, I get the following error: date: illegal option -- -
Any ideas?
Hello Niklaus,
you can change your notification settings from the NetAtmo Website and disable the login notification.
I want to display the data via Statusboard, so the CSV export comes in very handy.
But when I run the script I get some permission errors:
Return code: 126
Script output:
Array
(
[0] => netatmo.sh: line 73: /bin/grep: Permission denied
[1] => netatmo.sh: line 73: /bin/cat: Permission denied
[2] => netatmo.sh: line 79: /bin/grep: Permission denied
[3] => netatmo.sh: line 79: /bin/cat: Permission denied
[4] => % Total % Received % Xferd Average Speed Time Time Time Current
[5] => Dload Upload Total Spent Left Speed
[6] =>
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 257 100 3 100 254 11 946 --:--:-- --:--:-- --:--:-- 1909
[7] => 9;
[8] => netatmo.sh: line 88: /bin/rm: Permission denied
[9] => netatmo.sh: line 89: /bin/rm: Permission denied
[10] => netatmo.sh: line 73: /bin/grep: Permission denied
[11] => netatmo.sh: line 73: /bin/cat: Permission denied
[12] => netatmo.sh: line 79: /bin/grep: Permission denied
[13] => netatmo.sh: line 79: /bin/cat: Permission denied
[14] => % Total % Received % Xferd Average Speed Time Time Time Current
[15] => Dload Upload Total Spent Left Speed
[16] =>
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 257 100 3 100 254 20 1713 --:--:-- --:--:-- --:--:-- 2134
[17] => 9;
[18] => netatmo.sh: line 88: /bin/rm: Permission denied
[19] => netatmo.sh: line 89: /bin/rm: Permission denied
)
I don’t quite understand what the script is not allowed to do here.
I’d appreciate any help!
Thank you very much.
Regards,
Kai
Hello Michael,
Another question: would it be possible to modify the script so that it always exports only a period of e.g. the last 24 hours as a CSV file? Currently the start date/time always HAS to be defined. The end date can even be set further into the future; that is no problem, it just produces NaN values at the end of the CSV file. I use your script not only as a data backup, but also let it export all values since January 1st into a CSV file every hour. So new data is added every hour (the scale value has already been adjusted so that only one row of values per hour is added).
With a fixed window of always 24 hours, a new, current 24-hour period (now minus 24 hours) could then be exported every hour and visualized with charts, as I do.
Regards,
Kai
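One possible way to build such a rolling window (a sketch only, assuming GNU date and the getmeasurecsv function from this post; credentials, IDs, and the output path are placeholders) is to compute the start time as "now minus 24 hours" on every run:

# rolling 24-hour window: start is always 24 hours before the current time
START="$(date --date="24 hours ago" "+%Y-%m-%d %H:%M:%S")"
END="$(date "+%Y-%m-%d %H:%M:%S")"
getmeasurecsv "user@email.com" "mySecretPassword" "12:23:45:56:78:33" "02:00:00:12:23:45" "Temperature,Humidity" "$START" "$END" "csv" > /path/to/last24h.csv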
I wonder if you have any idea if it is possible to download data from other netatmo weather stations which distribute their data on the netatmo world maps?
Dagrun
Hello Dagrun,
you can access the weather map data by using the public API method GETPUBLICDATA. NetAtmo offers a great example on their website:
https://dev.netatmo.com/doc/methods/getpublicdata
You can pass an area specified by longitude and latitude to the API and get the measures from all weather stations within the specified area.
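As a rough, untested sketch (not part of the script above, and assuming you have registered an app at dev.netatmo.com and obtained an OAuth access token), a getpublicdata request for a bounding box could look like this; the token and coordinates are placeholders:

ACCESS_TOKEN="your_oauth_access_token"
# north-east and south-west corners of the area, plus the sensor data you are interested in
curl --silent -d "access_token=$ACCESS_TOKEN&lat_ne=48.20&lon_ne=11.70&lat_sw=48.05&lon_sw=11.40&required_data=temperature" "https://api.netatmo.com/api/getpublicdata"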
Michael
You are so awesome for helping me solve this mystery.
Hello Michael,
First of all, thank you very much for your great guide to exporting the Netatmo data in CSV format. That is exactly what I was looking for 🙂
But: how can I save the exported data as filename.csv in a specific path? Or is the result of the query already stored somewhere? If so, where? The script runs fine in the console for me and prints all values for the defined time span, but only in the console, right?
Regards,
Kai
Hello Kai,
To have the output written to a file, you just need to append "> /path/to/my/file.csv" to the end of the last line "getmeasurecsv…". This redirects the entire output into the specified file.
Regards,
Michael
Hello Michael,
Great, thank you very much for the quick help, it works!
Regards,
Kai
The script works great for the base module, but I’ve discovered some problems with the outdoor and rain modules. Somehow there is a small mismatch in the addresses that I can’t figure out from the examples.
"12:23:45:56:78:33" "02:00:00:12:23:45"
vs
"03:00:00:XX:XX:XX" "03:00:00:12:45:E4"
I tried both combinations but it only downloads "9;".
Is somebody else seeing the same?
Unfortunately, I don’t have a rain module, but the outdoor module is working for me. What is the first character of your outdoor module’s serial number shown on the Netatmo page? The first character should be a letter between A and Z?
Hi there,
my rain module has a serial number starting with ‘k’, so the module id starts with ‘05’. The only possible filter ($TYPE) is "Rain".
My wind module starts with an ‘l’, module id: ‘06:..:.’, filter: "WindAngle,WindStrength,GustAngle,GustStrength"
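For example, with these module ID prefixes and filters, the calls could look roughly like this (placeholder credentials and MAC addresses, following the pattern of the example call in the post):

# rain module (module id starts with 05)
getmeasurecsv "user@email.com" "mySecretPassword" "12:23:45:56:78:33" "05:00:00:12:23:45" "Rain" "2015-05-17 10:00:00" "2015-05-18 12:00:00" "csv" > rain.csv

# wind module (module id starts with 06)
getmeasurecsv "user@email.com" "mySecretPassword" "12:23:45:56:78:33" "06:00:00:12:23:45" "WindAngle,WindStrength,GustAngle,GustStrength" "2015-05-17 10:00:00" "2015-05-18 12:00:00" "csv" > wind.csv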
Best regards
Frank
@Michael: Nice bash script, written very clearly. Unfortunately you rarely see that.
Please have a look at my new blog post on how to get all module data, including base_id and module_id:
Read NetAtmo weather station data via Script