During a recent trip to the sites of the 1914-18 war, we took the opportunity to do some research on a great-great-uncle of my wife who we knew had died during the conflict. We found his grave at Lijssenthoek Cemetery (Poperinge, BE).
The availability of terminals giving access to digitised documents in the various museums and cemeteries, and the presence of a François Guillier on the Ring of Remembrance at Notre-Dame-de-Lorette, led me to look for information about soldiers bearing my name.
After some digging through various online document databases, and one thing leading to another, I managed to find traces of several namesakes.
After verification, it turns out that one of them is my great-great-uncle!
Born on 10 May 1892 in Villaines-sous-Malicorne (Sarthe)
Married Marguerite Marie Garnier on 6 February 1916
Died for France on 30 July 1916 at Fleury-sur-Aire (Meuse) of war wounds
Having started with pollution sensors that turned out to be almost unusable, because of their lack of calibration and because I live in Paris, which occasionally has full-blown air-quality crises, I wanted to carry on with new sensors.
Trouble is, most of them tend to be very expensive or uncalibrated (and sometimes both!).
I was in particular interested in particle sensors (like the PPD42NS, PM2.5 laser dust sensors, ...) and CO2 sensors. Each was, at the end of 2015, around $50.
Alima & Foobot
I was aware of the Alima prototype, which evolved to become Foobot (honestly, how did they come up with this crazy name? It really sounds like foobar).
They had a successful campaign on Indiegogo (Indiegogo sounds crazy in French too, since gogo means a dupe or naive person. That said, maybe it is appropriate to crowdfunding after all!).
They are not cheap, but they measure (according to the specifications):
VOC: total Volatile Organic Compounds, including gases such as Formaldehyde, Benzene, Toluene, Ethylene glycol, etc.
PM: Particulate Matter (PM2.5)
CO2: Carbon Dioxide
CO: Carbon Monoxide
And once you add the price of all the sensors and components, maybe having a calibrated system with apps and support for $199 is worth it.
The main problem with this kind of gadget/toy/appliance/thing is that it relies on the "Cloud", which means it is basically dead the day the company shuts down or decides to "refocus". There are countless examples of this, a very famous one being the Nabaztag/Karotz.
There is also the danger of data travelling to the US or China without any knowledge of what is done with it. At least Foobot is based in Europe (Luxembourg), so I assume EU privacy laws apply.
I also discovered that some info/code had been published about their prototype, renamed airboxlab, so I would hope that, in case of problems, the Foobot could be updated to talk to a private server.
With all this in mind, I took the plunge and ordered one. The installation can be done in minutes (as long as you have a phone/tablet on the same Wi-Fi network and your Wi-Fi SSID is visible).
At first the application was neither intuitive nor reliable. The newest versions seem stable and easy to use.
It takes about a week for the Foobot to settle. At first I thought it was busted because all readings were bizarre but after a while they started to make sense.
At the beginning, I had the notifications on, but after a while they became quite annoying, and frankly there is not much you can do if the air is not good in the middle of the night! You can also tell the system what the event was, but honestly I don't understand how that works.
Note that you also receive reports by email on a regular basis.
There is an API (via the Cloud, unfortunately, not a direct connection to the appliance) with historical data.
It can be called up to 200 times a day (an odd limit, even if the number itself is mathematically even), so I download the latest data every 15 minutes (96 calls a day).
The API has been pretty stable/reliable so far.
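As a sketch of that 15-minute polling, here is roughly what my downloader does. The endpoint path, header name and response layout below match what the cloud API looked like at the time, but treat them as assumptions and check the current Foobot API documentation; the key and UUID placeholders are obviously hypothetical.

```python
def fetch_last_period(api_key, device_uuid, period_s=900, average_by=0):
    """Fetch the datapoints of the last `period_s` seconds (900 s = 15 min).
    Endpoint and header names are assumptions based on the v2 cloud API."""
    import requests  # third-party: pip install requests

    url = ("https://api.foobot.io/v2/device/%s/datapoint/%d/last/%d/"
           % (device_uuid, period_s, average_by))
    resp = requests.get(url, headers={"X-API-KEY-TOKEN": api_key})
    resp.raise_for_status()
    return resp.json()

def to_records(payload):
    """Flatten the {'sensors': [...], 'datapoints': [[...], ...]} payload
    into one {sensor_name: value} dict per sample."""
    return [dict(zip(payload["sensors"], row)) for row in payload["datapoints"]]
```

Run every 15 minutes from cron, this stays comfortably under the 200-calls limit.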
The sensors "don't like" cooking (or maybe cooking = polluting?). A few months ago, we made some crêpes and the Foobot turned all orange (= bad atmosphere). The graphs went through the roof! It is about the same if the oven is turned on.
I assume anything with oil will create loads of "particulate matter"... At least that's what the readings say!
And because of this, the graphs for VOC and CO2 have exactly the same shape, and it is impossible to say whether CO2 or non-CO2 VOCs are the ones triggering the sensor :-(
This wasn't clearly advertised when I bought it and this is a bit of a disappointment.
As for CO, it is not even returned by the API. That said, the Foobot is not a carbon monoxide detector, and there should not be much CO around in a room, so I am OK with that one.
Foobot computes an index which represents the indoor air quality (IAQ). The lower, the better. The colours (shades of blue and orange) vary according to this index.
They recently added an outdoor value. The data is computed by Breezometer (how? who knows) and the result is shown in the app. The data is also available directly from Breezometer, but there is a small catch:
The Breezometer index via the Foobot API is also "the lower, the better" (to keep the data consistent)
The native Breezometer index is "the higher, the better"!
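If you mix data fetched from both sources, one of the scales has to be flipped. Assuming both indices run over a 0-100 range (an assumption on my part; check Breezometer's own documentation), the conversion is trivial:

```python
def to_foobot_scale(breezometer_index, scale_max=100):
    """Flip a higher-is-better Breezometer index to the lower-is-better
    convention used by the Foobot API (assumes a 0..scale_max range)."""
    return scale_max - breezometer_index
```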
First, it works! It seems a bit bizarre said like this, but I have come across so much dysfunctional stuff that this simple fact is worth mentioning.
It measures values which seem in line with my other sensors (Temperature/Humidity) and activities (PM/VOC), even if I can't compare how well calibrated they are.
The API is rather stable. The app (iOS/iPhone version, but used on an iPad) a bit less so, depending on the release.
So, as a pure sensor, I believe it can be recommended.
I am far more skeptical about the "better air" promise all the marketing, blog and website seem to be built on. OK, knowledge is the first step towards improvement, but I still haven't understood "the magic" behind "Predictive data [and] how Foobot learns about your habits and is able to detect pollution peaks before they actually happen".
Apart from opening the window (or stopping breathing), there is little that can be done in case of a pollution alert!
Last summer, I hinted that I was about to switch from the monolithic programme with all its threads to a constellation of separate processes.
The main reason was that the monolithic application needed a restart for any change in the configuration, and that a crash/exception in one thread broke everything.
True, having several separate processes means finding a way to start them all in the first place, and something else to look after them. It also uses quite a bit of memory (because of the Python overhead, a tiny process is almost as memory-hungry as a small one). Last but not least, the inter-process communication can be problematic.
Enter the Raspberry Pi 2
Fortunately, the Raspberry Pi Foundation released the Raspberry Pi 2, which now has a quad-core CPU and 1GB of memory (and even the Raspberry Pi 3, more recently, but I doubt it would make much difference here). At ~3MB of memory per process, there is plenty of available RAM! Also, starting a Python process is now almost immediate, compared to the 5-10 seconds needed on the 1-B+ model.
Communication: MQTT
Thinking about it, there is not much need for communication between processes. In the majority of cases, it is all about sending the data to the display interfaces and to a database (for the sensor part).
Anyway, the ubiquitous MQTT can solve all the communication problems... These days, it seems there isn't a similar hub project around which isn't using MQTT, either at its core or at least for plug-in communication.
I have already detailed the way I format the topic and the payload of the messages.
Every process now uses a bootstrap library which manages the daemonisation, the MQTT communication and the logs. There are 2 types of messages: data and metadata (starting, heartbeat, ...).
Currently the model used is the following:
All measurements have the MQTT retain option activated to keep the last value available to a reconnecting process
Pushover is a notification system for mobiles (and desktop)
I am currently using 2 storage systems: RRD and a timeseries database (test in progress)
'display_bikes' and 'display_transport' call external webservices and/or scrape web pages. The resulting data is only displayed, never stored.
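As a sketch of this model, here is how a sensor process could publish a retained data message and a non-retained heartbeat with the paho-mqtt client. The topic names and the JSON payload layout are illustrative conventions for this example, not the exact ones I use:

```python
import json
import time

def make_measurement(value):
    """Build the payload of a data message: a small JSON object
    carrying the value and a Unix timestamp (illustrative layout)."""
    return json.dumps({"value": value, "timestamp": int(time.time())})

def publish_all(broker="localhost"):
    import paho.mqtt.client as mqtt  # third-party: pip install paho-mqtt

    client = mqtt.Client()
    client.connect(broker, 1883)  # a local Mosquitto broker is assumed
    # Data messages are retained, so a reconnecting display process
    # immediately receives the last known value.
    client.publish("sensor/livingroom/temperature",
                   make_measurement(21.4), retain=True)
    # Metadata (heartbeat) is not retained: a stale heartbeat is useless.
    client.publish("meta/sensor_livingroom/heartbeat", str(int(time.time())))
    client.disconnect()
```

The `retain=True` flag is what implements the "last value available to a reconnecting process" rule from the list above.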
This time, it is about playing with an ESP-201 as well as with the NodeMCU devkit, but without the NodeMCU firmware. Here are a few notes about what I discovered while playing with these boards.
As I mentioned in the past, starting with Arduino 1.6.4, there is now full support for ESP8266.
A majority of the Arduino functions are directly available on the ESP8266, and some additional libraries have been developed specifically for it. The libraries and documentation are changing extremely fast: between my first attempts in the summer and now, a lot of material has been added.
The biggest hurdle with these chips seems to be timing. While a "normal" Arduino will happily wait for any kind of event to happen, the ESP8266 tends to reset very easily. Too easily, maybe: I wasn't able to port some tasks, such as the La Crosse decoder with its strict timings.
There is more info about the watchdog in the documentation, and in this interesting blog entry about porting code from the Spark Core to the ESP8266.