Running it all on the Raspberry Pi (Part 2)
In my last post, Running it all on the Raspberry Pi (Part 1), I went through the additional software and services that were required to move all the processes to the Raspberry Pi.
In this installment I describe the updates required to my own scripts involved in the project:
- update ds18b20.py (log directly into the local SQLite database instead of making a web call)
- update sensor-data.php (remove obsolete code)
- update chart-sensor.js (update URLs for calling sensor-data.php)
Updates to ds18b20.py
- Additional import for SQLite
import sqlite3 as sql
- Connect to the SQLite database
# Connect to SQLite database
con = sql.connect( '/var/sqlite/sensor-data.sqlite' )
cur = con.cursor()
- Rewritten function logData
# Log data to the SQLite database
def logData( id, value ):
    try:
        cur.execute( "insert into sensor_data(timestamp,sensor_id,value) values(datetime('now','localtime'), {0}, {1} )".format( id, value ) )
        con.commit()
        return
    except:
        if con:
            con.rollback()
        return
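As a side note, logData builds the SQL statement with format(), which works here because the sensor id and value are numeric. An equivalent variant (a sketch, not the version used in this project) could rely on SQLite parameter binding instead:
# Variant of logData using parameter binding instead of string formatting
def logData( id, value ):
    try:
        cur.execute( "insert into sensor_data(timestamp,sensor_id,value) values(datetime('now','localtime'), ?, ?)", ( id, value ) )
        con.commit()
    except:
        if con:
            con.rollback()
    return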
The complete Python script:
#!/usr/bin/env python
import os
import glob
import time
import re
import RPi.GPIO as GPIO
import urllib2 as url
import subprocess
import sqlite3 as sql

# Load kernel modules for 1-wire devices
os.system( 'modprobe w1-gpio' )
os.system( 'modprobe w1-therm' )

# Determine location of first found DS18B20 sensor
base_dir = '/sys/bus/w1/devices/'
device_folder = glob.glob( base_dir + '28*' )[0]
device_file = device_folder + '/w1_slave'

# Configure GPIO
GPIO.setmode( GPIO.BCM )
GPIO.setwarnings( False )

# Connect to SQLite database
con = sql.connect( '/var/sqlite/sensor-data.sqlite' )
cur = con.cursor()

# Log data to the SQLite database
def logData( id, value ):
    try:
        cur.execute( "insert into sensor_data(timestamp,sensor_id,value) values(datetime('now','localtime'), {0}, {1} )".format( id, value ) )
        con.commit()
        return
    except:
        if con:
            con.rollback()
        return

# Control state for LED pin (turn the connected LED on/off)
def ledMode( PiPin, mode ):
    GPIO.setup( PiPin, GPIO.OUT )
    GPIO.output( PiPin, mode )
    return

# Read data from the raw device
def read_temp_raw():
    f = open(device_file, 'r')
    lines = f.readlines()
    f.close()
    return lines

# Determine temperature and humidity from the DHT22/AM2302 sensor
def read_dht22( PiPin ):
    output = subprocess.check_output(["./Adafruit_DHT", "2302", str(PiPin)])
    matches = re.search("Temp =\s+([0-9.]+)", output)
    if ( matches ):
        logData( 2, float(matches.group(1)) )
    matches = re.search("Hum =\s+([0-9.]+)", output)
    if ( matches ):
        logData( 3, float(matches.group(1)) )
    return

# Determine temperature from the DS18B20 sensor
def read_temp():
    lines = read_temp_raw()
    while lines[0].strip()[-3:] != 'YES':
        time.sleep(0.2)
        lines = read_temp_raw()
    equals_pos = lines[1].find('t=')
    if equals_pos != -1:
        temp_string = lines[1][equals_pos+2:]
        temp_c = float(temp_string) / 1000.0
        temp_f = temp_c * 9.0 / 5.0 + 32.0
        return temp_c, temp_f

# Turn off all LEDs
ledMode( 14, GPIO.LOW )
ledMode( 15, GPIO.LOW )
ledMode( 18, GPIO.LOW )

while True:
    temp_c, temp_f = read_temp()
    ledMode( 14, GPIO.HIGH if temp_c < 27 else GPIO.LOW )
    ledMode( 15, GPIO.HIGH if temp_c >= 27 and temp_c < 29 else GPIO.LOW )
    ledMode( 18, GPIO.HIGH if temp_c >= 29 else GPIO.LOW )
    ts = time.strftime('%Y-%m-%d %H:%M:%S', time.localtime())
    print '{0} - Temperature = {1:.2f} C ({2:.2f} F)'.format( ts, temp_c, temp_f )
    logData( 1, temp_c )
    read_dht22(22)
    time.sleep(30)
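The script assumes that /var/sqlite/sensor-data.sqlite and its tables already exist. The actual schema is not shown in this post, but a minimal sketch matching the columns used by the insert in logData and by sensor-data.php below would look something like this:
# One-time setup sketch: the assumed schema, inferred from the columns
# referenced in logData and in sensor-data.php (the real DDL may differ)
import sqlite3 as sql

con = sql.connect( '/var/sqlite/sensor-data.sqlite' )
cur = con.cursor()
cur.execute( """create table if not exists sensors (
                    sensor_id       integer primary key,
                    sensor_type     text,
                    sensor_name     text,
                    sensor_location text )""" )
cur.execute( """create table if not exists sensor_data (
                    timestamp text,
                    sensor_id integer,
                    value     real )""" )
con.commit()
con.close()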
Updates to sensor-data.php
Since we have moved the logging of the data into the Python script, the PHP script only has to take care of dumping the requested data in CSV format.
<?php
/* ============================================================================ */
/* SQLite tables */
/* ============================================================================ */
define( 'NEWLINE', "\n");

/* ============================================================================ */
/* Open SQLite connection */
/* ============================================================================ */
$db = new SQLite3( '/var/sqlite/sensor-data.sqlite' );

/* ============================================================================ */
/* Perform SQLite query with wait for unlocked state (default 6000ms) */
/* ============================================================================ */
function query_timeout( $query, $timeout = 6000 ) {
    global $db;
    $results = false;
    if ( $db->busyTimeout( $timeout ) ) {
        $results = $db->query( $query );
    }
    $db->busyTimeout( 0 );
    return $results;
}

/* ============================================================================ */
/* List sensor data */
/* ============================================================================ */
function process_csv_log_data( $sensor, $period ) {
    $results = query_timeout( "select s.sensor_id
                                    , s.sensor_type
                                    , s.sensor_name
                                    , s.sensor_location
                                    , d.timestamp
                                    , d.value
                                 from sensors s
                                    , sensor_data d
                                where s.sensor_id = ${sensor}
                                  and s.sensor_id = d.sensor_id
                                  and d.timestamp > datetime('now', '-${period} hours', 'localtime')
                                order by d.timestamp" );
    header("Content-type: text/csv");
    echo 'timestamp,temperature'.NEWLINE;
    while ( $row = $results->fetchArray( SQLITE3_ASSOC ) ) {
        echo $row['timestamp'];
        echo ',';
        echo $row['value'].NEWLINE;
    }
}

/* ============================================================================ */
/* Main process */
/* ============================================================================ */
$command = isset( $_GET['action'] )? $_GET['action'] : 'UNKNOWN_COMMAND';
switch( $command )
{
    case 'csv_data':
        $id = (int)$_GET['id'];
        $period = (int)$_GET['period'];
        process_csv_log_data( $id, $period );
        break;
}

/* ============================================================================ */
/* Close SQLite connection */
/* ============================================================================ */
$db->close();
?>
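The endpoint can be tested without the chart page, for example with a few lines of Python. The hostname, sensor id, and period below are placeholders; adjust them to your own setup:
# Fetch the last 24 hours of data for sensor 1 as CSV and print it
import urllib2 as url

response = url.urlopen( 'http://raspberrypi/sensor-data.php?action=csv_data&id=1&period=24' )
print response.read()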
Updates to chart-sensor.js
We need to make one little change to the JavaScript. On the Raspberry Pi the files are located directly in the DocumentRoot of the web server, so we need to remove /raspberrypi from the URL used to get the data from the SQLite database:
d3.csv("/sensor-data.php?action=csv_data&id="+sensor+"&period="+period, function (error, data) {
data.forEach(function (d) {
d.timestamp = parseDate(d.timestamp);
d.temperature = +d.temperature;
});
The full version of the script:
showChart = function( area, sensor, period, unit ) {
    var margin = {top: 20, right: 50, bottom: 30, left: 50 },
        width = 700 - margin.left - margin.right,
        height = 250 - margin.top - margin.bottom;

    var parseDate = d3.time.format("%Y-%m-%d %X").parse;

    var x = d3.time.scale()
        .range([0, width]);

    var y = d3.scale.linear()
        .range([height, 0]);
    y.ticks(0.1);

    var xAxis = d3.svg.axis()
        .scale(x)
        .tickFormat( function(d) { return d3.time.format('%H:%M')(d); } )
        .orient("bottom");

    var yAxis = d3.svg.axis()
        .scale(y)
        .tickSize(-width,0,0)
        .tickFormat( d3.format(',.1f') )
        .orient("left");

    var yAxis2 = d3.svg.axis()
        .scale(y)
        .tickFormat( d3.format(',.1f') )
        .orient("right");

    var line = d3.svg.line()
        .interpolate("basis")
        .x(function (d) { return x(d.timestamp); })
        .y(function (d) { return y(d.temperature); });

    var svg = d3.select( area ).append("svg")
        .attr("width", width + margin.left + margin.right)
        .attr("height", height + margin.top + margin.bottom)
        .append("g")
        .attr("transform", "translate(" + margin.left + "," + margin.top + ")");

    d3.csv("/sensor-data.php?action=csv_data&id="+sensor+"&period="+period, function (error, data) {
        data.forEach(function (d) {
            d.timestamp = parseDate(d.timestamp);
            d.temperature = +d.temperature;
        });

        x.domain(d3.extent(data, function (d) {return d.timestamp; }));
        y.domain( [d3.min(data, function (d) {return d.temperature; })-0.1,d3.max(data, function (d) {return d.temperature; })+0.1] ).nice();

        svg.append("g")
            .attr("class", "x axis")
            .attr("transform", "translate(0," + height + ")")
            .call(xAxis);

        svg.append("g")
            .attr("class", "y axis")
            .call(yAxis)
            .append("text")
            .attr("transform", "rotate(-90) translate(0,-50)")
            .attr("y", 6)
            .attr("dy", ".71em")
            .style("text-anchor", "end")
            .text(unit);

        svg.append("g")
            .attr("class", "y2 axis")
            .attr("transform", "translate(" + width + " ,0)")
            .call(yAxis2)
            .append("text")
            .attr("transform", "rotate(90) translate(0, -50)")
            .attr("y", 6)
            .attr("dy", ".71em")
            .style("text-anchor", "start")
            .text(unit);

        svg.append("path")
            .datum(data)
            .attr("class", "line")
            .attr("d", line);
    });
};
Final Steps
To make the charting part of the project available to the Lighttpd server, the following files need to be copied to /var/www:
- chart-sensor.css
- chart-sensor.js
- chart-sensor.html
- sensor-data.php
- favicon.ico
Since I don't want to lose the data already collected over the last couple of days, I copied the file sensor-data.sqlite from the original server to the Raspberry Pi.
Now it is just a matter of killing the current process. First, get the process id of the collection script:
pi@raspberrypi ~/raspberrypi $ ps aux | grep ds18b20
root 8641 0.0 0.3 5116 1608 pts/0 S 12:50 0:00 sudo ./ds18b20.py
root 8642 0.2 1.4 11904 6464 pts/0 S 12:50 0:01 python ./ds18b20.py
pi 8672 0.0 0.1 3540 812 pts/0 S+ 12:58 0:00 grep --color=auto ds
Next, kill the process that is running the Python script, which is process id 8642:
sudo kill -9 8642
and restart it in the background:
sudo nohup ./ds18b20.py &
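To confirm that the restarted script is logging again, the most recent rows can be read back from the database (a quick sketch reusing the table and column names from the insert in logData):
# Print the five most recently logged rows
import sqlite3 as sql

con = sql.connect( '/var/sqlite/sensor-data.sqlite' )
cur = con.cursor()
cur.execute( "select timestamp, sensor_id, value from sensor_data order by timestamp desc limit 5" )
for row in cur.fetchall():
    print row
con.close()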
Checking performance
Using the top command you can get information about the running processes on the Raspberry Pi.
When we are only collecting data from the sensors, the Python process does not even appear in the list, meaning it is hardly consuming any CPU or memory at all:
top - 20:40:46 up 20:33, 2 users, load average: 0.51, 0.22, 0.12
Tasks: 92 total, 1 running, 91 sleeping, 0 stopped, 0 zombie
%Cpu(s): 2.0 us, 1.6 sy, 0.0 ni, 96.4 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
KiB Mem: 448776 total, 299020 used, 149756 free, 47392 buffers
KiB Swap: 102396 total, 0 used, 102396 free, 182384 cached
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
10464 pi 20 0 4672 1420 1024 R 1.3 0.3 0:00.46 top
10065 pi 20 0 9884 1660 1008 S 0.7 0.4 0:04.75 sshd
35 root 20 0 0 0 0 S 0.3 0.0 1:16.33 mmcqd/0
1627 root 20 0 1744 500 408 S 0.3 0.1 0:36.00 ifplugd
2357 pi 20 0 81988 8972 6640 S 0.3 2.0 2:08.33 lxpanel
2381 pi 20 0 6452 2348 1984 S 0.3 0.5 0:06.19 menu-cached
2494 pi 20 0 19144 2300 1888 S 0.3 0.5 0:11.50 gvfs-afc-volume
2522 root 20 0 0 0 0 S 0.3 0.0 1:56.86 w1_bus_master1
10436 root 20 0 0 0 0 S 0.3 0.0 0:00.29 kworker/0:2
1 root 20 0 2144 728 620 S 0.0 0.2 0:04.05 init
Accessing the charts
When the charts are being requested from a web browser, the picture changes:
top - 02:12:03 up 1 day, 2:05, 2 users, load average: 0.72, 0.36, 0.19
Tasks: 92 total, 2 running, 90 sleeping, 0 stopped, 0 zombie
%Cpu(s): 95.8 us, 3.9 sy, 0.0 ni, 0.0 id, 0.0 wa, 0.0 hi, 0.3 si, 0.0 st
KiB Mem: 448776 total, 374504 used, 74272 free, 52260 buffers
KiB Swap: 102396 total, 0 used, 102396 free, 248568 cached
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
8238 www-data 20 0 20984 4624 1892 R 33.8 1.0 0:57.94 php-cgi
8237 www-data 20 0 20204 4452 2360 S 31.5 1.0 1:07.59 php-cgi
8236 www-data 20 0 20188 3876 1892 S 31.2 0.9 0:51.13 php-cgi
11816 pi 20 0 4672 1424 1024 R 1.3 0.3 0:00.88 top
35 root 20 0 0 0 0 S 0.3 0.0 2:17.80 mmcqd/0
1691 root 20 0 1744 528 436 S 0.3 0.1 1:03.18 ifplugd
1737 root 20 0 0 0 0 S 0.3 0.0 1:27.30 RTW_CMD_THREAD
2055 pi 20 0 23256 12m 2184 S 0.3 2.8 1:53.15 Xtightvnc
2357 pi 20 0 82072 9288 6884 S 0.3 2.1 2:43.49 lxpanel
11680 pi 20 0 9884 1644 1004 S 0.3 0.4 0:02.01 sshd
We can see three php-cgi processes (running as user www-data) in the list, each consuming around 33% CPU and 1% memory. This means that while the Raspberry Pi is sending the data to a web browser to display the charts, CPU usage sits between 90% and 100%, leaving little headroom for anything else. For now this should be fine, since the web module can only be accessed from within the local network, but it is something that needs to be looked at.
Continue reading in Running it all on the Raspberry Pi (Part 3)