If you read my previous post, you saw that I was using PowerShell to write data to a CSV file locally on my PC. That was not a great solution because my PC had to be running at all times in order to collect the data. So I decided to use my Raspberry Pi as a server to collect the data and push it to Google Docs. Here is what I came up with:
#!/usr/bin/python
import time
import gdata.spreadsheet.service
import urllib2
import json

########## Make changes in this section
# The URL for your Spark Core
URL = 'xxxxxxx'
# Your Gmail address, used to log in to Google Docs
email = 'xxxxxx'
# Your Gmail password
password = 'xxxxxxxx'
# Find this value in the spreadsheet's URL as 'key=XXX' and copy XXX below
spreadsheet_key = 'xxxxxxx'
# All spreadsheets have worksheets. Worksheet #1 by default always
# has a value of 'od6'.
worksheet_id = 'od6'
########## Done with the changes

while True:
    # The Core returns JSON whose 'result' field is itself a JSON
    # string, so it has to be decoded twice.
    response = urllib2.urlopen(URL)
    response = response.read()
    response = json.loads(response)
    result = json.loads(response["result"])
    Humidity = result["Humidity"]
    degF = result["degF"]
    degC = result["degC"]
    DewPoint = result["DewPoint"]
    DewPointSlow = result["DewPointSlow"]

    spr_client = gdata.spreadsheet.service.SpreadsheetsService()
    spr_client.email = email
    spr_client.password = password
    spr_client.source = 'Python running on pi'
    spr_client.ProgrammaticLogin()

    # Prepare the dictionary to write
    row = {}
    row['date'] = time.strftime('%m/%d/%Y')
    row['time'] = time.strftime('%H:%M:%S')
    row['humidity'] = str(Humidity)
    row['degf'] = str(degF)
    row['degc'] = str(degC)
    row['dewpoint'] = str(DewPoint)
    row['dewpointslow'] = str(DewPointSlow)

    entry = spr_client.InsertRow(row, spreadsheet_key, worksheet_id)
    if isinstance(entry, gdata.spreadsheet.SpreadsheetsList):
        print "Insert row succeeded."
    else:
        print "Insert row failed."

    time.sleep(300)  # delay for 300 seconds (5 minutes)
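In case the two json.loads calls look odd: the Core's cloud endpoint wraps the sensor payload in an envelope whose "result" field is itself a JSON string, so it gets decoded twice. Here is a minimal illustration of that double decode; the response below is made up for illustration, not a real reading:

import json

# Hypothetical shape of the Core's response (values are placeholders)
raw = ('{"cmd": "VarReturn", "name": "data", '
       '"result": "{\\"Humidity\\": 41.2, \\"degF\\": 71.6}"}')

outer = json.loads(raw)               # first decode: the cloud envelope
inner = json.loads(outer['result'])   # second decode: the sensor payload
print inner['degF']                   # -> 71.6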
You need to install the gdata Python library. Here is how to do that (thanks to http://www.mattcutts.com/blog/write-google-spreadsheet-from-python/):
mkdir ~/gdata
(download the latest Google data Python library into the ~/gdata directory)
unzip gdata.py-1.2.4.zip (or whatever version you downloaded)
cd gdata.py-1.2.4
sudo python setup.py install
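To sanity-check the install before running the logger, a quick test like this should work (it uses the same gdata 1.x calls as the script above; the email and password are placeholders):

#!/usr/bin/python
# Quick check that gdata imports and can log in.
import gdata.spreadsheet.service

client = gdata.spreadsheet.service.SpreadsheetsService()
client.email = 'you@gmail.com'    # placeholder
client.password = 'xxxxxxxx'      # placeholder
client.source = 'gdata install test'
client.ProgrammaticLogin()        # raises an exception on failure
print "gdata is installed and login worked."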
This script just loops forever and sends an update to the spreadsheet every 5 minutes (which might be more often than necessary).
I might get fancy and make this a service, but for now you can just run it in the background with &.
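For example, assuming the script is saved as spark_logger.py (a name I made up):

nohup python spark_logger.py &

nohup keeps it running after you log out of the Pi.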
This project is still not done for me. Next I will be working to reduce power consumption by using the Core's sleep mode, Spark.sleep(), which puts the Core in a low-power state for a set time. In order to do this I need to change the Core script to push data instead of serving it all the time. I will use Spark.publish() to publish an event whenever it takes a reading; then the script can pick up that event whenever it gets around to it.
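As a rough idea of what the pull side might look like, here is a sketch that reads the Spark Cloud's server-sent-events stream. The /v1/events endpoint is my reading of the Spark Cloud docs, and the event name 'readings' and the access token are placeholders, not anything from my actual setup:

#!/usr/bin/python
# Sketch only: listen for events the Core publishes with Spark.publish().
import urllib2
import json

EVENT_URL = ('https://api.spark.io/v1/events/readings'
             '?access_token=xxxxxxxx')  # placeholder event name and token

stream = urllib2.urlopen(EVENT_URL)
for line in stream:
    line = line.strip()
    # Event payloads arrive as lines of the form: data: {...}
    if line.startswith('data:'):
        event = json.loads(line[len('data:'):])
        print event['data']  # whatever string the Core published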