I wanted to build a wireless Arduino temperature logger. Well, I did. Except it was a Raspberry Pi and it logged to Google Drive. It was awesome. BUT! I lost the code. No worries, though. Here are my notes. I will clean this up soon...
I bought a Raspberry Pi and a DHT11 temperature/humidity sensor. Hooked it up and ran it. I used software/instructions from:
The sensor is accurate to ±2°C. It takes about 2 seconds to get a reading and it isn’t as accurate as the DHT22, but it was what was in stock at Gateway, so I used it. I am logging data to a Google spreadsheet via this code:
import sys
import gspread
import Adafruit_DHT
import time
import datetime
gc = gspread.login('daisy.wood@gmail.com', 'password')
sh = gc.open_by_url('https://docs.google.com/spreadsheets/d/1RIvyUzV--PWNfihjMS4OZQD9JKXuPZHTgXKVSc5GCnE/edit?usp=sharing')
worksheet = sh.get_worksheet(0)
sensor = Adafruit_DHT.DHT11
pin = 4
for x in range(2, 2000):
    humidity, temperature = Adafruit_DHT.read_retry(sensor, pin)
    if humidity is not None and temperature is not None:
        print 'Temp={0:0.1f}*C Humidity={1:0.1f}%'.format(temperature, humidity)
        worksheet.update_cell(x, 1, datetime.datetime.now().strftime("%Y-%m-%d"))
        worksheet.update_cell(x, 2, datetime.datetime.now())
        worksheet.update_cell(x, 3, temperature)
        worksheet.update_cell(x, 4, humidity)
    else:
        print 'Failed to get reading. Try again!'
    time.sleep(60)
LOL, I need to get code formatting figured out on this site..
Here is the Arduino library and example code: http://learn.adafruit.com/dht
BUT, I probably want to do what this guy is doing: http://blog.the-jedi.co.uk/tag/nrf24l01/
In fact, that’s the direction I want to go in. He uses the same temp sensor, which I guess is fine. But he uses an nRF24L01+ and an Arduino, so that’s the next step I suppose.
This is good, he uses protobufs: http://theboredengineers.com/2014/01/piweather-how-to-communicate-wirelessly-between-an-arduino-and-a-raspberry-pi/
As for WIRED communication between an Arduino and a Raspberry Pi: http://arduino.stackexchange.com/questions/1628/arduino-to-raspberry-pi-wired-communication
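If I go the wired route, the Pi side could be as simple as reading lines over USB serial with pyserial. Just a sketch of the idea, not code from the project; the port name and the Arduino's output format are assumptions:

import serial  # pyserial
import time

# Assumes the Arduino prints one "temperature,humidity" line per reading,
# e.g. Serial.println("23.5,41.0"), and shows up as /dev/ttyACM0 on the Pi.
port = serial.Serial('/dev/ttyACM0', 9600, timeout=5)
time.sleep(2)  # the Arduino resets when the port opens; give it a moment

while True:
    line = port.readline().decode('ascii', 'ignore').strip()
    if not line:
        continue  # read timed out with no data
    try:
        temperature, humidity = [float(v) for v in line.split(',')]
    except ValueError:
        continue  # partial or garbled line
    print('Temp={0:0.1f}*C Humidity={1:0.1f}%'.format(temperature, humidity))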
thank you for playing.
A month ago a friend showed me a toy he was building. Basically, he was trying to do Angular without having to install all of Angular. (This was just after Angular announced 2.0, which he said "is going to be... weird.") He built it using microframeworks: that is, tiny JavaScript libraries and frameworks that do one thing really well in a small footprint. He told me to go look at microjs.com, which is a catalog of these types of libraries. I did and I liked it.
I took a look at his code again today and thought I might try using one of these libraries to DIY routing in a simple app. I wanted to build a single page app that acted as a questionnaire/wizard. Almost a Choose-Your-Own-Adventure type application that, when finished going through the questions, provides some sort of insight or guidance to the user. My friend used RLite so I thought I'd do the same.
The idea is to store the questions as an object (so we can easily create a service that returns these questionnaires as JSON) and have the app guide the user through them. Start with a simple index.html:
<!doctype html>
<html>
<head>
    <link rel="stylesheet" href="/css/apptest.css">
    <title>JS Webapp Playground</title>
</head>
<body bc-cloak>
    <header>
        <nav>
            <ul>
                <li><a href="#/">Home</a></li>
                <li><a href="#/first_visit_type">first_visit_type</a></li>
            </ul>
        </nav>
    </header>

    <main id="main">Loading...</main>

    <footer>
        <p>Special thanks to <a href="https://github.com/bchociej/apptest/">Ben</a></p>
    </footer>

    <script src="/js/lib/lodash.min.js"></script>
    <script src="/js/rlite.js"></script>
    <script src="/js/apptest.js"></script>
</body>
</html>
Notice that all it does is set up the structure of the page and include rlite, lodash, and a custom JS file called apptest.js. apptest.js simply sets up the rlite router based on the questions object, which is called lower_back:
; (function () {
    "use strict";

    var question = _.template('<b>name:</b> <%= name %> <b>type:</b> <%= type %> <b>exits:</b> <% _.forEach(exits, function(exit) { %><a href="#/<%- exit %>"><%- exit %></a><% }); %>');

    var lower_back = {
        _id: 'lower_back',
        type: 'doineedthis',
        title: 'lower back',
        description: 'total description here',
        reference: 'http://www.guideline.gov/content.aspx?id=47586&search=odg',
        questions: [
            {
                name: 'first_visit_type',
                type: 'router',
                description: 'select specialist type and initial result',
                specialist_type: ['MD/DO', 'ORTHO', 'CHIRO', 'PAIN'],
                exits: ['first_without_radioplathy', 'first_with_radioplathy']
            },
            {
                name: 'first_without_radioplathy',
                type: 'router',
                description: 'first without',
                specialist_type: ['MD/DO', 'ORTHO', 'CHIRO', 'PAIN'],
                exits: ['end']
            },
            {
                name: 'first_with_radioplathy',
                type: 'router',
                description: 'first with',
                specialist_type: ['MD/DO', 'ORTHO', 'CHIRO', 'PAIN'],
                exits: ['end']
            },
            {
                name: 'end',
                type: 'exit',
                description: 'Thank you',
                exits: []
            }
        ]
    };

    var main = document.getElementById('main');
    var views = {};

    var decloak = function decloak() {
        document.body.removeAttribute('bc-cloak');
        decloak = function decloak() { };
    };

    var loadQuestion = function loadQuestion(n) {
        return function () {
            main.innerHTML = question(n);
            decloak();
        };
    };

    (function (r) {
        _.forEach(lower_back.questions, function (n) {
            r.add(n.name, loadQuestion(n));
        });

        var update = function update() {
            var hash = location.hash || '#';
            r.run(hash.substr(1));
        };

        window.addEventListener('hashchange', update);
        update();
    })(new Rlite());
})();
The router replaces the contents of the #main div with HTML created by the lodash template, using lower_back for data. The "exits" array tells the template which links to create at the end of each question (that way your questions can branch based on user input).
The code doesn't do much, but it taught me that microframeworks give you a nice, low-level approach to building a JavaScript app. This functionality is exactly what I wanted and it only took a few lines of code. For simple projects like this, it might not make sense to lug Angular or Backbone along for the ride. It just makes more sense to use HTML5 and modern browsers to get the job done.
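One more thought: the questionnaire is hard-coded in apptest.js for now. The "service that returns these questionnaires as JSON" mentioned above doesn't exist yet, but it could be tiny. A sketch using Flask; the route and file layout are made up:

# questionnaire_service.py - hypothetical sketch, not part of apptest yet.
# Serves the same lower_back object as JSON so the front end can fetch it.
import json
from flask import Flask, jsonify

app = Flask(__name__)

# In practice this would come from a file or database; here it's a JSON file
# that mirrors the lower_back object in apptest.js.
with open('questionnaires/lower_back.json') as f:
    LOWER_BACK = json.load(f)

@app.route('/questionnaires/<questionnaire_id>')
def get_questionnaire(questionnaire_id):
    if questionnaire_id != LOWER_BACK['_id']:
        return jsonify({'error': 'not found'}), 404
    return jsonify(LOWER_BACK)

if __name__ == '__main__':
    app.run(port=5000)

The front end would then fetch the object with an XMLHttpRequest instead of embedding it.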
The reason for this is not to map out an environment for a legit company, but more to keep a small, budget DIY setup organized and complete. I want to utilize my home network and figure out the most minimalist way to host services that won't just die out. I want a few things just so I have them:
- postgres
- elasticsearch
- jenkins
- automatic backups
- a way to not worry about a failure
- monitoring
Generally, I want to host a few things on a machine in a datacenter but be able to automatically fail over to my house if that server barfs.
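One way to handle that fail-over: a small health check running at home that flips the DNS record in Route 53 (route53 for DNS is on the list below anyway) to my home IP when the datacenter box stops answering. Only a sketch of the idea; the hosted zone ID, record name, and IPs are placeholders, and Route 53's built-in health checks plus failover routing might make a script like this unnecessary:

# failover_check.py - hypothetical sketch, assumes boto3 and Route 53.
import boto3
import requests

ZONE_ID = 'ZXXXXXXXXXXXXX'        # placeholder hosted zone ID
RECORD = 'ops.fiftytwo.net.'      # placeholder record name
HOME_IP = '203.0.113.10'          # placeholder home IP

def datacenter_is_up():
    try:
        return requests.get('https://ops.fiftytwo.net/health', timeout=10).status_code == 200
    except requests.RequestException:
        return False

def point_dns_at(ip):
    boto3.client('route53').change_resource_record_sets(
        HostedZoneId=ZONE_ID,
        ChangeBatch={'Changes': [{
            'Action': 'UPSERT',
            'ResourceRecordSet': {
                'Name': RECORD,
                'Type': 'A',
                'TTL': 60,
                'ResourceRecords': [{'Value': ip}],
            },
        }]})

if __name__ == '__main__':
    if not datacenter_is_up():
        point_dns_at(HOME_IP)  # fail over to the house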
For each service, track: how it is monitored, where it is backed up, how to manage security credentials.
email - get the fiftytwo.net email working via Google Apps
Install VirtualBox. Make a big Ubuntu 14.04 box and install:
salt:
https://www.digitalocean.com/community/tutorials/how-to-install-salt-on-ubuntu-12-04
https://www.digitalocean.com/community/tutorials/how-to-create-your-first-salt-formula
http://docs.saltstack.com/en/latest/topics/tutorials/quickstart.html
how to add formulas - http://docs.saltstack.com/en/latest/topics/development/conventions/formulas.html
cd /srv/formulas
git clone https://github.com/saltstack-formulas/jenkins-formula.git
vi /etc/salt/master
service salt-master restart
What's done? git, nginx, jenkins, es, logstash, kibana, nodejs, pm2, nagios(?), ipython. Jenkins can pull from GitHub and my private git repo (I added the jenkins and jordan public keys to the git user). Jenkins can build and test a node project, but to get angularNonsense to really test, read and implement this: http://karma-runner.github.io/0.12/plus/jenkins.html
The full salt config is in my personal git server.
Now I need to install aws-cli, put keys on the server, and write a cron job to automate backups to S3. Things to back up:
- /home/git/repos (use this gist to figure out how to create the date and whatnot - https://gist.github.com/philippb/1988919)
- /usr/lib/jenkins (install scm-sync-configuration and configure it to sync to my git repo)
- elasticsearch - http://www.elasticsearch.org/guide/en/elasticsearch/guide/current/backing-up-your-cluster.html and http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/modules-snapshots.html and https://github.com/elasticsearch/elasticsearch-cloud-aws#s3-repository
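The cron job will probably just call aws-cli, but for reference, here's the nightly /home/git/repos backup sketched in Python with boto3. The bucket matches the fiftytwo-git bucket listed further down; the paths and key layout are made up:

# backup_git_repos.py - sketch of the nightly git repo backup to S3.
# Assumes AWS credentials are already configured on the box.
import datetime
import tarfile
import boto3

BUCKET = 'fiftytwo-git'     # bucket from the s3 list below
SOURCE = '/home/git/repos'
stamp = datetime.datetime.now().strftime('%Y-%m-%d')
archive = '/tmp/git-repos-%s.tar.gz' % stamp

# Tar up the repos directory, then push the dated archive to S3.
with tarfile.open(archive, 'w:gz') as tar:
    tar.add(SOURCE, arcname='repos')

boto3.client('s3').upload_file(archive, BUCKET, 'git/%s.tar.gz' % stamp)
print('uploaded %s' % archive)

Drop it in /etc/cron.daily (or a crontab entry) and use the gist above for the date handling and rotating old archives.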
wget -O - https://bootstrap.saltstack.com | sudo sh
vi /etc/salt/minion    (add file_client: local)
vi /srv/salt/top.sls:

base:
  '*':
    - git
    - nginx
    - jenkins
    - elasticsearch
    - logstash
    - kibana
    - nodejs
    - pm2
    - nagios
    - ipython
    - postgres
Add a state tree for each state above: vi /srv/salt/git.sls

git:              # ID declaration
  pkg:            # state declaration
    - installed   # function declaration
run salt: salt-call --local state.highstate -l debug
git - host my own git server, gitolite?
  - have nagios check the url to make sure it responds (sketch below)
  - s3cmd sync the project directory and config directory every night
  - ssh access only
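That nagios url check could just be the stock check_http plugin, but a custom check is only a script that follows Nagios exit codes (0 = OK, 1 = WARNING, 2 = CRITICAL). A sketch; the URL is a placeholder:

# check_git_http.py - minimal Nagios-style check; the URL is a placeholder.
import sys
import requests

URL = 'https://git.fiftytwo.net/'

try:
    code = requests.get(URL, timeout=10).status_code
except requests.RequestException as e:
    print('CRITICAL - %s unreachable: %s' % (URL, e))
    sys.exit(2)

if code == 200:
    print('OK - %s returned 200' % URL)
    sys.exit(0)
print('WARNING - %s returned %d' % (URL, code))
sys.exit(1)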
nginx - for everything
  - nagios
  - commit config to git
  - log everything to logstash

jenkins (build/deploy/multi environment) - requires a JDK to run; figure out the best place for JENKINS_HOME
  - nagios
  - commit config to git, stored in JENKINS_HOME
  - ssl via fiftytwo.net, password protected behind nginx

elasticsearch
  - nagios
  - commit {path.home}/config to git, s3cmd sync {path.home}/data and {path.home}/plugins
  - ssl via fiftytwo.net, add basic-auth and authorization per this article - http://www.elasticsearch.org/blog/playing-http-tricks-nginx/
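If I go with the snapshot approach from the elasticsearch links above (via the elasticsearch-cloud-aws plugin) instead of s3cmd-ing the data directory, registering the S3 repository and taking a snapshot is just two HTTP calls. A sketch; the repo name and region are assumptions, and the bucket matches the fiftytwo-elasticsearch bucket listed below:

# es_snapshot.py - sketch of S3 snapshots via the cloud-aws plugin.
import datetime
import requests

ES = 'http://localhost:9200'

# Register (or update) an S3 snapshot repository.
requests.put(ES + '/_snapshot/s3_backup', json={
    'type': 's3',
    'settings': {'bucket': 'fiftytwo-elasticsearch', 'region': 'us-east-1'},
}).raise_for_status()

# Take a snapshot named after today's date and wait for it to finish.
name = 'snapshot-' + datetime.date.today().isoformat()
requests.put(ES + '/_snapshot/s3_backup/' + name,
             params={'wait_for_completion': 'true'}).raise_for_status()
print('snapshot %s done' % name)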
logstash - https://blog.devita.co/2014/09/04/monitoring-pfsense-firewall-logs-with-elk-logstash-kibana-elasticsearch/
  - nagios
  - commit config to git - https://www.digitalocean.com/community/tutorials/how-to-use-logstash-and-kibana-to-centralize-and-visualize-logs-on-ubuntu-14-04
  - behind nginx basic auth

kibana
  - nagios
  - commit config to git
  - behind nginx basic auth

nagios ?
  - commit config to git
  - behind nginx

ipython notebook
  - nagios
  - commit my environment setup to git
  - behind nginx basic auth

python codebase
  - commit everything to my git

pm2 - for running node apps
  - nagios?
  - log to logstash - https://blog.devita.co/2014/09/04/monitoring-pfsense-firewall-logs-with-elk-logstash-kibana-elasticsearch/
  - s3cmd the startup script every night (lib/scripts/pm2-init.sh via https://github.com/Unitech/PM2/blob/master/ADVANCED_README.md#startup-script)

postgres
  - nagios
  - https://wiki.postgresql.org/wiki/Automated_Backup_on_Linux, then s3cmd to copy to s3
  - commit pg_backup.config to git
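The postgres wiki script linked above is shell; the same nightly idea in Python (pg_dump, then push the dated dump to S3) would look roughly like this. The database name is a placeholder and the bucket matches fiftytwo-postgres below:

# backup_postgres.py - sketch of a nightly pg_dump + S3 upload.
# Needs to run as a user that is allowed to pg_dump the database.
import datetime
import subprocess
import boto3

DB = 'mydb'                   # placeholder database name
BUCKET = 'fiftytwo-postgres'  # bucket from the s3 list below
stamp = datetime.datetime.now().strftime('%Y-%m-%d')
dump_path = '/tmp/%s-%s.dump' % (DB, stamp)

# Custom-format dump (-Fc), which pg_dump compresses by default.
with open(dump_path, 'wb') as out:
    subprocess.check_call(['pg_dump', '-Fc', DB], stdout=out)

boto3.client('s3').upload_file(dump_path, BUCKET, 'postgres/%s-%s.dump' % (DB, stamp))
print('uploaded %s' % dump_path)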
s3 buckets:
  - fiftytwo-git
  - fiftytwo-postgres
  - fiftytwo-elasticsearch
  - fiftytwo-securitycameras

security cameras (running on the host) - set them up to:
  - nagios
  - send logs to logstash
  - s3cmd sync mpgs to s3

mysql
  - nagios
  - log to logstash
  - auto backup to s3!
automatic updates
apt-get install fail2ban
apt-get install unattended-upgrades
vim /etc/apt/apt.conf.d/10periodic
APT::Periodic::Update-Package-Lists "1";
APT::Periodic::Download-Upgradeable-Packages "1";
APT::Periodic::AutocleanInterval "7";
APT::Periodic::Unattended-Upgrade "1";
vim /etc/apt/apt.conf.d/50unattended-upgrades
Unattended-Upgrade::Allowed-Origins {
"Ubuntu lucid-security";
// "Ubuntu lucid-updates";
};
apt-get install logwatch
vim /etc/cron.daily/00logwatch
/usr/sbin/logwatch --output mail --mailto test@gmail.com --detail high
ops domain is fiftytwo.net - get a wildcard SSL cert.
route53 for DNS.
encrypted, offsite backups
fileserver.
dev and production environments.
Claps is an unreleased hand-clap generator that uses random normals to drift hand claps in time and give them a natural sound. The app lets you configure the distribution, BPM, and number of actors (clappers), plus the pan and volume for each clapper. Of course, volume is not a single value but a distribution.
If you want to generate applause, simply set the number of actors to a distribution instead of a constant.
Claps is a command line app that generates an MP3 stream as output.
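Since Claps isn't released, here is a rough sketch of the core idea rather than the real thing: each clapper claps near each beat, with the timing drift and volume drawn from normal distributions. MP3 encoding is left out; this only produces the clap events:

# claps_sketch.py - toy version of the "random normals drift the claps" idea.
import random

BPM = 100
BEATS = 8
N_CLAPPERS = 5
TIMING_SD = 0.02             # seconds of drift around each beat
VOLUME_MEAN, VOLUME_SD = 0.8, 0.1

beat_len = 60.0 / BPM
events = []
for clapper in range(N_CLAPPERS):
    pan = random.uniform(-1.0, 1.0)  # fixed pan per clapper
    for beat in range(BEATS):
        t = beat * beat_len + random.gauss(0, TIMING_SD)  # drifted clap time
        vol = min(1.0, max(0.0, random.gauss(VOLUME_MEAN, VOLUME_SD)))
        events.append((t, clapper, vol, pan))

for t, clapper, vol, pan in sorted(events):
    print('t=%.3fs clapper=%d vol=%.2f pan=%+.2f' % (t, clapper, vol, pan))

Applause is then just one more draw: pick N_CLAPPERS itself from a distribution instead of using a constant.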
I am writing a web-based clap generator. It's on GitHub in my angularNonsense app.