Logging router traffic and other changes

It’s been a slow couple of months between the holidays, traveling, and work, but I did manage to accomplish a few things with the Home Dashboard project.  I redesigned the UI to move away from an exclusively mobile interface, since the amount and variety of data I’m now including simply doesn’t make sense to squeeze into a mobile UI.  Sometime in early January, the system crossed the 2 million record milestone.  I’m not sure what I’ll do with some of the data I’m collecting at this point, but I’ve learned a lot by collecting it and I’m sure I’ll learn more analyzing it at some point in the future.
[Image: record count stats]

This brings the list of events I’m collecting to:

  1. Indoor temperature and humidity
  2. Amazon Echo music events
  3. DirecTV program and DVR information
  4. Cell phone location and status details
  5. Local fire and police emergency events
  6. Home lights and other Wink hub events
  7. …and now home network information

Analyzing home network information

The biggest change was the addition of network event logging.  After noticing that a foreign IP was accessing my LAN, I started logging each request to or from my home network until I was sure I had fixed the vulnerability.  After that, I found the information interesting, so I just kept logging it.  For example, I was able to discover that an app on my phone was making repeated calls (~2,000 per day) to an app monitoring service (New Relic), which wasn’t doing my phone’s battery life any favors.
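
The post doesn’t show the plumbing for this, but here’s a minimal sketch of the idea, assuming the router can forward iptables-style log lines over syslog to a box on the LAN (the log format, port, and destination field are assumptions that depend entirely on the router firmware):

```python
import re
import socket
from collections import Counter

# Listen for syslog packets forwarded by the router (standard syslog is UDP 514,
# which needs root; many routers can be pointed at a higher port instead).
LISTEN_ADDR = ("0.0.0.0", 5514)
# iptables-style lines look like "... SRC=192.168.1.23 DST=52.10.20.30 ..."
DST_PATTERN = re.compile(r"DST=(\d{1,3}(?:\.\d{1,3}){3})")

requests_by_destination = Counter()
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(LISTEN_ADDR)

while True:
    data, _ = sock.recvfrom(4096)
    match = DST_PATTERN.search(data.decode(errors="ignore"))
    if match:
        requests_by_destination[match.group(1)] += 1
        # In the real setup each record would be written to the database;
        # the in-memory counter is just to keep the sketch short.
```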

[Image: router request logs]

[Image: router requests by device]

Collecting additional phone data

Since launching Location to HTTP, I’ve been tinkering with additional data collection from my cellphone, such as Bluetooth, WiFi, GPS, battery, and screen status.  I’ve been collecting this information for a month or so; here are some useless data points from the most recent 30 days:

  • I’ve actively used my phone for 104.5 hours (~3.5 hours per day – I need to cut back on the work email…)
  • Average battery level: 61%
  • Average free memory: 516 MB (12.6%)
  • Average location accuracy: ±31 ft
  • Average altitude: 154 ft
  • Average speed: 1 MPH
  • Average uptime: 153.3 hours
  • Maximum uptime: 437.4 hours

[Image: phone stats]
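
For what it’s worth, these rollups are just aggregate queries over the stored samples.  A sketch of the kind of query involved, with made-up table and column names (the actual schema isn’t in the post) and assuming one sample per minute:

```python
import mysql.connector

conn = mysql.connector.connect(user="dashboard", password="...",
                               host="localhost", database="home")
cur = conn.cursor()
cur.execute("""
    SELECT SUM(screen_on) / 60        AS hours_actively_used,  -- one sample per minute
           AVG(battery_pct)           AS avg_battery,
           AVG(free_memory_mb)        AS avg_free_memory,
           AVG(location_accuracy_ft)  AS avg_accuracy,
           AVG(altitude_ft)           AS avg_altitude,
           AVG(speed_mph)             AS avg_speed
      FROM phone_status
     WHERE recorded_at >= NOW() - INTERVAL 30 DAY
""")
print(cur.fetchone())
```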

I also improved the location history mapping to show map markers in different colors depending on the age of the record, along with the phone details at the time the record was made:

[Image: location history map]
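
The coloring logic is just a bucketing of record age; something along these lines (the thresholds here are illustrative, not the ones actually used):

```python
from datetime import datetime, timezone

def marker_color(recorded_at: datetime) -> str:
    """Pick a map marker color based on how old the location record is."""
    age_hours = (datetime.now(timezone.utc) - recorded_at).total_seconds() / 3600
    if age_hours < 1:
        return "green"    # recent
    if age_hours < 24:
        return "yellow"   # within the last day
    return "red"          # older history
```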

Improved home climate visuals

I added some simple graphs of month-over-month temperature and humidity and also updated the heat maps for the daily climate information.  These are a bit easier to read than those I had in the previous UI.  It’s interesting to see the effectiveness of the thermostat automation and our daily routines.
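
As a rough sketch of how a month-over-month chart like that can be built (the table and column names are placeholders, not the actual schema):

```python
import pandas as pd
import matplotlib.pyplot as plt
from sqlalchemy import create_engine

engine = create_engine("mysql+mysqlconnector://dashboard:...@localhost/home")
df = pd.read_sql("SELECT recorded_at, temperature_f FROM climate", engine,
                 parse_dates=["recorded_at"])

df["month"] = df["recorded_at"].dt.to_period("M")
df["hour"] = df["recorded_at"].dt.hour

# Keep only the two most recent months and compare their hourly averages.
recent = df[df["month"].isin(df["month"].unique()[-2:])]
(recent.groupby(["month", "hour"])["temperature_f"]
       .mean()
       .unstack("month")
       .plot(title="Average temperature by hour, current vs. previous month"))
plt.show()
```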

More detailed emergency events

Lastly, I expanded on the emergency event information to surface the top event types and the total number of events by type:

[Image: emergency event summary]
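
The type summary is just a count over the stored events; roughly this (again, the table and column names are my placeholders):

```python
from collections import Counter
import mysql.connector

conn = mysql.connector.connect(user="dashboard", password="...",
                               host="localhost", database="home")
cur = conn.cursor()
cur.execute("SELECT event_type FROM emergency_events")

# Top ten event types by total count.
for event_type, total in Counter(row[0] for row in cur).most_common(10):
    print(f"{event_type}: {total}")
```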


Collecting and Handling 911 Event Data

Seattle has a pretty awesome approach to data availability and transparency through data.Seattle.gov.  The city has thousands of data sets available (from in-car police video records to land zoning to real-time emergency feeds), and Socrata, a Seattle-based company, has worked with the city (and many other cities) to let developers engage with this data however they like.  I spent some time playing around with some of the data sets and decided it’d be nice to know when police and fire events occurred near my apartment.

I set up a script to pull the fire and police calls for events occurring within 500 meters of my apartment and started storing them in a local database (Socrata makes it so simple – amazing work by that team).  As I read events from the API, I check the proximity of the event to my address as well as the type of event (burglary, suspicious person, traffic stop, etc.) and trigger emails for the ones I really want to know about (such as a nearby rape, burglary, shooting, vehicle theft, etc.).  I decided to store all events, even traffic stops, just because.  I may find a use for it later – who knows…
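
The pull itself is a small SODA API query.  Here’s a sketch of the shape of it, with a placeholder dataset id, placeholder coordinates, and guessed field names; Socrata’s within_circle filter does the 500-meter proximity check server-side:

```python
import requests

HOME_LAT, HOME_LON = 47.6097, -122.3331   # placeholder coordinates
RADIUS_METERS = 500
# Placeholder dataset id -- substitute the real-time fire/police 911 datasets on data.seattle.gov.
DATASET_URL = "https://data.seattle.gov/resource/xxxx-xxxx.json"
ALERT_TYPES = {"Burglary", "Shooting", "Vehicle Theft"}   # the ones worth an email

params = {
    "$where": f"within_circle(location, {HOME_LAT}, {HOME_LON}, {RADIUS_METERS})",
    "$order": "datetime DESC",
    "$limit": 100,
}
for event in requests.get(DATASET_URL, params=params, timeout=30).json():
    # store_event(event)            # hypothetical helper: insert into the local database
    if event.get("type") in ALERT_TYPES:
        pass                        # send_alert_email(event) -- hypothetical notification hook
```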

After I’ve scrubbed through and sent any notifications for events I care about, I display the data in a simple table in my existing home dashboard and highlight in red any rows for events that fall within a certain area around my apartment.
[Image: 911 events table]
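
The red highlighting is nothing more than a proximity check around the apartment; a minimal sketch using a rectangular bounding box with made-up thresholds:

```python
HOME_LAT, HOME_LON = 47.6097, -122.3331   # placeholder coordinates
LAT_DELTA, LON_DELTA = 0.002, 0.003       # roughly a few city blocks

def highlight_row(event_lat: float, event_lon: float) -> bool:
    """True when an event is close enough to the apartment to flag in red."""
    return (abs(event_lat - HOME_LAT) <= LAT_DELTA and
            abs(event_lon - HOME_LON) <= LON_DELTA)
```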

To add a nice visual, I also plot the most recent events on a map using the Google Maps API.  Police events are noted with blue pins and fire events with red pins:
[Image: 911 events map]
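
The dashboard map uses the interactive Google Maps JavaScript API; as a Python-side illustration of the same blue/red pin idea, here’s a sketch that builds a Google Static Maps URL instead (the event fields follow the same placeholder names as the earlier snippets):

```python
from urllib.parse import urlencode

def static_map_url(events, api_key):
    """Build a Static Maps URL with blue pins for police events and red pins for fire events."""
    police = "|".join(f"{e['lat']},{e['lon']}" for e in events if e["kind"] == "police")
    fire = "|".join(f"{e['lat']},{e['lon']}" for e in events if e["kind"] == "fire")
    params = [
        ("size", "640x400"),
        ("markers", f"color:blue|{police}"),
        ("markers", f"color:red|{fire}"),
        ("key", api_key),
    ]
    return "https://maps.googleapis.com/maps/api/staticmap?" + urlencode(params)
```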

Clicking the pins will give us some details about the event:
[Image: 911 event details]

All told, it was a pretty simple project which helped me gain some experience with the Google Maps API and also poke around with some of the data the city provides.  I’m sure I’ll be doing a bit more of that in the future.  These two projects have been integrated back into my home automation dashboard so I can continue to build on them in the future.

Data Visualization and Demo

As mentioned previously, my goal wasn’t to just create a home controller/dashboard but to also collect as much data as possible while doing so.  So tonight, I started playing around with a few different visualizations of the data I’ve collected thus far.  It took a few hours but I’m satisfied with the current state.

I’m doing a few things with the data:

  • simple dumps of the most recent music played by my Amazon Echo and the most recent programming watched via DirecTV;
  • visualizing the daily average, minimum, and maximum temperature and humidity levels in my apartment;
  • visualizing, by hour of day, the average, min, and max temperature for the current month vs. the previous month;
  • breaking down the amount of time I spend at home by day of week (and telling on myself that I like to leave work early on Fridays :));
  • and visualizing my TV-watching habits by hour of day and day of week.

I recorded a video of all of this and also included the DirecTV control demo at the end.

Using a Pi to measure TV habits

As noted here, I’m using the DirecTV SHEF API and a Raspberry Pi to poll my DirecTV receivers every minute and store what they’re doing into a MySQL database. After ~6 months of storing that data, I thought it’d be interesting to analyze some of it.  After all, do I really watch enough HBO and Starz to warrant paying for them every month?
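
For reference, the polling loop amounts to hitting the receiver’s /tv/getTuned endpoint once a minute and writing a row.  The receiver address, response field names, and table layout below are assumptions, so treat this as a sketch rather than the exact script:

```python
import time
import requests
import mysql.connector

RECEIVER = "http://192.168.1.50:8080"   # placeholder receiver address
conn = mysql.connector.connect(user="dashboard", password="...",
                               host="localhost", database="home")

while True:
    # Ask the receiver what's currently on screen (SHEF responds with JSON).
    tuned = requests.get(f"{RECEIVER}/tv/getTuned", timeout=5).json()
    cur = conn.cursor()
    cur.execute(
        "INSERT INTO tv_history (recorded_at, callsign, title, channel) "
        "VALUES (NOW(), %s, %s, %s)",
        (tuned.get("callsign"), tuned.get("title"), tuned.get("major")),
    )
    conn.commit()
    time.sleep(60)
```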

Channel Preferences

  • 25,900 minutes of TV watched (~2.25 hours per day…eek; still less than the national average!).
  • 2,612 minutes of that was recorded (10%).
  • NBC is our favorite channel (3,869 minutes watched, 15%).
  • E! is our second favorite channel (1,911 minutes watched, 7.37%). Gotta keep up with those Kardashians.
  • Premium movie channels (HBO, Starz, Encore, Cinemax, etc) were watched 6,870 minutes (26.52%) – apparently worth the money.
  • Premium movie channels were recorded only 571 minutes (lots of ad hoc movie watching, I guess).
  • NBC is our most recorded channel (479 minutes) followed by HGTV (391 minutes) and ABC (330).

Time Habits

  • Sunday is the most watched day (no surprise here) with 7,157 minutes watched (28%)
  • Saturday is the second with 5,385 (21%)
  • Wednesday is the least watched with 1,144 (4.4%)
  • April was our biggest TV month with 6,413 minutes watched (24.76%)
  • June was our lowest month with 1,197 (4.62%) — July is around 10%.  The excitement of summer faded fast, apparently.
  • 8PM is our biggest TV hour with 1,312 minutes (15.14%).  This is followed by 7pm (13%) and 6pm (10%).
  • 6AM is our lowest TV hour with 68 minutes watched (0.26%).  This is followed by 5am (0.49%) and 4am (0.90%).
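
All of the channel and time breakdowns above come from GROUP BY queries over that per-minute table (one row is roughly one minute of viewing).  A sketch of a couple of them, using the same assumed schema as the polling snippet:

```python
import mysql.connector

conn = mysql.connector.connect(user="dashboard", password="...",
                               host="localhost", database="home")
cur = conn.cursor()

# Minutes watched per channel, most-watched first.
cur.execute("""
    SELECT callsign, COUNT(*) AS minutes
      FROM tv_history
     GROUP BY callsign
     ORDER BY minutes DESC
     LIMIT 10
""")
print(cur.fetchall())

# Minutes watched by day of week and by hour of day.
cur.execute("SELECT DAYNAME(recorded_at), COUNT(*) FROM tv_history GROUP BY 1 ORDER BY 2 DESC")
print(cur.fetchall())
cur.execute("SELECT HOUR(recorded_at), COUNT(*) FROM tv_history GROUP BY 1 ORDER BY 2 DESC")
print(cur.fetchall())
```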

This is pointless data but it’s always interesting to assess your own habits.  And if you’re wondering, it took roughly 60 minutes (0.23%) to query and publish this post :).