Projects

Code Project: Network Map Webpage

I want to start off by saying, there isn’t going to be a ton of code here, and if there is code, it’s going to be super dirty. I’m fairly good at making code for “private use” that is pretty insecure, and not so great at code that’s scrubbed up and user friendly to distribute to others.

I’ve been working a bit on some local code projects, specifically for my little private “Dashboard” that runs on my file server. One project I’ve wanted to try for a while is a dynamic network tracker tool. I’ve looked into some options available, and they all seem to run as a plug-in for some complicated 3rd party analytics software that often has some goofy complicated setup procedure that’s beyond “apt-get” or even just dumping a bunch of files in a web server directory.

This project is both kind of simple and not. It was fairly simple in set up and execution, but it’s somewhat complex in design. The first job was getting a list of currently connected devices on the network. This is easily done via the command line with an arp-scan request.

sudo arp-scan --localnet

The output of which looks something like this:
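Something along these lines, with made-up addresses and vendors standing in for my actual network:

```
Interface: eth0, datalink type: EN10MB (Ethernet)
Starting arp-scan 1.9 with 256 hosts
192.168.1.1	a4:2b:8c:00:11:22	NETGEAR
192.168.1.5	b8:27:eb:33:44:55	Raspberry Pi Foundation
192.168.1.23	00:17:88:66:77:88	Philips Lighting BV

3 packets received by filter, 0 packets dropped by kernel
Ending arp-scan 1.9: 256 hosts scanned
```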

Using a redirect, I can shove all of this into a text file, which contains everything above.

sudo arp-scan --localnet > scan.txt

The trick is, how to display this output on a webpage. One way would be to pull it from a database. Pulling data from MySQL is pretty easy, dumping it to a pretty looking table is also easy. The harder part is getting the output of arp-scan to MySQL in a useful manner.

This is where Python comes into play. I am sure there are other methods or languages available, but I like Python, and mostly know how to use Python. Essentially, I wrote a script that would open the file, scan.txt, that was created above. I am only concerned with lines that contain IP addresses, so I used the function “is_number()” to check if the first character of each line is numeric; if it is, the line runs through a couple of operations.

Firstly, the output of arp-scan is tab delimited, so I can use the “split” function on “\t” and dump the result into a list. This gives me a list containing the IP address, MAC address, and Manufacturer. The split leaves a trailing newline on the Manufacturer, so I did a “replace” on “\n” in the third item of the list. Lastly, I wanted the IPs to be uniformly formatted, so I wrote a little function that adds leading zeros to the IP octets.

Finally, the Python builds an SQL statement from the line’s list and makes a call to the server to insert the values. A modified version of this code, which just displays the resulting SQL commands instead of executing them, is below.

#!/usr/bin/python

def is_number(s):
        try:
                float(s)
                return True
        except ValueError:
                return False

def format_ip(ipstring):
        octets = ipstring.split(".")
        n = 0
        for i in octets:
                while(len(i) < 3):
                        i = add_zero(i)
                octets[n] = i
                n = n + 1
        return octets[0]+"."+octets[1]+"."+octets[2]+"."+octets[3]

def add_zero(shortstring):
        return "0"+shortstring

import MySQLdb

mydb = MySQLdb.connect(
  host="localhost",
  user="YOURSQLUSERNAME",
  passwd="YOURSQLPASSWORD",
  db="YOURTARGETDATABASE"
)

mycursor = mydb.cursor()

# Open the file created by the arp-scan redirect
fo = open("scan.txt", "r")

# read each line of the file
for line in fo:
        # check for lines that contain IP addresses
        if is_number(line[0]):
                # convert the line into a list: [ip, mac, mfg]
                line_list = line.split("\t")
                # remove the trailing newline from the manufacturer
                line_list[2] = line_list[2].replace("\n", "")
                # make IP octets 3 digits
                line_list[0] = format_ip(line_list[0])
                SQL = "INSERT INTO arpscans (ip, mac, mfg) VALUES ('" + line_list[0] + "', '" + line_list[1] + "', '" + line_list[2] + "')"
                print SQL
fo.close()

It’s not super pretty, but it was a quick way to make sure everything came out looking correct. The table I used is called “arpscans” and contains columns called “ip”, “mac”, “mfg”, and “last_seen”. The last_seen column is an automatically generated timestamp.
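A table along those lines would look something like this; this is a sketch rather than my exact schema, with the ON UPDATE clause being what keeps the timestamp automatic:

```sql
CREATE TABLE arpscans (
  ip  VARCHAR(15),
  mac VARCHAR(17),
  mfg VARCHAR(255),
  last_seen TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP
);
```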

I then created a shell script that would run the arp-scan redirected into scan.txt, then run the Python script. I set up this script in the root crontab to run once every half hour. Root is required to run the arp-scan command, so my user crontab wouldn’t cut it. Everything ran fine when I manually did a run of the script using sudo. The PHP on the other end outputs the latest values, based on the timestamp, to a webpage.

This is where I ran into my first major hurdle. The script wasn’t running in cron. After a lot of digging and futzing, I found that basically, when cron runs the script, it works off of different environment variables. I had to specify in my bash file specifically where each command existed. The end result looks something like this:

#!/usr/bin/env bash
/usr/sbin/arp-scan --localnet > /home/ramen/scripts/arp_sql/scan.txt
/usr/bin/python /home/ramen/scripts/arp_sql/arp_post.py

Eventually the scan was running and posting data automatically as expected. After a few days, I ran into my second major issue. There was, simply put, way too much data for my crappy old “server” to handle. The webpage slowed to a crawl as the table grew to something like 9000+ entries. It’s possible and likely that my query was also rubbish, but rather than stress over figuring it out, I modified all of the code again.

Instead of adding a new entry for every MAC address every scan, I changed it to check if there already was an entry, and simply update the last_seen time. I had originally designed the system with the idea of getting legacy data for attached devices, but decided I only really cared about a generic history.
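The change boils down to swapping the plain INSERT for an upsert. A minimal sketch of the new statement builder, assuming a UNIQUE index on the mac column (which is what makes ON DUPLICATE KEY UPDATE fire):

```python
# Sketch of the updated statement builder; assumes arpscans.mac
# has a UNIQUE index so the upsert can match existing rows
def build_upsert(ip, mac, mfg):
    return ("INSERT INTO arpscans (ip, mac, mfg) "
            "VALUES ('%s', '%s', '%s') "
            "ON DUPLICATE KEY UPDATE last_seen = NOW()" % (ip, mac, mfg))
```

With that, a device seen again just gets its last_seen bumped instead of piling up a new row every half hour.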

The new webpage table now displays all devices, current and previously seen, with the last seen date.

A few issues came up on the output end as well, though none of them were super hard to correct. One, I wanted a way to sort the table by clicking the headers. There are several scripts available online that you can toss into your code to do this.

I also wanted more data about each device, so I added a form where I could fill it in. Specifically: the network name (if there was one), a description of what the device is, and the user of the device (which family member, or if it’s just a network device). This also checks and updates based on MAC address.

I also ran into an issue with MAC addresses and my Network extender. When devices are connected to the Network Extender, the first half of the MAC is replaced with the first part of the Extender’s MAC, though they retain the last half. I may eventually write some code to detect and merge these entries, but for now, I’ve simply been labeling them in the description as “(Extender)”, so I know it’s the same device on the other connection.
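If I ever do write that merge code, the comparison itself is simple; a hypothetical helper that treats two MACs as the same device when their last three octets match might look like:

```python
# Hypothetical helper: the extender rewrites the first half of the MAC,
# so compare only the last three octets
def same_device(mac_a, mac_b):
    return mac_a.lower().split(":")[3:] == mac_b.lower().split(":")[3:]
```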

The final end result looks something like this:

I used to have the network super organized before I moved, but the new router doesn’t work nicely with my Pi DHCP server, so I have not gotten things quite as nicely sorted as I would like. Everything in the picture is sorted, but above .100, it’s a mess. I also can’t assign IPs to some devices at all, like the DirecTV gear or my Amazon Echos, which is really annoying.

One of my future projects will hopefully correct this, as I want to put a second router on the network with DD-WRT, between the ISP gateway and everything else.

Overall, it’s been a fun little exercise in coding that combined a lot of different techniques together in a fun way.

On Moving and a lot of Small Projects

Moving into a new house tends to create a lot of little projects.  There’s been a few extra in the case of my new home, since it didn’t have any appliances and was missing quite a few of the small trimmings.  Things like handles on the kitchen cabinets.  It’s an easy little thing to add, a few measurements and a bit of drilling is all, but it needed to be done.

There’s also been quite a few larger projects.  Like the appliances.  The fridge and stove pretty much go right in and plug in, though the fridge needed a water line run to it to work the ice maker.  Then there’s the dishwasher, which needed to be hooked up to power, and needed both a water and drain line run to it.  These aren’t that bad either, punch a hole through a cabinet, drop the lines, tap into the existing sink plumbing.  This whole project also involved adding a garbage disposal to the sink however, which itself required the addition of a switch to run the thing.

Little projects.  Like hanging curtain rods on all of the windows.

Or putting in some railing on the front and back stairs, which was something required by insurance.

This is of course a little more involved, what with needing to drill into concrete and cut the railing to fit and whatnot.  Then there’s things like hanging decorations and some display racks in my wife’s office.  Or assembling shelving in the basement for my junk, which resulted in a trip to the ER when I dropped one unit on my foot while trying to tip it up, and failing to do so.

There was also the dilemma of the washer and dryer.  The closet for the laundry appliances was very small, so we had to get some smaller appliances.  The washer hooked up easily enough; it just hooks to the hot and cold and the existing drain.  The dryer was something else completely.  There wasn’t a dryer vent hole, and with the cramped space, there wasn’t a lot of room to add one.  I managed to chop a hole in the wall and go down through the floor to the open basement, where I added a bit of duct work to go out the side of the house through an existing unused hole where a vent used to exist.

Then there were the issues.  Early on we discovered that the plumbing wasn’t draining properly.  A call to the plumber and we found that the sewer line was blocked, and after an expensive call and some digging, the sewer was fixed.  More recently there was a power issue in the basement area.  I did a bit of checking on all of the junction boxes but ultimately had to get an electrician in.  He tightened a lot of connections in the breaker box and found a bad wire on the problem circuit.

Speaking of the basement, things are coming along well there as well.  Just before the power issue I was in the process of adding some outlets for the TV and video game corner.  Which will be pretty cool when done.

Basement – My New Space

Since the primary focus of this blog is projects, I thought it would be appropriate to talk about my new space a bit.  Last week, I moved into a new home.  Mostly we wanted a home that was more on one level to accommodate my wife and kids and their mobility issues, but also we moved closer to where I work.  The new house has a pretty nice unfinished basement area, which opens out to the backyard under a deck on the main/upper floor.  Being unfinished, it’s effectively a clean slate to do what I want with.  My son is taking a corner of it for his room, but I’ve been working to plan out the rest of it to accommodate a media area, space for my toys and space for my electronics and computer.

A lot of things will probably flux a bit but I’ve got a pretty good idea of how things are going to lay out.  It may be a bit before any real work gets done, there are other projects that have funding priority over building walls in the basement, but I still wanted to throw out a baseline starting point.  I’ll do some updates from time to time as work is done.

Here’s the primary space, full of boxes at the moment.  The far space in the back will be the space for electronics, my computer, and books.  Essentially, the office space.  I also have plans to set up a little photography space in the corner using a corner unit I built several years ago.  The corner unit has a large top surface I can set up lighting and backdrops on and space underneath that I can store props and diorama parts in for use in photos.

I have some better book cases in storage that will go on the back wall.  The desk will probably continue to stick out into the room the way it is there in the center.  In the space to the right, where the larger boxes are, I intend to build some nice shelving units to contain my collection of toys.

This will be the future media wall.  It’s a bit more cleaned up than the other corner; we were without internet and TV for a bit, so I pulled all of the DVDs out onto this shelf so the kids could find them more easily.  The main issue here is the lack of any power outlets, however adding outlets will be fairly easy.  The idea in the long run will be to add a second identical book case on the right, full of video games, and put the TV in the middle, mounted to the wall.  I also want to build or buy some sort of unit to go under the TV that will house all of my various game consoles and their controllers, all ready to be used and played.  We have a couch to go in front of all of this, but there isn’t space at the moment so it’s in the garage.

At this point I’ve mostly been working on sorting and organizing boxes.  Things are at a bit of a stand still until I can retrieve the other book cases and add some power outlets.  I also don’t want to get things too settled until I can add some actual walls in front of the concrete, I’m not a huge fan of that grungy cave look.  The ceiling will likely end up being a drop ceiling so that I can still access the underside of the floor easily.

 

 

A Myriad of Little Projects

I’ve neglected posting much lately, not so much because I haven’t been doing anything but more because I’ve been busy and not really with anything deserving of its own post.  I hope to remedy this a bit next year but for now I just wanted to run through some recent projects I’ve been working on.

The All New All the Same Lameazoid.com

Probably the biggest monopolizer of my time has been my other blog at Lameazoid.com.  There isn’t a lot there now, but my intention is to do a relaunch of sorts in 2017.  I’ve managed to keep up with my current regular posting, which amounts to roughly two posts per week, one Weekly Haul post and a recap of Agents of SHIELD.  I want to do much more next year.  I even made up a spreadsheet to plan everything for the year.

I have regular content set up for every day of the week.  The idea right now is to build up a long runway.  I have the time now to crank out reviews and take photos as needed.  If all goes to plan, I will have content scheduled out through roughly May in every category.  The idea is that this content, while good, is a buffer that can be shuffled as needed for NEW content to be inserted on demand.

I’ve also taken steps to try to line up content with related new releases.  For example, Logan comes out on 3/3.  So in the weeks before, for the Marvel Movie Review of those weeks, I’ll do Wolverine and The Wolverine (yeah, those names are similar and dumb).  I could also pair this with some Wolverine-related Marvel Legends reviews, or maybe some other Hugh Jackman movie reviews.

I’ve been up to a few new tech related projects lately as well.

Mail-In-A-Box

I’ll probably do a post just on Mail-In-A-Box and my set up experience.  Mail-In-A-Box is a simple-install mail server for hosting your own email.  I’ve spun up a second VPS and attached this domain to it, since I previously didn’t have any email for this domain.  It was a little tricky but I worked things out.  The hardest bit is that Mail-In-A-Box wants to handle the DNS and core domain, but I’m hosting these things on two separate servers.

I’ve gotten a little extra cozy with DNS lately, but I also had an issue come up because Mail-In-A-Box seemed to be pushing the SSL https domain for BloggingIntensifies.

Encryption Everywhere

You might notice, I’ve enabled HTTPS on this blog.  This came out of necessity since after setting up Mail-In-A-Box, Firefox kept forcing the site to the HTTPS version, which nothing was set up for, so it didn’t load.  This is a change I’ve been meaning to make anyway since the launch of LetsEncrypt!  Google is supposed to start penalizing non-HTTPS sites at some point, plus it’s good practice anyway.  I set up HTTPS for this blog, Lameazoid.com and Joshmiller.net.  Once I am confident in things I’ll set it up for TreasuredTidbits.com and TheZippyZebra.com as well.

I had some issues with Joshmiller.net though because of the way Cloudflare works.

Cloudflare Integration

I also recently added Cloudflare to all of my sites.  Cloudflare is essentially a DNS provider but it also lets you mask and reroute traffic to help protect your server.  I had to pull BI off of it though to get Mail-In-A-Box to work, and apparently Lameazoid.com wasn’t set up for rerouting.  I ended up having trouble with Joshmiller.net when I tried to enable SSL encryption.  Basically, as near as I can tell, the setup was looking at the Cloudflare IP and not the server IP, so things weren’t meshing or hooking up properly.  Everything corrected itself once I removed the Cloudflare rerouting.  I still need to play with this a bit before I set things up on my wife’s two blogs.

Part of why I experiment with my blogs vs hers is that I get way less traffic and I don’t like to irritate her.

Cloud At Cost VPS

I did a post on Cloud At Cost, but I wanted to mention it again as a recent project.  I have two VPSs from them, plus some.  I’m still having issues with the Windows VPS but the Linux one has been running pretty well since I got it up and running.

PLEX Server

My Synology NAS has the ability to act as a PLEX server.  I recently cleaned up a bunch of space on the NAS by throwing some spare drives into an older machine and creating a “Deep Archive” for things that I never need to access that take up a lot of space (read: my 500GB of raw video from ten years of my bi-annual DVD making projects).  I also shoved some things like old ISOs and game install files onto the Deep Archive.  I then proceeded to start filling this new space with rips of my DVD collection.  I’m still working on the long and arduous ripping process as time allows, but the idea is to run everything through PLEX to the two Firesticks I’ve set up on each TV.  This means my family doesn’t have to drag out a huge binder of DVDs to find a movie and it means I can stop worrying about discs getting scratched up and ruined.

It also gives me a nice way to watch all of the home video footage I’ve recorded over the past 10+ years.  This whole project met a bit of a roadblock when I found that I need to pre-transcode all of the video in PLEX before it becomes watchable.  The NAS isn’t powerful enough to transcode it in real time.

 

Next Thing CHiP as a Twitter Bot

There was a post that came across on Medium recently, How to Make a Twitter Bot in Under an Hour.  It’s pretty straightforward, though it seems to be pretty geared towards non-“techie” types, mostly because it assumes you’re working on a Mac and it uses something called Heroku to run the bot.  Heroku seems alright, except that this sort of feels like an abuse of their free tier, and it’s not free for any real projects.

I already have a bunch of IOT stuff floating around that’s ideal for running periodic services.  I also have a VPS if I really wanted something dedicated.  So I adapted the article for use in a standard Linux environment.  I used one of my CHiPs, but this should work on a Raspberry Pi, an Ubuntu box, a VPS, or pretty much anything running Linux.

The first part of the article is needed, set up a new Twitter account, or use one you already have if you have extras.  Go to apps.twitter.com, create an app and keys, keep it handy.

Install git, Python, and Python’s twitter extension.

sudo apt-get install git

sudo apt-get install python-twitter

This should set up everything we’ll need later.  Once it’s done, clone the repository.

git clone https://github.com/tommeagher/heroku_ebooks.git

This should download the repository and its files.  Next, it’s time to set up the configuration files.

cd heroku_ebooks

cp local_settings_example.py local_settings.py

pico local_settings.py

This should open up an editor with the settings file open.  It’s pretty straightforward: you’ll need to copy and paste the keys from Twitter into the file; there are four of them total.  Make sure you don’t leave any extra spaces inside the single quotes.  You’ll also need to add one or more accounts for the bot to model itself after, change DEBUG = TRUE to DEBUG = FALSE, and add your bot’s username to the TWEET_ACCOUNT = '' entry at the bottom.
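The filled-in file ends up looking roughly like this; the key values below are placeholders, and the variable names match the heroku_ebooks example settings file as of this writing, so double check them against your copy:

```python
# local_settings.py -- all key values here are placeholders
MY_CONSUMER_KEY = 'xxxxxxxx'
MY_CONSUMER_SECRET = 'xxxxxxxx'
MY_ACCESS_TOKEN_KEY = 'xxxxxxxx'
MY_ACCESS_TOKEN_SECRET = 'xxxxxxxx'

SOURCE_ACCOUNTS = ['some_account']  # account(s) the bot models itself after
ODDS = 8                            # bot tweets roughly 1 in 8 runs
DEBUG = False                       # must be False or it won't actually post
TWEET_ACCOUNT = 'yourbotname'       # the bot's own username
```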

Once that is all done do a Control+O to write out the file and Control+X to exit.  Now it’s time to test out the bot with the following…

python ebooks.py

It may pause for a second while it does its magic.  If you get the message “No, sorry, not this time.” it means the bot decided not to tweet; just run the command again until it tweets, since we’re testing it at the moment.  If it worked, it should print a tweet to the command line and the tweet should show up in the bot’s timeline.  If you get some errors, you may need to do some searching and troubleshooting, and double check the settings file.

Next we need to automate the Twitter Bot Tweets.  This is done using Linux’s built in cron.  But first we need to make our script executable.

 chmod 755 ebooks.py

Next, enter the following….

sudo crontab -e

Then select the default option, which should be nano.  This will open the cron scheduler file.  You’ll want to schedule the bot to run according to whatever schedule you want; follow the columns in the comment line as a guide.  For example:

# m h  dom mon dow   command

*/15 * * * * python /home/chip/heroku_ebooks/ebooks.py

m = minutes = */15 = every 15 minutes of an hour (0, 15, 30, 45)

h = hour = * (every hour)

dom = day of month = * = every day, and so on.  The command to run, in this case, is “python /home/chip/heroku_ebooks/ebooks.py”.  If you’re running this on a Raspberry Pi or your own server, you will need to change “chip” to the username whose home directory has the files.  Or, if you want to put the files elsewhere, it just needs to be the path to the files.  For example, on a Raspberry Pi, it would be “python /home/pi/heroku_ebooks/ebooks.py”.

If everything works out, the bot should tweet on schedule as long as the CHIP is powered on and connected.  Remember, by default the bot only tweets 1/8th of the time when the script is run (this can be adjusted in the settings file), so you may not see it tweet immediately.

This is also a pretty low overhead operation; you could conceivably run several Twitter Bots on one small IOT device, with a staggered schedule even.  Simply copy the heroku_ebooks directory to a new directory, change the keys and account names, and set up a new cron job pointing to the new directory.