My Music Listening Habits for February 2020

Not a whole ton that’s overly exciting this month. I did listen to quite a bit of BT, who is one of my all-time favorite artists, even if my time listening to his stuff comes and goes. I think there was a bit of earlier stuff that I liked without realizing it was BT, but I remember that Never Gonna Come Back Down was the track that really got me into listening to more of BT’s music. It was on the soundtrack for Gone in 60 Seconds. In the actual movie, they play this really neat, super low-key remix of Never Gonna Come Back Down that I have never been able to find a copy of anywhere. It occurs when they are stealing the Ferraris from the garage.

Another older album I used to listen to on repeat that came back briefly this month was Rollergirl. Last.fm suggests Rollergirl hasn’t had a release since 2002, which is probably around the time I was listening to this music. Also, apparently Rollergirl is some German dude.

Still plenty of Sigrid and Tessa Violet going on. Plus a bit more CHVRCHES this month. I have to say, the more I listen to CHVRCHES, the more I like their stuff. Not that I didn’t like it before; I just like it more now.

Anyway, I’m just gonna close out here with a little Spotify Playlist I made up that collects together the more low key Sigrid tracks available on the service.

Code Project: Network Map Webpage

I want to start off by saying, there isn’t going to be a ton of code here, and if there is code, it’s going to be super dirty. I’m fairly good at making code for “private use” that is pretty insecure, and not so great at code that’s scrubbed up and user friendly to distribute to others.

I’ve been working a bit on some local code projects, specifically for my little private “Dashboard” that runs on my file server. One project I’ve wanted to try for a while is a dynamic network tracker tool. I’ve looked into some options available, and they all seem to run as a plug-in for some complicated 3rd party analytics software that often has some goofy, complicated setup procedure that’s beyond “apt-get” or even just dumping a bunch of files in a web server directory.

This project is both kind of simple and not. It was fairly simple in setup and execution, but it’s somewhat complex in design. The first job was getting a list of currently connected devices on the network. This is easily done via the command line with an arp-scan request.

sudo arp-scan --localnet

The output of which looks something like this:
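The addresses and vendors here are made up for illustration, but the real output is the same shape: one tab separated line per responding device, listing its IP address, MAC address, and the manufacturer looked up from the MAC prefix.

Interface: eth0, type: EN10MB
Starting arp-scan with 256 hosts
192.168.1.1	c0:ff:ee:00:00:01	Example Router Co.
192.168.1.20	b8:27:eb:00:00:02	Raspberry Pi Foundation
192.168.1.150	c0:ff:ee:00:00:03	Example Gadgets Ltd.

3 packets received by filter, 0 packets dropped by kernel
Ending arp-scan: 256 hosts scanned. 3 responded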

By redirecting the output, I can shove all of this into a text file, which then contains everything above.

sudo arp-scan --localnet > scan.txt

The trick is how to display this output on a webpage. One way would be to pull it from a database. Pulling data from MySQL is pretty easy, and dumping it to a pretty looking table is also easy. The harder part is getting the output of arp-scan into MySQL in a useful manner.

This is where Python comes into play. I am sure there are other methods or languages available, but I like Python, and mostly know how to use it. Essentially, I wrote a script that opens the file, scan.txt, that was created above. I am only concerned with lines that contain IP addresses, so I used the function “is_number()” to check if the first character of each line is numeric; if it is, the line runs through a couple of operations.

Firstly, the output of arp-scan is tab delimited, so I can use the “split” function on “\t” and dump the result into an array. This gives me an array of the IP address, MAC address, and Manufacturer. The split leaves a newline attached to the Manufacturer, so I did a “replace” on “\n” in the third item of the list. Lastly, I wanted the IPs to be uniformly formatted, so I wrote a little function that adds leading zeros to the IP octets.

Finally, the Python builds an SQL statement from the line’s list and makes a call to the server to insert the values. A modified version of this code that just displays the resulting SQL commands instead of executing them is below.

#!/usr/bin/python

import MySQLdb

# Returns True if the string can be read as a number
def is_number(s):
    try:
        float(s)
        return True
    except ValueError:
        return False

# Pads each octet of an IP address to 3 digits, e.g. 192.168.1.5 -> 192.168.001.005
def format_ip(ipstring):
    octets = ipstring.split(".")
    n = 0
    for i in octets:
        while len(i) < 3:
            i = add_zero(i)
        octets[n] = i
        n = n + 1
    return octets[0] + "." + octets[1] + "." + octets[2] + "." + octets[3]

def add_zero(shortstring):
    return "0" + shortstring


mydb = MySQLdb.connect(
    host="localhost",
    user="YOURSQLUSERNAME",
    passwd="YOURSQLPASSWORD",
    database="YOURTARGETDATABASE"
)

mycursor = mydb.cursor()

# Open the file of scan results created by arp-scan
fo = open("scan.txt", "r")

# read each line of the file
for line in fo:
    # check for lines that start with an IP address
    if is_number(line[0]):
        # the output is tab delimited, so convert the line into a list
        line_list = line.split("\t")
        # skip the summary line at the end of the output, which also starts with a digit
        if len(line_list) < 3:
            continue
        # remove the trailing newline from the Manufacturer field
        line_list[2] = line_list[2].replace("\n", "")
        # make the IP octets 3 digits each
        line_list[0] = format_ip(line_list[0])
        SQL = "INSERT INTO arpscans (ip, mac, mfg) VALUES ('" + line_list[0] + "', '" + line_list[1] + "', '" + line_list[2] + "')"
        print(SQL)

fo.close()

It’s not super pretty, but it was a quick way to make sure everything came out looking correct. The table I used is called “arpscans” and contains columns called “ip”, “mac”, “mfg”, and “last_seen”. The last_seen column is an automatically generated timestamp.
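I didn’t include the table definition above, but something along these lines would create an equivalent table from Python. The column types here are my guesses rather than the exact schema, and the automatic timestamp uses MySQL’s CURRENT_TIMESTAMP default:

#!/usr/bin/python

import MySQLdb

mydb = MySQLdb.connect(
    host="localhost",
    user="YOURSQLUSERNAME",
    passwd="YOURSQLPASSWORD",
    database="YOURTARGETDATABASE"
)

mycursor = mydb.cursor()

# Rough equivalent of the arpscans table; column sizes are guesses.
# last_seen fills itself in on INSERT and refreshes whenever the row is updated.
mycursor.execute("""
    CREATE TABLE IF NOT EXISTS arpscans (
        ip  VARCHAR(15),
        mac VARCHAR(17),
        mfg VARCHAR(255),
        last_seen TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP
    )
""")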

I then created a shell script that runs arp-scan redirected into scan.txt, then runs the Python script. I set up this script in the root crontab to run once every half hour. Root is required to run the arp-scan command, so my user crontab wouldn’t cut it. Everything ran fine when I manually did a run of the script using sudo. The PHP on the other end then outputs the latest values, based on the timestamp, to a webpage.

This is where I ran into my first major hurdle. The script wasn’t running in cron. After a lot of digging and futzing, I found that, basically, when cron runs the script, it works off of different environment variables. I had to specify in my bash file exactly where each command lives. The end result looks something like this:

#!/usr/bin/env bash
/usr/sbin/arp-scan --localnet > /home/ramen/scripts/arp_sql/scan.txt
/usr/bin/python /home/ramen/scripts/arp_sql/arp_post.py
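The root crontab entry that calls it is just a standard half-hourly schedule, something along these lines (the script name here is only a placeholder for whatever the wrapper script is actually called):

*/30 * * * * /home/ramen/scripts/arp_sql/arp_scan.sh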

Eventually the scan was running and posting data automatically as expected. After a few days, I ran into my second major issue. There was, simply put, way too much data for my crappy old “server” to handle. The webpage slowed to a crawl as the table grew to something like 9000+ entries. It’s possible and likely that my query was also rubbish, but rather than stress over figuring it out, I modified all of the code again.

Instead of adding a new entry for every MAC address every scan, I changed it to check if there already was an entry, and simply update the last_seen time. I had originally designed the system with the idea of getting legacy data for attached devices, but decided I only really cared about a generic history.
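I haven’t reproduced the full updated script here, but the idea is a small change that slots in place of the INSERT inside the loop from the script above: look the MAC up first, and only insert a brand new row if it isn’t already there. This sketch uses parameterized queries rather than the string gluing from earlier, which is a bit cleaner anyway:

# Rough sketch of the reworked loop body, not the exact code from the server
mycursor.execute("SELECT mac FROM arpscans WHERE mac = %s", (line_list[1],))
if mycursor.fetchone():
    # Device already known, so just bump its last_seen timestamp
    mycursor.execute("UPDATE arpscans SET last_seen = NOW() WHERE mac = %s",
                     (line_list[1],))
else:
    # New device, insert a full row
    mycursor.execute("INSERT INTO arpscans (ip, mac, mfg) VALUES (%s, %s, %s)",
                     (line_list[0], line_list[1], line_list[2]))
mydb.commit()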

The new webpage table now displays all devices, current and previously seen, with the last seen date.

A few issues came up on the output end as well, though none of them were super hard to correct. One, I wanted a way to sort the table by clicking the headers. There are several scripts available online that you can toss into your code to do this.

I also wanted more data about each device, so I added a form where I could fill it in for each one: specifically, the network name (if there is one), a description of what the device is, and the user of the device (which family member, or whether it’s just a network device). This form also checks and updates based on MAC address.

I also ran into an issue with MAC addresses and my Network extender. When devices are connected to the Network Extender, the first half of the MAC is replaced with the first part of the Extender’s MAC, though they retain the last half. I may eventually write some code to detect and merge these entries, but for now, I’ve simply been labeling them in the description as “(Extender)”, so I know it’s the same device on the other connection.
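If I ever do write that merge code, the detection half probably isn’t much work. A rough sketch of the idea, assuming the same MySQLdb connection and cursor as the scripts above, might look like this:

# Hypothetical sketch: group rows whose MACs share the same last three octets,
# then flag any group with more than one row as a possible extender duplicate.
mycursor.execute("SELECT ip, mac, mfg FROM arpscans")
by_tail = {}
for ip, mac, mfg in mycursor.fetchall():
    tail = mac.lower()[9:]  # last three octets, e.g. "dd:ee:ff"
    by_tail.setdefault(tail, []).append((ip, mac, mfg))

for tail, rows in by_tail.items():
    if len(rows) > 1:
        print("Possible extender duplicate (MAC ending " + tail + "):")
        for ip, mac, mfg in rows:
            print("    " + ip + "  " + mac + "  " + mfg)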

The final end result looks something like this:

I used to have the network super organized before I moved, but the new router doesn’t work nicely with my Pi DHCP server, so I have not gotten things quite as nicely sorted as I would like. Everything in the picture is sorted, but above .100, it’s a mess. I also can’t assign IPs to some devices at all, like the DirecTV gear or my Amazon Echos, which is really annoying.

One of my future projects will hopefully correct this, as I want to put a second router on the network with DD-WRT, between the ISP gateway and everything else.

Overall, it’s been a fun little exercise in coding that combined a lot of different techniques together in a fun way.

A Pile of Used Tech

I recently had an idea occur to me that I might be able to pick up used Raspberry Pis off of eBay more affordably than buying them new. I didn’t really find a ton of savings, but I did pick up an auction for a lot of various parts for fairly cheap.

I am not sure what I’m going to do with all of this, but it seemed like a deal for around $50. I was worried that it wouldn’t all be included, but it was. Not everything is what I had hoped though. Two of the three Raspberry Pi 2s seem to be dead. I’ve tried several troubleshooting methods so far. They turn on, but don’t seem to read SD cards at all.

The Arduino is a genuine Arduino, which is nice, but it’s a fairly old model. Not a huge issue, but it is what it is. The screens were a nice bonus. I’ve been looking into getting a screen of some sort for my Pis, possibly for a RetroPi handheld build. I have not tested the larger screen yet; it seems to work off of a funky daisy chain of an extra board and some cables. I did get the smaller screen working… ish.

It’s a nice little touch screen that fits nicely on top of the Pi. I have not had a chance to properly troubleshoot it, but the touch works kind of funky on it. For one, it seems to function more like a track pad than a straight touch. Two, the mouse cursor only wants to move along a diagonal axis across the screen.

This all kind of feels like a configuration issue however, so there is some hope. Plus I am not sure I really need a touch interface for a RetroPi handheld build.

There’s some other fun stuff that I have not had a chance to mess with yet. There were a ton of ultrasonic sensors. I’m not sure what exactly these could be useful for, but I am wondering if they would be able to do 3D imaging of an object or a space.

There’s some funky board with a digital display on it that seems to be some sort of power board. I am not sure I’m going to have a use for this at all.

Lastly, there is a Raspberry Pi camera module. I have not had a chance to test it out yet, but like the screen, this was something I’ve been wanting to try out.

Reading for January 2020

I’ve had an up and down relationship with being motivated to read. I pushed myself back into reading around 2016 and had a lot of success at 35 books for the year, but then it fizzled out again with, like, 1 book in 2018. It doesn’t help that I have a huge backlog of books to read. I feel more guilty about my unread books than my unread video games.

Possibly part of my problem is the refusal to accept that I’m not enjoying a book and move on. I’m not really over that one quite yet. As part of my motivation, I’ve decided to do book posts as a sort of parallel to my music posts. This is all also part of a larger effort to motivate myself to write more, about more singular topics. I am thinking, if I keep these up, that started books will mostly just be a list, with a bit more detail once I finish, or abandon, a book.

Anyway, for January, I only finished one book, and started a few others. Started this month:

  • NBA Jam, by Reyan Ali, from Boss Fight Books, a look at the history of arcades and the NBA Jam video game franchise.
  • Teenage Mutant Ninja Turtles: The Ultimate Black and White Collection Vol. 1, by Kevin Eastman. A rerelease of the classic original Ninja Turtles comics.
  • The Wendy by Erin Michelle Sky and Steven Brown. A sort of re-imagining of Peter Pan.

The one book I did finish was Leviathan Wakes, by James S.A. Corey. Also known as Book One of The Expanse.

I really really enjoyed the show and wanted to pick up the books after watching through it. Conveniently, like a week later, I found most of the books in hard copy at a garage sale for like a quarter each. Then maybe a month later, they were all on sale for Kindle in ebook format, so I decided that I could round out the holes and make them more convenient to read by getting the ebooks.

The story of the first book and the show are surprisingly close. The show adds a bunch of stuff going on with Earth that isn’t in the book at all, but it seems like it was kind of a necessary step for narrative purposes. The author is actually two people, I believe, and both are assistants to George R.R. Martin of A Song of Ice and Fire fame. Leviathan Wakes follows a very similar style to the A Song of Ice and Fire books, with the alternating POV chapters.

I would definitely recommend the book, but if you have watched the show already, you pretty much know the story. There isn’t anything particularly new going on here, unlike with A Game of Thrones, which adds a lot of extra detail and history that’s not in the show.

As a wrap up, I am going to add a brief list of what I bought this month. As a sort of reminder of how far behind I am, and how much more behind I am getting.

  • Interface by Neal Stephenson
  • The Last Wish: Introduction to the Witcher by Andrzej Sapkowski
  • Star Wars Trilogy Boxed Book Volume 2 by George Lucas, Donald Glut, and James Kahn
  • The God’s Eye View by Barry Eisler
  • The Princess Diarist by Carrie Fisher
  • Body Double: A Rizzoli and Isles Novel by Tess Gerritsen
  • Illidan: World of Warcraft by William King
  • God Particle: If the Universe is the Answer, What’s the Question by Leon Lederman and Dick Teresi
  • Mimic (Shapeshifter Chronicles Omnibus 1, 2, and 3) by James David Victor

My Music Listening Habits for January 2020

I keep wanting to change these up to be by artist instead of album, but there doesn’t seem to be a 5×5 generator that does artists and has images. Sounds like an excuse to do some coding, though I’m not sure it’s even possible, since it feels like something someone would have done already if it were.

So anyway, it’s back to albums. Which gets a little sloppy looking this month.

So, there has been one major change this month. Around the turn of the year, I got an email through work for 6 months of Spotify Premium. I usually don’t really go for limited time offers but 6 months is a pretty good chunk of time, so I decided to go in on that. I’m enjoying using Spotify, but I doubt I keep the subscription after the 6 months are over. In general, I prefer to buy music. It has been pretty nice for discovery however.

Most of that discovery doesn’t show up on this 5×5 grid though. It’s hard to make it to the top monthly list when you get played maybe 2-3 times within a playlist of others played 2-3 times. Going by the numbers, it didn’t really increase my overall monthly Scrobbles either.

I am honestly a little surprised that Sigrid is still my most listened to artist. She has consistently held that spot since I started listening to her music, and she has become my most listened to artist of all time. I’ve gone through several phases of listening to her music which has helped. There was a stretch of listening to Sucker Punch, which is Sigrid’s most recent album. Then I was listening to tracks from live shows that are currently unreleased, some on YouTube (which also gets scrobbled). Then there was a period of listening to the previous two EP releases, Raw and Don’t Kill My Vibe. More recently, with Spotify, I’ve found a cache of tracks that are only on Spotify.

I want to roll off of Sigrid and on to Amanda Tenfjord. Her music came recommended on some Sigrid fan channels due to her similarity in overall style to Sigrid. The music sounds similar, the album art looks similar, and she is also Norwegian. There were jokes that she was secretly Sigrid, though there is a definite difference in the vocals. I am probably not expert enough to properly describe it, but Amanda Tenfjord has less range and sounds a bit more tenor… maybe? Like there’s more low end going on in her voice. Plus there is a slight difference in their accents.

Moving on.

Still a lot of Tessa Violet sprinkled throughout the playlist. I mentioned last month that I expect her to stick around for a while, though I’m starting to wonder just how long. I don’t really like all of her songs like I do other artists who stick around for a while. Another one that’s all over this 5×5 is Carly Rae Jepsen. I’ve enjoyed Carly’s music for a while, but Spotify has kind of opened up a nice little world of alternate takes and songs from her library. I particularly like this take on No Doubt’s Don’t Speak.

It’s not a super interesting take, but it’s a weird contrast to the usual super upbeat music of Carly Rae Jepsen. Also, back in the day, I used to listen to Tragic Kingdom a lot, so I have an underlying love for No Doubt as well.

The only thing left that’s particularly notable here is the soundtrack to Gris, coming in at number 4. Gris is a video game I was playing earlier this year and both the visuals and the soundtrack are excellent. It’s got a really nice ambient sort of piano vibe going that’s great for background music.