Ramen Junkie

Dead Hard Drive and My Process

So, I have been pretty sure for a while that the main hard drive in my desktop was going out.  It’s probably the oldest drive I own and it occasionally got feisty during reboots.  For a variety of reasons, I kept putting off replacing it.

… and putting it off…. and putting it off…

Then I went down to my office one day; the wife and kids were heading out for the weekend, and I had grand plans to waste away my weekend on Overwatch and Battlefield 1.  Those plans came to an abrupt halt when I was greeted with a GRUB error.  My first assumption was that, as has happened before, Windows 10 had done an update and screwed things up again.  A good while back I encountered a similar problem, and after some troubleshooting I found that my Linux partition had been wiped out; I reinstalled Ubuntu there and everything booted just fine.  Windows 10 had just done an update, and some searching online turned up sporadic reports of similar issues.  After some troubleshooting, trying to use a Windows disc to do a Master Boot Record fix and then trying to reinstall Ubuntu again, it became apparent that the drive itself had failed.

This complicated things a bit.  I pulled out my SpinRite disc and threw it in the drive, hoping it would find and correct the error.  Instead it threw an error partway through the scan.  It’s an older disc, and I’m honestly not sure if it’s compatible with the newer setup.  Next I tried a copy of Norton Ghost to clone the drive to a spare 1TB drive I had in the cabinet.  That looked promising as well, though it estimated it would take nearly 50 hours to finish.

I guess that meant no Battlefield, but Overwatch runs fine on the laptop, so a weekend of Overwatch and Netflix it would be.

Unfortunately, the clone crapped out as well after about an hour.

The final solution was to simply reinstall Windows 10 on a new drive.  I never use Ubuntu on the desktop, so I opted not to bother reinstalling it.  I downloaded the official Windows 10 recovery ISO and ran through the install.  During the install I skipped over entering the CD key; Windows 10 is supposed to activate itself based on account credentials and the hardware of the same machine, so it was time to test that concept out.  The install finished up and Windows 10 loaded just fine.  It was even activated, as promised in all of the Windows 10 feature lists.

The next task was getting things back in running order.

In recent years I’ve pushed a lot of my data off onto either my NAS or into cloud-accessible storage, which makes this whole task much, much easier.  I keep very little irrecoverable data on any one machine these days.  There are a few folders I will need to recover from the old drive, but nothing super important, and I should be able to simply hook the drive up in a USB drive bay and do normal recovery operations to get at my data.

More interesting though, I ended up saving a ton of time and bandwidth on the games I had on the machine.  At one point I had nearly all of my 1000 Steam games downloaded and installed, along with all of my GOG Galaxy games and all of my Origin games.  These games are spread across several drives of varying size in this machine.  Once I reinstalled Steam and set it up to use each of these drives, it simply detected all of the downloaded games, automatically.  The same happened with GOG Galaxy.  I didn’t see a way to make Origin reattach to its old data, so I just dumped that folder and redownloaded things as needed.

Honestly, this whole debacle has ultimately been a bit of a godsend.  I now have a fresh, clean Windows 10 install, not one upgraded from Windows 7.  I also have a slightly nicer and faster drive as the main drive, which helps performance a bit.  It also gave me an excuse to purge a lot of cruft I wasn’t really using.  I’ve shifted a lot of my computer use to my laptop; the desktop is primarily used for gaming, so it doesn’t really need anything else installed that doesn’t serve that purpose.

A Myriad of Little Projects

I’ve neglected posting much lately, not so much because I haven’t been doing anything, but because I’ve been busy and not really with anything deserving of its own post.  I hope to remedy this a bit next year, but for now I just wanted to run through some recent projects I’ve been working on.

The All New All the Same Lameazoid.com

Probably the biggest monopolizer of my time has been my other blog, Lameazoid.com.  There isn’t a lot there now, but my intention is to do a relaunch of sorts in 2017.  I’ve managed to keep up with my current regular posting, which amounts to roughly two posts per week: one Weekly Haul post and a recap of Agents of SHIELD.  I want to do much more next year.  I even made up a spreadsheet to plan everything for the year.

I have regular content set up for every day of the week.  The idea right now is to build up a long runway.  I have the time now to crank out reviews and take photos as needed.  If all goes to plan, I will have content scheduled out through roughly May in every category.  The idea is that this content, while good, is a buffer that can be shuffled as needed so NEW content can be inserted on demand.

I’ve also taken steps to line up content with related new releases.  For example, Logan comes out on 3/3, so in the weeks before, the Marvel Movie Reviews for those weeks will be Wolverine and The Wolverine (yeah, those names are similar and dumb).  I could also pair this with some Wolverine-related Marvel Legends reviews, or maybe some other Hugh Jackman movie reviews.

I’ve been up to a few new tech related projects lately as well.

Mail-In-A-Box

I’ll probably do a post just on Mail-in-a-Box and my setup experience.  Mail-in-a-Box is a simple-to-install mail server for hosting your own email.  I’ve spun up a second VPS and attached this domain to it, since I previously didn’t have any email for this domain.  It was a little tricky, but I worked things out.  The hardest bit is that Mail-in-a-Box wants to handle the DNS and the core domain, but I’m hosting these things on two separate servers.

I’ve gotten a little extra cozy with DNS lately, but I also had an issue come up because Mail-in-a-Box seemed to be pushing everything to the HTTPS version of the BloggingIntensifies domain.

Encryption Everywhere

You might notice I’ve enabled HTTPS on this blog.  This came out of necessity: after setting up Mail-in-a-Box, Firefox kept forcing the site to the HTTPS version, which nothing was set up for, so it didn’t load.  This is a change I’ve been meaning to make anyway since the launch of LetsEncrypt!  Google is supposed to start penalizing non-HTTPS sites at some point, plus it’s good practice anyway.  I set up HTTPS for this blog, Lameazoid.com, and Joshmiller.net.  Once I am confident in things, I’ll set it up for TreasuredTidbits.com and TheZippyZebra.com as well.

I had some issues with Joshmiller.net though because of the way Cloudflare works.

Cloudflare Integration

I also recently added Cloudflare to all of my sites.  Cloudflare is essentially a DNS provider, but it also lets you mask and reroute traffic to help protect your server.  I had to pull BI off of it, though, to get Mail-in-a-Box to work, and apparently Lameazoid.com wasn’t set up for rerouting.  I ended up having trouble with Joshmiller.net when I tried to enable SSL encryption.  As near as I can tell, the setup was looking at the Cloudflare IP and not the server IP, so things weren’t meshing or hooking up properly.  Everything corrected itself once I removed the Cloudflare rerouting.  I still need to play with this a bit before I set things up on my wife’s two blogs.

Part of why I experiment on my blogs rather than hers is that I get way less traffic and I don’t like to irritate her.

Cloud At Cost VPS

I did a post on Cloud At Cost, but I wanted to mention it again as a recent project.  I have two VPSs from them, plus some.  I’m still having issues with the Windows VPS, but the Linux one has been running pretty well since I got it up and running.

PLEX Server

My Synology NAS has the ability to act as a PLEX server.  I recently cleaned up a bunch of space on the NAS by throwing some spare drives into an older machine and creating a “Deep Archive” for things that I never need to access but that take up a lot of space (read: my 500GB of raw video from ten years of bi-annual DVD-making projects).  I also shoved things like old ISOs and game install files onto the Deep Archive.  Then I started filling the freed space with rips of my DVD collection.  I’m still working on the long and arduous ripping process as time allows, but the idea is to run everything through PLEX to the two Fire Sticks I’ve set up, one on each TV.  This means my family doesn’t have to drag out a huge binder of DVDs to find a movie, and it means I can stop worrying about discs getting scratched up and ruined.

It also gives me a nice way to watch all of the home video footage I’ve recorded over the past 10+ years.  This whole project hit a bit of a roadblock when I found that I need to pre-transcode all of the video in PLEX before it becomes watchable; the NAS isn’t powerful enough to transcode it in real time.


Sometimes it Just Takes a Reset to Clean up Your Phone

I’m not sure what it is about mobile operating systems; they just don’t always clean up after themselves and seem to be awful about eating up their limited space sometimes.  I can only assume that there is some sort of glitch and a large batch of updates or temporary files doesn’t get deleted properly.  In Windows or Linux on a “real” computer, it’s the sort of thing I’d easily track down and delete on my own.  Mobile operating systems tend to be locked down way more, preventing users from poking around in the system files, or really anywhere beyond the basic documents folders.

A while ago, my wife kept having issues with her Kindle Fire tablet running out of space.  Even after cleaning off photos and videos, which she had quite a few of, there still was never quite enough space.  It’s only 8 gig to start with, which isn’t much, so choices for apps and such have to be carefully weighed.  Eventually, in frustration, I did a factory reset and voila, problem solved.  The “System” block went from close to 6 gig down to somewhere around 3-4 gig, considerably more manageable to be sure.

I had a similar experience on my Windows Phone recently as well.  It kept filling up despite my efforts to prune more and more apps.  Eventually it stopped taking screenshots and had tons of weird freeze-ups.  Once again, in desperation, I did a reset.  Now it’s floating around 5 gig of space used (of 8 gig), and I’ve reloaded most of the apps I had previously needed to prune.

It also runs much more smoothly.

This isn’t a process to be taken lightly, however.  In my case, I keep most of my data backed up through OneDrive or Amazon to my NAS, and apps can easily be redownloaded (often automatically).  Probably the biggest hurdle I had with my phone was dealing with my 2 Factor Authentication app.  It doesn’t back up or sync, since that would be a security issue, and I have a ton of services running through it.  In many cases I simply changed the 2 Factor Auth to run through SMS instead of the app; in others it was easiest to just temporarily disable it.

This all needed to be done beforehand.  Many services won’t let you easily disable or change your 2 Factor settings without the current codes, for good reason.  If you wipe out your authenticator, you’ll have no way to get those codes.  I had to deal with this first hand after the SD card I was using crapped out on me, taking my authenticator with it.  In at least one case I had to call in to support and talk to a person to recover my account.

My suggestion, from doing this in the past with other devices: start making a list of apps you want to reinstall, then remove them.  This lets you actively track whether there is anything, like an authenticator, that may need to be dealt with.  After you can’t uninstall any more apps, start checking what’s left (photo galleries, email, SMS, call logs), looking for loose ends as you go.

It can be a pain, but doing a factory refresh on an ailing, space-strained device can really help clear out the cruft that seems to build up around the edges.

Next Thing CHiP as a Twitter Bot

There was a post that came across on Medium recently, How to Make a Twitter Bot in Under an Hour.  It’s pretty straightforward, though it seems to be geared toward non-“techie” types, mostly because it walks through making the bot on a Mac and uses something called Heroku to run the bot.  Heroku seems alright, except that this sort of feels like an abuse of their free tier, and it’s not free for any real projects.

I already have a bunch of IoT stuff floating around that’s ideal for running periodic services, and I also have a VPS if I really wanted something dedicated.  So I adapted the article for use in a standard Linux environment.  I used one of my CHIPs, but this should work on a Raspberry Pi, an Ubuntu box, a VPS, or pretty much anything running Linux.

The first part of the article is still needed: set up a new Twitter account, or use one you already have if you have extras.  Go to apps.twitter.com, create an app and keys, and keep them handy.

Install git and Python’s twitter extension.

sudo apt-get install git

sudo apt-get install python-twitter

This should set up everything we’ll need later.  Once it’s done, clone the repository.

git clone https://github.com/tommeagher/heroku_ebooks.git

This should download the repository and its files.  Next it’s time to set up the configuration files.

cd heroku_ebooks

cp local_settings_example.py local_settings.py

pico local_settings.py

This should open up an editor with the settings file loaded.  It’s pretty straightforward: you’ll need to copy and paste the keys from Twitter into the file, four of them total; make sure you don’t leave any extra spaces inside the single quotes.  You’ll also need to add one or more accounts for the bot to model itself after, change DEBUG = TRUE to DEBUG = FALSE, and add your bot’s username to the TWEET_ACCOUNT = '' entry at the bottom.
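As a rough sketch, the edited file ends up looking something like this (all the key values here are placeholders, and the variable names are as I recall them from local_settings_example.py, so double-check against your copy):

```python
# local_settings.py -- values below are placeholders, paste in your own keys
MY_CONSUMER_KEY = 'your-consumer-key'
MY_CONSUMER_SECRET = 'your-consumer-secret'
MY_ACCESS_TOKEN_KEY = 'your-access-token'
MY_ACCESS_TOKEN_SECRET = 'your-access-token-secret'

SOURCE_ACCOUNTS = ['SomeAccount']  # account(s) the bot models itself after
DEBUG = False                      # False = actually post the tweet
TWEET_ACCOUNT = 'YourBotAccount'   # the bot's own username
```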

Once that is all done, do a Ctrl+O to write out the file and a Ctrl+X to exit.  Now it’s time to test out the bot with the following…

python ebooks.py

It may pause for a second while it does its magic.  If you get the message “No, sorry, not this time.” it means the bot decided not to tweet; just run the command again until it tweets, since we’re testing it at the moment.  If it worked, it should print a tweet to the command line, and the tweet should show up in the bot’s timeline.  If you get some errors, you may need to do some searching and troubleshooting, and double-check the settings file.

Next we need to automate the bot’s tweets, using Linux’s built-in cron.  But first we need to make our script executable.

chmod 755 ebooks.py

Next, enter the following…

sudo crontab -e

Then select the default editor option, which should be nano.  This will open the cron scheduler file.  You’ll want to schedule the bot to run according to whatever schedule you want, following the comment columns as a guide.  For example:

# m h  dom mon dow   command

*/15 * * * * python /home/chip/heroku_ebooks/ebooks.py

m = minutes = */15 = every 15 minutes of an hour (0, 15, 30, 45)

h = hour = * (every hour)

dom = day of month = * = every day, and so on.  The command to run, in this case, is “python /home/chip/heroku_ebooks/ebooks.py”.  If you’re running this on a Raspberry Pi or your own server, you will need to change “chip” to the username whose directory has the files.  Or, if you want to put the files elsewhere, it just needs to be the path to the files.  For example, on a Raspberry Pi, it would be “python /home/pi/heroku_ebooks/ebooks.py”.

If everything works out, the bot should tweet on schedule as long as the CHIP is powered on and connected.  Remember, by default the bot only tweets 1/8th of the time when the script runs (this can be adjusted in the settings file), so you may not see it tweet immediately.
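That 1/8th behavior boils down to something like this (a sketch of the idea, not the repo’s exact code; the ODDS value here stands in for the corresponding setting in local_settings.py):

```python
import random

ODDS = 8  # assumption: mirrors the odds setting in local_settings.py


def should_tweet(odds=ODDS):
    # Roll a die with `odds` sides; tweet only on a roll of 0,
    # so on average the bot tweets once per `odds` cron runs.
    return random.randint(0, odds - 1) == 0
```

So with cron firing every 15 minutes, you’d expect a tweet roughly every two hours on average.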

This is also a pretty low-overhead operation; you could conceivably run several Twitter bots on one small IoT device, even on a staggered schedule.  Simply copy the heroku_ebooks directory to a new directory, change the keys and account names, and set up a new cron job pointing to the new directory.
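A hypothetical crontab for two staggered bots might look like this (the second directory name and the minute offsets are just examples):

```shell
# Bot 1: fires at minutes 0, 15, 30, 45
*/15 * * * * python /home/chip/heroku_ebooks/ebooks.py
# Bot 2, a second copy of the repo with its own keys: minutes 7, 22, 37, 52
7-59/15 * * * * python /home/chip/heroku_ebooks_bot2/ebooks.py
```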

Cleaning up My Password Security

It seems like there is an increasing number of hacks and leaks lately, and the targets seem to be getting larger and higher profile.  Recently I’ve been seeing stories about Last.fm and Dropbox accounts apparently being compromised, as well as a vulnerability in vBulletin, a popular message board hosting tool.  For the most part, a lot of these hacks are going to be harmless, for now.  Any website that actually matters is probably (they had better be) using salted passwords, making a password dump mostly useless.  Though in Last.fm’s case, apparently 96% of the passwords were decrypted because their encryption algorithm was shoddy.  Still, it seemed like a good time to check over my password security.

Beware, those music scrobbles you see might actually be the music taste of some Russian or Chinese hacker!  Seriously though, I don’t really see the point of hacking Last.fm; I’m not entirely sure they even have any sort of financial data.  I imagine the email list is sort of useful for spam accounts.  I suppose there is also the issue of people using the same passwords everywhere.

The good side of these hacks: the lists get put online, on hacker sites or Tor sites, and there are several places that take these lists of leaked accounts, dump them in a database, and let you search to see if your account shows up in a list, and for which site, if available.  With all of these recent lists, I went through and checked my primary email addresses and found about 20 entries between the two of them that had been compromised.  Most of those were vBulletin boards that I had signed up for 10 years ago, never posted to, and had forgotten even existed.

I mentioned the problem of using the same password repeatedly.  I’ve got several “layers” for how much complexity I put into my passwords.  Financial sites and large buying sites (eBay, Amazon, etc.) all get unique passwords; I just remember those.  The next level, things like Facebook and Twitter, also get unique passwords, but I have some basic algorithms I use to generate them mentally, so I can remember those as well while keeping them unique.  For sites like the ones that were compromised, tiny one-off bulletin boards with little risk to me if they get hacked, I admit I use the same few passwords on a lot of them.  Especially older ones from ten years ago, before I got serious about my online security.

Ironically, these sites now possibly have my most secure passwords, because I used Lastpass to generate them.  Lastpass is a plug-in for pretty much every browser.  It remembers your passwords and syncs them across your Lastpass account.  I’ve used it for years to store and sync passwords, but I never really bothered with the generated passwords feature.  The best practice at the moment for passwords is long strings of random characters; Lastpass can create these, and then remember them, so you don’t have to.  I don’t know what my new password is for the PPCGeeks message board, but I don’t need to, because when I visit, Lastpass will enter it and log me in.  It’s long and complex.  I mostly avoided this feature before because it pretty much meant I would never be able to log in on mobile, since I would have to manually type the password in.  Lastpass now has a mobile solution, but I also just sort of accepted that I’m never going to visit many of these sites on mobile anyway.
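The idea behind generated passwords is dead simple; this is not Lastpass’s actual generator, just a minimal sketch of the same approach using Python’s secrets module:

```python
import secrets
import string

def generate_password(length=20):
    # A long string of random characters drawn from letters, digits,
    # and punctuation -- the same idea a password manager's generator uses.
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return ''.join(secrets.choice(alphabet) for _ in range(length))
```

The catch, as noted above, is that nobody can remember a password like that, which is exactly why the manager has to remember it for you.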

The even better solution, when available, is to use 2 Factor Authentication: something you know, a password, and something you have, an authenticator.  Every mobile platform has an authenticator app.  If you happen to be one of the 1% using Windows Phone like me, the Microsoft Authenticator works just like the Google Authenticator when setting it up.  When I want to log into, say, Dropbox, I enter my username and password like normal, and then I am prompted to enter the generated code from my authenticator.  It doesn’t matter if someone else has my password, because they don’t have the authenticator, which is randomly generated and can’t be duplicated.  I use this for any site that has it, which is almost all of the “big ones”: Microsoft, Google, Dropbox, etc.  I actually get frustrated when it’s not available, like when my Rockstar Games account got stolen 6 months ago, or with Playstation Network, which has had like 3 or 4 hacks now.
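For the curious, those rolling codes aren’t magic; the apps all implement the HOTP/TOTP standards (RFC 4226 and RFC 6238).  A minimal sketch in Python (the base32 secret is whatever string the site shows you when you enroll):

```python
import base64
import hashlib
import hmac
import struct
import time

def hotp(key, counter, digits=6):
    # RFC 4226: HMAC-SHA1 over an 8-byte counter, dynamically truncated
    # to a 31-bit integer, then reduced to the last `digits` decimal digits.
    msg = struct.pack(">Q", counter)
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

def totp(secret_b32, period=30):
    # RFC 6238: same as HOTP, but the counter is the current 30-second
    # time step -- this is what authenticator apps compute on your phone.
    key = base64.b32decode(secret_b32, casefold=True)
    return hotp(key, int(time.time()) // period)
```

Since the server and the app both know the secret and the current time, they arrive at the same six digits independently, and anyone with just the password is stuck.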