Technology

Organizing Digitally – The NAS

I want to do a series of sorts about how I have my digital world organized, but I was trying to decide the best place to start. I want to run down some file structure methods, Office 365 use, and previous backup methods, but ultimately, the core of everything is my NAS.

So this is also sort of a follow-up to that last set of articles about my Synology NAS. I am sure there are other ways to do a lot of what the Synology does, but it has a lot of simple-to-use built-in features that are nicely integrated into my workflow. It’s a little pricey to set up initially, with the box and the drives, but the reality is, any good solution will be.

Features I Use

These aren’t in any particular order, but I wanted to touch on the aspects of the NAS that I use pretty regularly.

  • OpenVPN – I used to go to a lot of hassle opening up firewall ports on my home network to different devices and machines, so I could access web cams or SSH to different servers and blah blah blah. This is a bit of a security problem, since it means lots of open target points as well. I’ve long since dumped that in favor of OpenVPN, which is built into the NAS. I connect through my laptop or my phone to my home network, then I connect to whatever network drive or SSH connection I need to. It works perfectly and requires way less hassling with the firewall.
  • Download Station – This is essentially a BitTorrent downloader, though I think it can handle a lot of other URL types. I don’t really directly interact with this; I keep a folder for incoming files that I occasionally sort and a watch folder for Torrent files that it pulls from. The fun part is syncing the watch folder using OneDrive, so I can dump Torrent files to it from anywhere. And for what it’s worth, I don’t use this for piracy, primarily I use it for downloading Humble Bundle purchases. A bundle often has 20+ items, so I will bulk download the torrents (to save HB some bandwidth) and then dump them into the watch folder.
  • Video Station/DS Video – I tried running Plex for watching digital movies from the NAS, but it was flaky as hell since there isn’t an official Synology app, and Plex is increasingly pushing their subscription nonsense instead of just being a client/server self-hosted application. Fortunately, there are Synology apps for Fire TV (which I use for streaming on both TVs). So I’ve sorted all of my home movies into the Videos folder and (for a future blog post) encoded them to be easily accessible and compatible.
  • Photo Station – Ok, I don’t actually use this… yet… but I want to revisit it going forward. I want to do a separate post on photos with more details, but basically, I wasn’t using the Photos folder for backup purposes, and that situation has changed recently.
  • Audio Station – I have a ton of music from different sources compiled and sorted together. It’s not my primary go-to for music, but I want to get more organized playlists going so I can more easily use this for playing my music. For the most part, I am fine with just sticking music ON my phone though.
  • Mail Station – I don’t use Mail Station for actually sending emails, but I did set up the Mail Station server and I use it as a deep archive of emails. I essentially have all my email I have ever sent, going back to the 90s, pulled forward through various email clients, and now it’s all dumped into a Mail Server in a sorted, searchable archive.
  • Cloud Sync – Cloud Sync lets you hook your Synology to various cloud drive services and sync them to your local drives. I’ve got several Dropbox accounts that I have used in the past (personal, server syncing, each family member) and now a couple of OneDrive accounts for backup and personal document sync all linked. It even does Google Drive.

Features I Stopped Using

There aren’t a lot of features I have stopped using, but there are a couple.

  • Web Station – The Synology comes with an optional webserver and a weird WordPress system that can be enabled. This has been weirdly buggy since day one, and I already have plenty of experience managing LAMP stack servers. I recently dusted off one of my older Pis, set it up with WordPress, and moved the primary thing I was using the Synology Web Station for over to the Pi. Mostly, it was just a WordPress archive of all of my old blog posts from various blogs. The links were weird and didn’t work properly because it didn’t quite understand subdirectories or something. The images were present, but they didn’t always work because they pointed to old URLs, and working the SQL system to change them always came off as wonky. Basically, I didn’t need this archive to be on the NAS and it was an easy thing to just offload to another device.
  • Cloud Station Server – This is a backup system for devices and computers. It will sync specific local folders to a folder on the NAS as a backup. Maybe I was doing something wrong, but it always felt really flaky as well, so I just sort of stopped using it. I had it on every laptop in the family for a while, but as laptops were replaced, things started getting weird, and getting others to grok how to pull back their files wasn’t super easy either. The better solution I have found is to just give everyone a shared folder specific to them that they can shove files they want to keep into. For my personal use it was just redundant, because my entire workflow for years has essentially been cloud based, with Dropbox or OneDrive keeping everything backed up by default.
  • Surveillance Station – I still sort of use this, but all of my webcams died except one, which doesn’t have night mode anymore. So, it exists and I would use it, but I don’t really use it much anymore. Also, frankly, there was never anything worth seeing on the recordings.

Workflow

The real workflow from the NAS comes from shared folders. Everyone has access to the Family Photos folder mapped to their laptops. I created a shared folder for all of the blog graphics my wife was using for her blog work that everyone can access, since my daughters both helped her with that. They use a shared drive for all of the eBay and Mercari photos they work on.

I keep folders for photos, videos, and ebooks. I keep folders for important family documents like tax returns. All of this can easily be synced to a backup in the cloud, and I have a couple of USB keys and loose drives that I do periodic manual backups to, which get stuck in a fireproof safe.

It also lets me map other network drives in as well, for shuffling files around. I have a whole second Linux box set up with another 4 TB or so of space across several drives, which I use to store less important files like installable programs and games, ISOs, temporary files for video editing projects, a mountain of internet memes and images saved over the years, music concerts I’ve downloaded, etc. Plus I can map things like the web root for my Raspberry Pi, or set up a one way(ish) SSH tunnel to my webserver for pulling backups through.
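For the webserver backups, the pull itself is basically just rsync over SSH, run from whichever box at home is doing the pulling. A minimal sketch; the user, host, and paths here are made-up placeholders rather than my actual setup, and /volume1 is just the usual Synology volume root:

rsync -avz -e ssh backupuser@my-webserver.example.com:/var/backups/ /volume1/Backups/webserver/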

The box itself sits behind the TV upstairs, and if there ever was a fire or something, it’s likely one of the things I might try to grab on the way out the door, but I’d like to think my system is robust enough that even if it were lost, anything important would be recoverable.

Migrating Mail-In-A-Box to a New VPS

A few years ago, I started running my own mail server using Mail-In-A-Box. Four years or so actually, if the age of my old server was accurate. I have several different email addresses, mostly to better segment out content. I have done this with Reddit, and Twitter, and TT-RSS, and probably other things. In my Mail-In-A-Box I run email for 3 domains, two of mine and one of my wife’s. Over time I may eventually migrate all of my email to it, but at this point I am a little worried about being blacklisted, so I mostly use it for secondary, receive-only email aggregation.

For a while I’ve been putting off migrating the system to a new VPS. It’s been running on Ubuntu 14.04 since it was created. Newer MiaB won’t run on 14.04 and I can’t distro update the machine. The only choice is to roll a new VPS and migrate the mail.

I use Digital Ocean for my online services; feel free to sign up with the link in the sidebar if you want, I get a little kickback if you do. It’s easy to use and affordable. Plus, in cases like this, I can spin up an extra VPS, then easily destroy it and spin up a new one when I discover that MiaB only works up through Ubuntu 18.04, so the 20.04 droplet I used initially won’t work. Also, having the extra server just means a temporary bump in my billing for the month.

The basic process for migrating Mail-In-A-Box is here, in the official documentation. I had a few hiccups along the way but I got them ironed out.

First step was creating the new machine. As I mentioned above, I first made a 20.04 machine but found that doesn’t work, so I killed that and made a new 18.04 machine. Before anything else, I did a few security-based housecleaning tasks. The server was created with shared-key login set up, but it only had a root account, so I created a new user and made them a sudoer. I also copied the SSH keys from root to the new user.

adduser Username
usermod -aG sudo Username
cp -r ~/.ssh /home/Username/
chown -R Username:Username /home/Username/.ssh

The next step was to add the new user to the allowed SSH users and lock down that access.

sudo pico /etc/ssh/sshd_config

Then edit:

#Port 22

To use a custom port (uncomment the line and change the number), then change the root login line to:

PermitRootLogin no

Finally add:

AllowUsers Username

Lastly, restart the SSH server with sudo service sshd restart. Then test the connection using the regular user. If that works, disconnect from the root session and continue as the regular user.
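Putting those edits together, the relevant sshd_config lines end up looking something like this (2222 standing in for whatever custom port you pick):

Port 2222
PermitRootLogin no
AllowUsers Username

And the test connection from another terminal, with ServerIP being the droplet’s address, would be something like:

ssh -p 2222 Username@ServerIP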

I was doing an upgrade, but the fresh install guide is here. All I needed was the setup line really, which takes a minute to run but does an initial setup of Mail-in-a-Box.

curl -s https://mailinabox.email/setup.sh | sudo -E bash

The next part was the trickiest bit. I linked the migration article above, but I ended up trying to simplify things a bit. On the old machine, I stopped the mailinabox service, so no new mail would come in, then ran the backup Python script as described in the article above. I found it was easiest to just connect to the server using FileZilla over SSH FTP, which meant importing my keys to FileZilla. It’s in the settings under SFTP. Something to keep in mind if you set a custom port is you’ll need to add sftp:// before the IP address.

Things are a little tricky here, since root owns the backup folder. I ended up doing a sudo copy into my user home directory, then a chown on the folder to give my user account access to it. This meant FileZilla could see the folder and download it to my local machine. There are ways to directly transfer between the new and old server, but between custom ports and SSH keys and permissions, I found it was easiest just to download to my local laptop. Afterwards, I connected with SFTP to the NEW server and pushed the backup folder up to it. You need the whole folder with the “secret_key” text file and the encrypted folder and files. Basically, this is all the settings and emails.
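The copy and permission change on the old machine looked roughly like this, assuming the default Mail-In-A-Box backup location of /home/user-data/backup and a non-root user named Username:

sudo cp -r /home/user-data/backup /home/Username/
sudo chown -R Username:Username /home/Username/backup

After that, FileZilla connected as the regular user can see the folder and grab it.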

Next step was to ssh into the new server, go to the freshly uploaded backup directory, and import the old files, as described in the link. This is two commands, run separately.

export PASSPHRASE=$(cat secret_key.txt)

sudo -E duplicity restore --force file:///home/Username/backup/encrypted /home/user-data/

This takes a minute to run. The next step listed is to rerun the mailinabox setup with “sudo mailinabox”.

I had trouble here. Nginx would not restart. After some troubleshooting, I found it was an issue with SSL. Basically, what seemed to happen was the restore pulled in the old SSL certs. Or maybe it was looking for the old SSL certs. Whatever the case, the fix was this process.

sudo rm -rf /home/user-data/ssl/*

The fix was to delete the SSL certificates, then run “sudo mailinabox”. Everything started up. I verified I could log into the control panel and the mailbox using the IP address of the new server. I verified that all my custom DNS records existed; these are needed since the glue records point to the Mail-In-A-Box machine, but because I host my websites on a separate machine, I have to have DNS records set up appropriately.

One thing I noticed was the SSL certificates seemed to be wrong, which meant things worked, but would cause annoying security messages. I am not sure if this was related to deleting the certs above, or just that it was still looking for the old IP address. Whatever the case, I did a manual update with certbot for my MiaB subdomain using:

sudo certbot certonly --force-renewal -d Subdomain.Domain.com

Another minor issue I ran into: doing this needs to drop a file either in the webroot folder or spin up a temporary web server to host its own file. I couldn’t find the webroot for the custom MiaB setup (it was not /var/www/html), so I temporarily ran “sudo service nginx stop”, then ran the above certbot command using its temporary webserver option, then “sudo service nginx start” to restart Nginx. Nginx had to be stopped since otherwise it is using port 80, and the temporary server can’t start to run the certificate verification process.
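So the whole sequence was roughly the following; --standalone is certbot’s flag for that temporary built-in web server, and the domain is a placeholder:

sudo service nginx stop
sudo certbot certonly --standalone --force-renewal -d Subdomain.Domain.com
sudo service nginx start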

Another note, I am not sure if the --force-renewal option is needed above. It didn’t throw out any errors and it fixed the issue, so I left it.

The final step was to go to my domain registrar and update the name servers and glue records to point to the new server IP. After a short bit of waiting, the mail server URL eventually connected to the admin and web consoles. I did some test sends and receives of emails between my server and Gmail to verify everything was working properly. One nice bit: the newer MiaB has a different interface for Roundcube webmail, so I could easily tell if I was going to the new or old server.
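A quick way to spot-check the DNS switchover from a terminal, assuming the Mail-In-A-Box hostname is something like box.Domain.com, is to make sure the name servers and the box’s A record now resolve to the new droplet:

dig +short NS Domain.com
dig +short A box.Domain.com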

Once everything was satisfactory, I went back to Digital Ocean and powered down the old server. If everything is still working in a few days, I will destroy the old server, so I don’t have to keep paying upkeep on it. One thing to keep in mind: both the old and new servers require a specific hostname, so they will be named the same; double-check that you are powering down and deleting the correct server. Some easy ways to verify are IP address or server age. The old server is several years old, but the new server is several days old.

A Second Hard Drive in My Aspire E 15

Recently I purchased an SSD for my wife’s Thinkpad. It wasn’t a big one, 256 gig, but her laptop is a little slow all around and the bottleneck seems to be mostly in the drive, which I am pretty sure is still old school spinning platters.

Unfortunately, the drive in her laptop is 320 GB, so I couldn’t straight clone the drives. I could have done some partition size adjustments and made it work, but she was already fussing and worrying I was going to lose some of her files, so I decided I’d just wait and get a larger one later.

I’d already planned to pick up a second one of these drives to add to my Laptop. The main drive is one of those funky newer styles that’s basically a circuit board, but it has an empty bay for a laptop drive. I stuck the new SSD in and went about using it. Nothing hard here at all.

To my surprise, the drive vanished a week or so later. Thankfully I hadn’t stuck it in my wife’s laptop; it was apparently bad. Or was it?

Turns out that because the drive is a little on the small side, even for a 2.5″ drive, and there isn’t any mechanism inside the laptop to secure the drive itself, it ended up coming loose and losing its connection.

It’s probably not the cleanest fix, but I tore off half a sheet of paper, accordion-folded it, and slipped it in between the drive and the laptop chassis. This applies pressure to the drive, holding it in place.

I haven’t had any trouble with the drive since. Still, it’s kind of a crummy design.

A Tale of Two PCs

As a bit of a change of pace, I recently did a bit of work on the two actual PCs I am currently running. I’ve gone through a lot of desktops over the years, some getting more use than others; for a while I had like 5 or 6 old ones I had picked up here and there just sort of sitting around collecting dust, but I’ve purged a lot of that out. Most of what I used to do with those extra PCs I can now do with Raspberry Pis or on my VPS. Everyone in the family uses a laptop, so no more need for a “Family Desktop”. I am down to two boxes now, ok, technically 3, but the third is an old Power Mac G4 that I mostly keep around because I think the case is cool.

First off, my personal desktop. At the moment it just runs Windows 10, sitting on a handful of drives for a total storage of 4 TB, mostly filled with games. I built this machine almost 7 years ago. It’s nothing particularly special, and I have bumped up the RAM considerably since then. PC computing power really hasn’t gotten much better in the past few years, and what this machine mostly needed was a bump up in graphics power. So I swapped out the Radeon 6950 for an Nvidia GTX 1050 Ti card. It’s not a top-of-the-line super card, but it was within my price range and the performance boost is reasonably noticeable.

The biggest change is that I can run pretty much everything at maxed out graphics settings.  So far I’ve tested it on Overwatch, World of Warcraft, Grand Theft Auto V, and Battlefield 1.  Battlefield 1 in particular used to throw out an error about my GPU not being supported and GTA V had some screwy artifacting when it rained in game.  Also, maybe it’s a placebo effect, but I have noticed that I do better in Overwatch with heroes like Hanzo and Widowmaker who both require more precise long distance aiming.

That work was pretty easy, though I was sort of worried that the newer card wouldn’t work with my older Chip and Board.

On my other desktop tower, which is primarily used as a file storage server to supplement my Synology, I replaced a couple of dying hard drives. I don’t really remember where this tower came from, but it’s at least the same vintage as my main PC. It’s set up running Xubuntu with a collection of drives I’ve gathered over time from various places and discarded PCs. It’s been complaining for a while on boot that one of the drives was bad, and another would give read errors occasionally. I copied everything off the read-error drive; that one was easy. The other bad drive turned out to be the main drive, which finally gave up the ghost and stopped booting on me. I ended up making this problem worse when trying to clone the drive, because I apparently accidentally overwrote the drive as a ZFS pool file system. This is mostly notable because I’m not sure how it even happened. I have used ZFS briefly in the past when I was testing FreeNAS, but that system was a way bigger chore to use than just Ubuntu with Samba shares, so I scrapped it. So I’m not sure what was even cloned to create a 500GB ZFS partition.

Fortunately there wasn’t any important data actually on the main filesystem drive. I think at worst I may have lost an unused Minecraft server setup and maybe a few webpages I had set up messing around with webdev stuff.

So after a ton of reboots on a live CD to determine which physical drive was which in the machine, I pulled out the two bad drives and replaced them with two “mostly good” drives. I then reloaded Xubuntu. Then I reloaded Xubuntu again, because an encrypted file system seemed like a good idea at first, but I didn’t want the hassle of entering a password every time the machine boots.
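For what it’s worth, a less reboot-heavy way to match drives to physical hardware is to list them with their model and serial numbers and compare those against the stickers on the drives themselves. A rough sketch (smartctl comes from the smartmontools package, which may need installing):

lsblk -o NAME,SIZE,MODEL,SERIAL
sudo smartctl -i /dev/sda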

The real hassle here is getting everything configured.  A quick rundown of the steps needed to get things to a basic level of use.

  • Set up the proprietary drivers for the GPU and motherboard, easy
  • Set a static IP that puts the machine where it’s supposed to be on the network, mostly easy.
  • Reinstall Synergy.  Mostly easy, though I still need to get it to start on boot.
  • Install and set up SSH, easy
  • Reinstall Samba, easy
  • Get the system to auto mount the other hard drives on boot, mostly easy (see the sketch after this list)
  • Configure Samba to share those drives, mostly easy (also sketched below)
  • Reinstall the LAMP stack
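For the auto mount and Samba share steps, the end state is basically one line per drive in /etc/fstab and one block per share in /etc/samba/smb.conf. A rough sketch with made-up mount points and UUIDs (sudo blkid lists the real UUIDs, and the share assumes a Samba password was set with sudo smbpasswd -a Username):

UUID=1234-abcd  /mnt/storage1  ext4  defaults,nofail  0  2

[storage1]
   path = /mnt/storage1
   browseable = yes
   read only = no
   valid users = Username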

Fortunately, everything went pretty smoothly, other than I haven’t quite figured out the right method to get Synergy to start on boot. This is actually pretty critical, since unless the machine just boots up to a desktop with Synergy, I have to keep a keyboard and mouse attached. Part of the point here is that this box can just be squirreled away behind the desk and hooked to a monitor. It may already be set up, but I’ll probably set up Python on it as well. I still like to be able to putz around with scripts and web stuff, so it’s handy to have.
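One approach I may try for the Synergy-on-boot issue, assuming the box auto-logs into its XFCE session and runs as the Synergy client, is an autostart entry in ~/.config/autostart so the client launches with the desktop. The 192.168.1.10 here is just a placeholder for the main PC running the Synergy server:

mkdir -p ~/.config/autostart
cat > ~/.config/autostart/synergy.desktop <<'EOF'
[Desktop Entry]
Type=Application
Name=Synergy Client
Exec=synergyc 192.168.1.10
EOF

This still wouldn’t cover the login screen itself, so it only really helps if the machine auto-logs in.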

PS, feel free to judge the dusty ass inside of that tower up there.

Fixing my New 3DS …

So, the title there says New 3DS, which is the “New 3DS”, but it’s also new in the sense that I bought it at the start of January. It’s like 3 or 4 weeks old. Then I broke it by accident. It’s taken me years to get around to saving the money to buy one of these, not so much because I don’t have the money, but because I wanted to know it was at a point in its console life cycle that it was worth buying. If it had been, say, $50 cheaper, I’d have bought one years ago. Also, unlike all of my previous Nintendo handhelds, I wanted to wait to get the “improved version”. I have a Game Boy, a Game Boy Advance, and a Nintendo DS, but they were all the first-gen models. I didn’t care for the 3D gimmick, so I passed on the 3DS. Well, now they have the horribly branded “New 3DS”, and in my case, it’s the XL version, because I like having a big honking screen.

I’ve played it pretty regularly since buying it. This is kind of the problem with my argument of “waiting for the right moment”. My track record with Nintendo handhelds has always been amazing. I measure this by one real metric: how many games do I finish on them? I’m pretty sure I’ve completed every game I own for Nintendo’s handhelds since the Game Boy, and I own a lot. Contrast this with, say, Steam, where I’ve beaten like 20% of my 1000+ game Steam library.

On a side note that’s related here: the water out of the faucet at work is awful; there’s crap floating in it, it’s scummy, and it’s possibly not healthy (it probably is, for legal reasons, though). So in order to make my coffee each morning, I carry a metal water bottle with water from home, usually in my lunch box.

Anyway, a week or so ago, I put my 3DS in the bag I carry and went off to work. On this particular day, I didn’t bring a lunch, so I just had my laptop bag and no lunch box. I stuck my water bottle in the side pocket of my bag and drove off to work. Upon arrival, I found the bottom of my bag was wet and the top of the water bottle was not fully attached. Incidentally, the water bottle was also empty.

I headed into the office to assess the damage. Headphones, some notes, the bottom of the bag, and my 3DS were all wet. Bummer. I took everything that was damp and strung it around to dry out for a while and went about my day. Later, as things were drying, I tested the 3DS. This was my fatal mistake: it turned on, but the cursor on the bottom of the screen was flipping constantly (due to water inside making an electrical contact), and when I tried to turn it off I got a cryptic message flashed up (something like “There is some kind of problem something something”) before it turned off, for good.

I set it out to dry some more, hoping this would correct the issue. It still didn’t turn on some time later, so I set about opening it up. I’d already removed the back cover and battery; now it was time to crack open the case. It turns out it’s a pretty simple task, thankfully; there are maybe 8 small screws holding the case shut. You’ll also need to remove any SD cards.

Side note to anyone trying to do this: there are two small ribbon cables along the top edge of the system that come off with the back cover. These operate the shoulder buttons. To actually remove the cover, you must lift the top edge gently a bit, so that the whole thing can slide down and over the headphone jack, then the cover rolls/flips towards the upper side of the 3DS, minding these cables along the way. The cables can be removed and may even pop off; this is ok so long as they don’t get damaged. They ultimately need to be removed anyway, using some small pliers or a screwdriver, to remove the black square from the main part of the 3DS. These connectors are designed to be removed and reinserted easily.

After removing the cover, I had a nice view of the inside of the 3DS.

On the plus side, once inside, things were not as bad as they might seem.  The way the handheld sits in my bag, only one end of it got any sort of water (the left end shown above).  On the minus side, there was a lot of water, like I had to get paper towels and dry it up all over inside, including removing the face buttons.  To get to the underside water, I had to also remove that board on the left side, it has 5 screws, 4 in the central area, and one near the bottom ribbon cable.

Once everything was dried, I reassembled it and tried to turn it on again, with no success. So I opened it up again for a deeper inspection. This was when I found what I should have noticed originally: the painfully obvious blown-out components on the board.

Nothing else inside seemed to be damaged at all, and all of the moisture was on this end of the console. So I figured I’d look into replacing this power board (the battery connects to this board). I figured spending $50 on a new board would be better than $200 on a new 3DS. Fortunately, these boards can be found all over online, and even more fortunately, it only cost me around $15 to order one, a real-deal non-knockoff one too.

A week later, I had the new power board and it was time to swap. It’s pretty straightforward as well; I removed all of the screws first. Next, there are two small ribbon cables that attach to the board: the one broad orange one and another smaller one at the top for the secondary nubbin that is on the New 3DS models. The large orange one was simple; since the new board came with a new ribbon cable, potentially damaging the old one wasn’t a problem. I was still careful to slide it out of the end on the main part of the 3DS. After removing it, the gray bar is able to flip up so the new cable can be slid in and aligned, then the gray bar snaps back down to secure it.

The second, smaller cable was a bit trickier, but due to its small size, I was able to flip up the bar piece holding it down using a small screwdriver. Once these cables were swapped, the new board gets screwed in. Carefully reattach the two cables on the case cover for the ribbon cables; there is a natural orientation to these when the cover is attached, though it’s slightly twisted around with it removed. Once everything was reassembled and screwed down, I reinserted the battery for the moment of truth of powering the system back on.

Which was successful!

I’m not saying this will fix any broken 3DS; there are all sorts of other issues that could come up, especially with water. This is more just how I troubleshot and fixed mine.