Technology

Fixing Cron Not Executing

Recently I encountered an issue I hadn’t run into before. Specifically, my Cron Jobs were not running. Everything seemed correct and I could manually run the commands at the CLI. I’ve had issues before with getting things to run because I wasn’t using the complete path for programs, but this seemed to be something different.

The problem I found was that the root password needed to be changed. Running the following:

sudo grep CRON /var/log/syslog

would output a long list of the same error repeating over and over:

May 27 10:30:01 Webserver CRON[12943]: Authentication token is no longer valid; new one required
May 27 10:39:01 Webserver CRON[12978]: Authentication token is no longer valid; new one required
May 27 10:39:01 Webserver CRON[12977]: Authentication token is no longer valid; new one required
May 27 10:40:01 Webserver CRON[13049]: Authentication token is no longer valid; new one required

Running the following command:

sudo chage -l root

would output something like:

Password expires               : never
Password inactive              : never
Account expires                : never

Which suggests the root password was never set to expire. So I ran the following command:

sudo passwd root

And set a new root password (which was the same as the old root password) and suddenly everything started working again. It felt like a really odd issue, especially considering I didn’t actually change the password, and as far as I could tell I had a root password. Plus the password wasn’t set to expire at all.
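For what it’s worth, I suspect the same effect could have been achieved without re-entering the password at all, just by resetting the recorded password-change date so the expired flag clears. This is an untested assumption on my part, but roughly:

# Untested assumption: mark the root password as "changed" today so PAM stops flagging it as expired
sudo chage -d "$(date +%F)" root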

Anyway, I wrapped things up with an (optional) truncation of the system log, since the file had become unwieldy, using the following:

sudo truncate -s 0 /var/log/syslog

New Desktop Upgrade

The new desktop I mentioned at the end of my last post arrived. I am fairly good about adjusting my workflow when changes or upgrades happen, but this particular change, by its nature, is incredibly disruptive. My whole process kind of stems out from a source, which for the most part is either my laptop or my desktop. If I were to get a new laptop, not much would change. Most of that workflow runs off of shared drives or cloud files. At the base level, the desktop works the same way; the real trick I get to deal with is the ripple effect downstream, which I will get into more in a bit.

The machine itself runs fine. It’s much, much quieter than I expected it to be, which is nice. The crazy neon lights aren’t as annoying as I expected, especially once I figured out how to adjust them to be a more toned down, non-pulsating, single, cooler color. The day after ordering, I realized that I might need new monitor cables; I checked and sure enough, all of the outputs save one are DisplayPort. My old setup consisted of a DisplayPort adapter, a DVI port, and an HDMI out, all three to DVI on the monitor end. I may update the monitors later but it’s not necessary now.

The inside is crazy empty and clean. I’m not quite sure yet where I would even put additional hard drives, though I suspect they mount to the sides.

It certainly handles every game I’ve thrown at it incredibly smoothly, even with maxed out graphical settings. My current plan is to keep this machine relegated mostly to gaming (a bit more on this later), so I will keep it fairly clean and free from excess software. So far I’ve tried it out with Forza Horizon 4, Overwatch, Black Desert Online, Minecraft, Control, Quake 2, and Quake 2 RTX.

The RTX is really quite remarkable. The reflections are really neat and the shadows work very well. I look forward to finding more titles that take advantage of the ray tracing capabilities.

The Ripple Effect

Where the real change is happening is down the line. For the sake of maybe alleviating some confusion, I am going to go ahead and use the network names for my computers. The old workflow consisted of my Windows 10 desktop, Squall, which I originally put together back in 2012. It still runs everything perfectly fine; since 2012, I’ve bumped up the RAM quite a bit, updated to an SSD, and updated the video card to a 1050 Ti. Sitting under the desk next to that was Rinoa, an even older box that I am pretty sure I got second hand somewhere, running Xubuntu Linux. Xubuntu because it’s only 32-bit hardware. Its primary function was being a web server, for my Dashboard, and a file server. It’s got several old drives in it, all shared on the network to dump less important files to, because I’m a digital packrat.

The new machine is Cloud. See a pattern here yet? If it helps, my laptop is Selphie, my old laptop was Rikku, my old project server years ago was Quistis and before that Yuna, and my family’s laptops are Irvine, Barret, and RedXIII. They are all Final Fantasy characters.

Rinoa running 32-bit hardware has been a problem for a while. Several interesting projects I have found needed to run on a 64-bit system to get up and running. It’s also woefully underpowered for anything robust, like running a Minecraft server. Rinoa has been desperately in need of replacing for a while. Which is where Squall comes in now: Squall will become the “new Rinoa”, the new project server.

Making this change isn’t easy; it’s still not done, and I’ve been working on it for the last week. Squall is also a much more capable machine, so it changes the workflow a bit. Where Rinoa ran headless, I’ve decided to keep Squall on one of my three monitors for now and work with it using Synergy. I can offload things like running the web browser from Cloud to Squall if I want. I also can use Squall for Discord and IRC.

The first thing I decided to do was to move the web server aspect to a Raspberry Pi. I already had a Pi running a LAMP stack to host my WordPress archive blog. Moving the basic Dashboard was easy. Copy the files, import/export the database, and it worked, no problem. The harder part was moving the backend processes. I’ve started doing a lot of combination projects that often consist of some sort of Python or Bash script that dumps data to a database, plus a web based GUI. Like the Network Map, or my rudimentary Twitter Scheduler, or the web based Download Queue system for a particular web video downloading software that shall not be named. Getting these to work on the Pi is trickier. Partially because I’d forgotten some steps. For example, I created some environment variables to open the database with scripts, so I didn’t have to put raw login credentials in them. I forgot how I had done that, so I converted them back to raw login credentials for now. I have others that are looking for commands from packages that need to be installed that I’m not sure are available on the Pi.
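For reference, the approach I had half-remembered looks roughly like this. It is just a sketch; the variable names, database name, and file location are made up for the example, and the real setup may have differed:

# In a file the scripts can source, e.g. /home/pi/.db_env -- example values only
export DASH_DB_USER="dashboard"
export DASH_DB_PASS="example-password"

# A script can then source that file and use the variables instead of hard-coded credentials
source /home/pi/.db_env
mysql -u "$DASH_DB_USER" -p"$DASH_DB_PASS" dashboard -e "SELECT COUNT(*) FROM items;"

One likely gotcha: cron runs with a minimal environment and doesn’t source ~/.profile, so the variables have to be defined in the crontab itself or sourced explicitly by the script, which may be exactly the step I’d forgotten.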

I’ll get it worked out, I just need a bit more time.

I’m also not really sure I want to move TT-RSS to the same Raspberry Pi, just because it’s constantly polling. I am not sure I want to run that level of read/write on an SD card and risk losing my other files. I will probably just set up a second Pi JUST to run the TT-RSS server.

The other major thing to move is the files. I started off by consolidating everything, for simplicity’s sake. I converted and consolidated my video files on my NAS recently, which freed up a lot of space. I’ve been meaning to re-allocate some files off of Rinoa back to the NAS and I used the move as an excuse to do just that. At the same time I consolidated the remaining files onto the largest of the various drives in Rinoa, so I can start off by just moving one drive to Squall to reshare on the network. Moving the webserver to the Pi also meant giving the Pi an extra USB drive for more storage. The Video Downloader that Shall Not Be Named pulls video files, which are larger. I set up a new network share from the Pi for “Working Files” and moved all of the “Working Files” folders from Rinoa to the Pi.
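For the curious, a network share like this on the Pi is typically just a Samba share; I’m assuming that here, and the share name, path, and user are placeholders, but a minimal stanza in /etc/samba/smb.conf is roughly all it takes:

# Example share definition -- path and user are placeholders
[Working]
   path = /mnt/usb/working
   read only = no
   browseable = yes
   valid users = pi

Followed by a sudo systemctl restart smbd to pick up the change.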

The process overall isn’t complicated, it’s just time consuming with large data moves and some configuration changes.

Physical Set Up

Another aspect to adjust was the physical setup. Initially I just pulled Rinoa out and stuck Cloud in its place on the floor. This was partially because it was the best solution for the cable lengths available until the new DisplayPort cables arrived. I’ve got some pretty good cable management going on and I am not a fan of cables just hanging all over, a side effect of 15 years of working around equipment racks with impeccable cable management standards. Once the monitor cables arrived, I undid everything and reran all the cables, putting Squall on the floor and Cloud up on the little floor shelf under my desk. The main downside in the end is that the cleaner cable solution puts the large side window on the new PC against the backside, so I don’t get to see inside my PC all the time. Thankfully, I’m not a big fan of this aspect of the machine to start with.

Rinoa is going to just sit behind the monitors on a shelf for a bit until I finish with her, but in the end, the plan is to retire that machine.

Going Forward

I am actually almost more excited about the prospect of molding Squall into the new project server than I am about the flashy new gaming rig. The 64-bit hardware and 24 GB of RAM mean I can do a whole hell of a lot more than I ever could with Rinoa. I can set up a massive Minecraft server with a whole heap of worlds available. I can run Docker and everything that goes along with that, and I can set up a robust and speedy OpenSIM world if I want. I also plan to continue to use Squall as my video editing machine; no need to bog down Cloud with all that extra overhead in software and disk space. I also can much more easily start playing around with VMs.

I could have done a lot of this before of course, but I find keeping all of that up and going on a machine you are also using for day to day use and gaming gets distracting, and you start running into resource use issues much more quickly.

A Long and Short Journey with Dogecoin

Something something, I am not a financial advisor and am not giving financial advice.

Everyone has been talking about Dogecoin the last few days. I’m not sure why exactly; there are a thousand different cryptocurrencies out there and DOGE isn’t new. Personally, I do not follow cryptocurrency at all. It really feels like it’s just a digital MLM scam. Sure, people are making money, but the long term use and sustainability aren’t going to last. People like it now because it’s all “deregulated”, but at some point, if it ever became big enough to be mainstream, it’s going to become regulated, which would kill the appeal. It doesn’t help that your regular everyday person has no idea what crypto even is. The best quote for “what is crypto” I’ve seen floating around…

“Imagine if keeping your car idling 24/7 produced solved sudokus you could trade for heroin.”

One of the most common points I see is that “Crypto is just as real as the dollar (or any other currency).” This is of course because the dollar is “worth” what it is “only because the US government says it is”. Which honestly is kind of a misrepresentation of why a dollar is worth a dollar, but ok, let’s go with that. Dogecoin, or any crypto, is “worth” something because a bunch of speculators trying to get rich quick and not be holding the bag when everything collapses say it is worth something. The dollar is worth something because the entire economy of the US, and a lot of the world, and the government, backed by its people and an army and a general long term trust, says it’s worth something. So technically the original statement is true, but in reality, the level of backing on the dollar is much larger than that of a bunch of random kooks online.

Anyway…

Back in 2014, Dogecoin got big on Reddit. It was literally just a meme making fun of Bitcoin and people were using it as a sort of upvote/award system with Dogetipbot. Which was fun. I bought around 5000 DOGE for around $5. I got some tips, I gave some tips, and eventually I stuck it all in a local wallet. I think the Dogetipbot broke, so the whole thing kind of got forgotten about.

Then, this year, for some reason, Dogecoin started booming. Fortunately I had my old wallet. Unfortunately, syncing the last 7 years of blockchain takes a long time. It took something like a month to get my wallet into a usable state, and the software felt really buggy, requiring many restarts to kickstart the sync. Suddenly my 5000 DOGE was “worth” about $1500. I admit I got a little caught up and figured I’d see how far it would go. Everyone says they want to take it to a dollar per DOGE. Let’s see if that happens.

I could see the hype taking it to a dollar. I really can’t see it ever getting to Bitcoin levels of thousands. At the very minimum, it would take long enough that by the time it happened, the governments of the world would have started regulating crypto to the point where it all mostly died off anyway. DOGE kept climbing, fifty cents, sixty cents, seventy-five cents. At some point, I decided that if it managed to reach a dollar, I’d go ahead and buy myself a new PC. I’ve been considering updating my desktop, some of the components are around ten years old (the processor specifically), and what better to spend my sudden surge of fake money on? I also decided to buy a pre-built machine. The main thing keeping me from working towards an upgrade was how hard components are starting to be to acquire, thanks (no thanks) in large part to crypto mining.

I found I could buy directly from NewEgg using BitPay, so I moved half my DOGE into a BitPay wallet. When it got up to seventy-five cents I was really close to just going ahead and pulling the trigger. I got to a point where I cared more about my PC upgrade than the DOGE. I even dumped a bit more into my BitPay, ready to use. I also figured I should pull the trigger at eighty or ninety cents, since it seemed like a lot of people were planning to dump at $1.

But then I saw news about Elon Musk going on SNL, and being a mechanism to propel DOGE to $1. I’m not a fan of Elon Musk at all, for reasons I’m not getting into here, but I figure if anyone can make this go up, it’s probably Elon Musk. So the decision was made to just wait for SNL.

Then Elon was on SNL, and DOGE dropped from $0.70 to around $0.50. It seems to have leveled off around this point as well, even dropping a bit at times. At this point, I had to remind myself that I am not a “crypto guy” and I have no interest in becoming a “crypto guy”. This was just a dumb meme that I had jumped on years ago.

Long story short, I dumped the rest of my DOGE into BitPay, and ordered up a new kick ass gaming desktop PC.

I am now super looking forward to running games at blazing FPS with RTX and all the Ultra Super Graphical bells and whistles. In the meantime I’ve got to work on shuffling some files around to make everything work once the new desktop arrives. I currently run two desktops: one is my “main PC” and the other is an old 32-bit Linux PC that mostly serves as a file server and a web project server. I don’t need to run 3 PCs, and the 32-bit PC has been a thorn in my side for a while with its 32-bit limitations, so I plan to shift my current main desktop to be the new project server and dump the old Linux machine. I look forward to being able to actually host a sweet Minecraft server and OpenSIM server and getting the chance to better learn how to use Docker.

Organizing Digitally – The NAS

I want to do a sort of series about how I have my digital world organized, but I was trying to decide the best place to start. I want to run down some file structure methods, Office 365 use, and previous backup methods, but ultimately, the core of everything is my NAS.

So this is also sort of a follow-up to that last set of articles about my Synology NAS. I am sure there are other ways to do a lot of what the Synology does, but there are a lot of simple-to-use built-in features that are nicely integrated into my workflow. It’s a little pricey to set up initially, with the box and the drives, but the reality is, any good solution will be.

Features I Use

These aren’t in any particular order, but I wanted to touch on the aspects of the NAS that I use pretty regularly.

  • OpenVPN – I used to go to a lot of hassle opening up firewall ports on my home network to different devices and machines, so I could access web cams or SSH to different servers and blah blah blah. This is a bit of a security problem, since it means lots of open target points as well. I’ve long since dumped that in favor of OpenVPN, which is built into the NAS. I connect through my laptop or my phone to my home network, then I connect to whatever network drive or SSH connection I need to. It works perfectly and requires way less hassling with the firewall.
  • Download Station – This is essentially a torrent downloader, though I think it can handle a lot of other URL types. I don’t really directly interact with this; I keep a folder for incoming files that I occasionally sort and a watch folder for Torrent files that it pulls from. The fun part is syncing the watch folder using One Drive, so I can dump Torrent files into it from anywhere. And for what it’s worth, I don’t use this for piracy; primarily I use it for downloading Humble Bundle purchases. A bundle often has 20+ items, so I will bulk download the torrents (to save HB some bandwidth) and then dump them into the watch folder.
  • Video Station/DS Video – I tried running Plex for watching digital movies from the NAS but it was flaky as hell, since there isn’t an official Synology app and Plex is increasingly pushing their subscription nonsense instead of just being a client/server self-hosted application. Fortunately, there are Synology apps for Fire TV (which I use for streaming on both TVs). So I’ve sorted all of my home movies into the Videos folder and (for a future blog post) encoded them to be easily accessible and compatible.
  • Photo Station – Ok, I don’t actually use this… yet… but I want to revisit it going forward. I want to do a separate post on photos with more details, but basically, I wasn’t using the Photos folder for backup purposes, and that situation has changed recently.
  • Audio Station – I have a ton of music from different sources compiled and sorted together. It’s not my primary GoTo for music, but I want to get more organized playlists going so I can more easily use this for playing my music. For the most part, I am fine with just sticking music ON my phone though.
  • Mail Station – I don’t use Mail Station for actually sending emails, but I did set up the Mail Station server and I use it as a deep archive of emails. I essentially have all my email I have ever sent, going back to the 90s, pulled forward through various email clients, and now it’s all dumped into a Mail Server in a sorted, searchable archive.
  • Cloud Sync – Cloud Sync lets you hook your Synology to various cloud drive services and sync them to your local drives. I’ve got several Dropbox accounts that I have used in the past (Personal, server syncing, each family member) and now a couple of One Drive accounts for backup and personal document sync all linked. It even does Google Drive.

Features I Stopped Using

There aren’t a lot of features I have stopped using, but there are a couple.

  • Web Station – The Synology comes with an optional web server and a weird WordPress system that can be enabled. This has been weirdly buggy since day one and I already have plenty of experience managing LAMP stack servers. I recently dusted off one of my older Pis, set it up with WordPress, and moved the primary use I had for the Synology Web Station over to the Pi. Mostly, it was just a WordPress archive of all of my old blog posts from various blogs. The links were weird and didn’t work properly because it didn’t quite understand subdirectories or something. The images were present but they didn’t always work because they pointed to old URLs, and working the SQL system to change them always came off as wonky. Basically, I didn’t need this archive to be on the NAS and it was an easy thing to just offload to another device.
  • Cloud Station Server – This is a backup system for devices and computers. It will sync specific local folders to a folder on the NAS as a backup. Maybe I was doing something wrong, but it always felt really flaky as well, so I just sort of stopped using it. I had it on every laptop in the family for a while, but as laptops were replaced, things started getting weird, and getting others to grok how to pull back their files wasn’t super easy either. The better solution I have found is to just give everyone a shared folder specific to them that they can shove files they want to keep into. For my personal use it was just redundant, because my entire workflow for years has essentially been cloud based, with Dropbox or One Drive keeping everything backed up by default.
  • Surveillance Station – I still sort of use this, but all of my webcams died except one, which doesn’t have night mode anymore. So, it exists and I would use it, but I don’t really use it much anymore. Also, frankly, there was never anything worth seeing on the recordings.

Workflow

The real workflow from the NAS comes from shared folders. Everyone has access to the Family Photos folder mapped to their laptops. I created a shared folder for all of the Blog graphics my wife was using for her blog work that everyone can access since my daughters both helped her with that. They use a shared drive for all of the Ebay and Mercari photos they work on.

I keep folders for photos, videos, and ebooks. I keep folders for important family documents like Tax Returns. All of this can easily be synced to a backup in the cloud, and I have a couple of USB keys and loose drives that I do periodic manual backups to, which get stuck in a fireproof safe.

It also lets me map other network drives in as well, for shuffling files around. I have a whole second Linux box set up that has another 4 TB or so of space in it across several drives, which I use to store less important files like installable programs and games, ISOs, temporary files for video editing projects, a mountain of internet memes and images saved over the years, music concerts I’ve downloaded, etc. Plus I can map things like the web root for my Raspberry Pi, or set up a one way(ish) SSH tunnel to my web server for pulling backups through.
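The backup pull itself is nothing fancy; conceptually it’s just rsync over SSH on a schedule. Something like this, with the host, port, and paths all made up for illustration:

# Pull the latest site backups from the VPS down to the NAS over SSH (example host/port/paths)
rsync -avz -e "ssh -p 2222" backups@webserver.example.com:/var/backups/site/ /volume1/Backups/webserver/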

The box itself sits behind the TV upstairs, and if there ever was a fire or something, it’s likely one of the things I might try to grab on the way out the door, but I’d like to think my system is robust enough that even if it were lost, anything important would be recoverable.

Migrating Mail-In-A-Box to a New VPS

A few years ago, I started running my own mail server using Mail-In-A-Box. Four years or so actually, if the age of my old server was accurate. I have several different email addresses, mostly to better segment out content. I have done this with Reddit, and Twitter, and TT-RSS, and probably other things. In my Mail-In-A-Box I run email for 3 domains, two of mine, one of my wife’s. Over time I may eventually migrate all of my email to it; at this point, I am a little worried about being blacklisted, so I mostly use it for secondary, receive-only email aggregation.

For a while I’ve been putting off migrating the system to a new VPS. It’s been running on Ubuntu 14.04 since it was created. Newer MiaB won’t run on 14.04 and I can’t distro update the machine. The only choice is to roll a new VPS and migrate the mail.

I use Digital Ocean for my online services; feel free to sign up with the link in the sidebar if you want, I get a little kickback if you do. It’s easy to use and affordable. Plus, in cases like this, I can spin up an extra VPS, then easily destroy it and spin up a new one when I discover that MiaB only works up through 18.04, so 20.04, which I used initially, won’t work. Also, having the extra server just means a temporary bump in my billing for the month.

The basic process for migrating Mail-In-A-Box is here, in the official documentation. I had a few hiccups along the way but I got them ironed out.

The first step was creating the new machine. As I mentioned above, I first made a 20.04 machine but found that doesn’t work, so I killed that and made a new 18.04 machine. Before anything else, I did a few security-based housecleaning tasks. The server was created with shared-key login set up, but it only had a root account. So I created a new user and made them a sudoer. I also copied the SSH keys from root to the user.

# Run as root: create the user, give them sudo, and copy root's SSH keys over
adduser Username
usermod -aG sudo Username
cp -r ~/.ssh /home/Username/
chown -R Username:Username /home/Username/.ssh

The next step was to add the new user to the allowed SSH users and tighten up that access.

sudo pico /etc/ssh/sshd_config

Then edit:

#Port 22

To a custom port (and uncomment it), then change the PermitRootLogin line to:

PermitRootLogin no

Finally add:

AllowUsers Username
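Taken together, the relevant lines in /etc/ssh/sshd_config end up looking something like this (the port number here is just an example):

# Example sshd_config changes -- port is illustrative
Port 2222
PermitRootLogin no
AllowUsers Username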

Lastly, restart the SSH server with sudo service sshd restart. Then test the connection using the regular user. If that works, disconnect from the root session and continue as the regular user.
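That connection test, from a second terminal on my local machine, is just plain SSH against the custom port; the port and address here are placeholders:

# Connect as the new user on the custom port (example values)
ssh -p 2222 Username@203.0.113.10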

I was doing an upgrade, but the fresh install guide is here. All I really needed was the setup line, which takes a minute to run but does an initial setup of Mail-in-a-Box.

curl -s https://mailinabox.email/setup.sh | sudo -E bash

The next part was the trickiest bit. I linked the migration article above, but I ended up trying to simplify things a bit. On the old machine, I stopped the mailinabox service, so no new mail would come in, then ran the backup Python script as described in the article above. I found it was easiest to just connect to the server using FileZilla over SFTP (SSH FTP), which meant importing my keys into FileZilla; the option is in the settings under SFTP. Something to keep in mind if you set a custom port is you’ll need to add sftp:// before the IP address.

Things are a little tricky here, since root owns the backup folder. I ended up doing a sudo copy into my user home directory, then a chown on the folder to give my user account access to it. This meant FileZilla could see the folder and download it to my local machine. There are ways to transfer directly between the old and new servers, but between custom ports and SSH keys and permissions, I found it was easiest just to download to my local laptop. Afterwards, I connected with SFTP to the NEW server and pushed the backup folder to it. You need the whole folder with the “secret_key” text file and the encrypted folder and files. Basically, this is all the settings and emails.
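Roughly, the shuffle on the old box looked like this; the paths assume the backup landed in the default /home/user-data/backup location and a login user named Username:

# Copy the root-owned backup into the regular user's home, then hand ownership over so SFTP can read it
sudo cp -r /home/user-data/backup /home/Username/backup
sudo chown -R Username:Username /home/Username/backup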

The next step was to SSH into the new server, go to the freshly uploaded backup directory, and import the old files, as described in the link. This is two commands, run separately.

export PASSPHRASE=$(cat secret_key.txt)

sudo -E duplicity restore --force file:///home/Username/backup/encrypted /home/user-data/

This takes a minute to run. The next step listed is to rerun the mailinabox setup with “sudo mailinabox”.

I had trouble here. Nginx would not restart. After some troubleshooting I found it was an issue with SSL. Basically, what seemed to happen was the restore pulled in the old SSL certs. Or maybe it was looking for the old SSL certs. Whatever the case, the fix was this process.

sudo rm -rf /home/user-data/ssl/*

Deleting the SSL certificates and then running “sudo mailinabox” got everything started up again. I verified I could log into the control panel and the mailbox using the IP address of the new server. I verified that all my custom DNS records existed; these are needed since the Glue Records point to the Mail-In-A-Box machine, but because I host my websites on a separate machine, I have to have DNS records set up appropriately.

One thing I noticed was the SSL certificates seemed to be wrong, which meant things worked, but would cause annoying security messages. I am not sure if this was related to deleting the certs above, or just that it was still looking for the old IP address. Whatever the case, I did a manual update with certbot for my MiaB subdomain using:

sudo certbot certonly --force-renewal -d Subdomain.Domain.com

Another minor issue I ran into: doing this needs to drop a file either in the webroot folder or spin up a temporary web server to host its own file. I couldn’t find the webroot for the custom MiaB setup (it was not /var/www/html), so I temporarily ran “sudo service nginx stop”, then ran the above certbot command using the temporary webserver option, then “sudo service nginx start” to restart Nginx. Nginx had to be stopped since otherwise it is using port 80, and the temporary server can’t start to run the certificate verification process.
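Put together, that workaround looked roughly like this; the domain is a placeholder and --standalone is certbot’s built-in temporary web server option:

# Free up port 80, let certbot run its own temporary server for verification, then bring Nginx back
sudo service nginx stop
sudo certbot certonly --standalone --force-renewal -d Subdomain.Domain.com
sudo service nginx start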

Another note, I am not sure if the --force-renewal option is needed above. It didn’t throw out any errors and it fixed the issue, so I left it.

The final step was to go to my domain registrar and update the name servers and Glue Records to point to the new server IP. After a short bit of waiting, eventually the mail server URL connected to the admin and web consoles. I did some test sends and receives of emails between my server and Gmail to verify everything was working properly. One nice bit: the newer MiaB has a different interface for Roundcube webmail, so I could easily tell if I was going to the new or old server.

Once everything was satisfactory, I went back to Digital Ocean and powered down the old server. If everything is still working in a few days, I will destroy the old server, so I don’t have to keep paying upkeep on it. One thing to keep in mind: both the old and new servers require a specific hostname, so they will be named the same; double check that you are powering down and deleting the correct server. Some easy ways to verify are IP address or server age. The old server is several years old, but the new server is several days old.