Linux & Open Source

Linux Post Install Clean-Up

So, now that I’ve returned to Linux again, I’ve come across several clean-up tasks that needed to be completed to get things fully working.  A lot of my activities are, by design, machine agnostic.  That is to say, they run off “the cloud”, either through a service or something I am hosting.

One big one I use is OneDrive.  I don’t NEED OneDrive running locally, but it’s convenient and nice to have.  Aside from syncing and backing up all my writing through it, I also use it for things like syncing blog graphics files and screenshots.  I’ve found a OneDrive Linux client that seems promising, and I’ve gotten it set up easily enough, but I have not quite worked out how to get selective sync fully working.  I don’t need everything off my OneDrive, and I don’t have the drive space for all of it anyway.  So this one is pending a bit.
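
If it’s anything like the popular abraunegg client (an assumption on my part), selective sync is driven by a sync_list file in the config directory, roughly like this:

# ~/.config/onedrive/sync_list -- only the listed paths get synced
# (the paths here are just examples)
Documents/Writing
Pictures/Blog
Pictures/Screenshots

# after editing the list, a resync is required
onedrive --synchronize --resync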

That hasn’t really slowed me down.  I also already use GitHub for a lot of my writing as a secondary place with versioning, etc.  I made sure everything was up to date in Windows, then did a pull from the three remote repositories I care about: my Journal, my Digital Notes library, and my Web Clips library.  I made a few updates and made sure I had the workflow down for keeping things synced.  This also prompted the creation of a simple script to push everything at once.

#!/bin/bash
# Stage all changes, commit with a generic message, and push to the remote
git add -A
git commit -m "Updated via Simple CLI Push"
git push

I thought about adding an option for a custom commit message, but these are all private repositories so I don’t really care what the commit messages are. I also added this to the shell so I can just run it with “gitpush” from anywhere.
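
For anyone curious, the “run from anywhere” part is just a shell alias (the script path here is an assumption):

# Make the script executable, then point an alias at wherever it actually lives
chmod +x ~/scripts/gitpush.sh
echo 'alias gitpush="$HOME/scripts/gitpush.sh"' >> ~/.bashrc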

This also meant properly setting up SSH keys in GitHub, so I could actually pull the libraries. I also realized I would need to set up SSH keypairs for my web server space, which wasn’t hard but was mildly inconvenient because account-based SSH is disabled there. The simple solution was to re-enable it using the DigitalOcean console, add the keys, then disable it again.
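
The key setup itself is the usual routine, roughly (the server address is a placeholder):

# Generate a keypair, then paste the public half into GitHub's SSH key settings
ssh-keygen -t ed25519
cat ~/.ssh/id_ed25519.pub
# For the web server, copy the key over while account login is temporarily re-enabled
ssh-copy-id user@example.com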

Probably the biggest hassle I had was getting the two NTFS partitions mounted: one on the old primary Windows drive, and a second on the same physical drive as the system. I mostly use that second one for “working files”: ebooks to read, monthly file dumps off my phone, programming projects, etc.

It’s just files.

I could manually mount both drives after starting up, but any reboot would unmount them. I went and looked up the fstab settings to use, and had no luck. In fact, I had the opposite of luck, because at one point I couldn’t mount the secondary storage drive at all in Linux, only in Windows. I tried many options in both OSes, and finally just backed everything up and wiped the partition in favor of a native ext4 format.
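
For the record, the general shape of what I was attempting in /etc/fstab was something like this (the UUID and mount point are placeholders; blkid reports the real UUID):

# Mount an NTFS partition at boot via ntfs-3g, owned by the first user
UUID=XXXX-XXXX  /mnt/working  ntfs-3g  defaults,uid=1000,gid=1000  0  0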

Since I had all this space now anyway, I remapped my /home/ folder to it, which is kind of good practice anyway, then copied everything from the old working-files drive into a folder in my own home folder.
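
Conceptually the remap is simple, something along these lines (device paths and UUID are assumptions; the encrypted-home wrinkle below is what complicated it):

# Copy the existing home over, preserving permissions and attributes
sudo rsync -aXS /home/ /mnt/newhome/
# Then point /home at the new partition in /etc/fstab
UUID=YYYY-YYYY  /home  ext4  defaults  0  2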

This ended up being a weird hassle too, because at one point I had “pre-copied” the working files, before the migration, only to discover they had vanished when the /home/ folder was moved. I think what was happening was that they were not part of the encrypted blob, so the system simply ignored them. So I had to unmount everything and reboot, which failed because now there were no user settings; drop to a recovery console; move the files OUT of the personal home folder; remount it all; then copy the files, again, from inside the OS, so they would receive the proper encryption and show up properly.

What a hassle, but it’s done.

The only real missing element here is that my copy of Affinity Photo is only licensed for Windows, so I’ll need to buy the Linux version. I don’t mind, I have been meaning to upgrade to version 2 anyway. I think version 2 even has a new style of license that is OS agnostic.

One last thing I’d like to do is automount the network shares from my NAS and file server on boot, if present. I don’t always use the laptop at home though, which means this could be weird when it can’t access them. But I also have an OpenVPN tunnel to get to my home network, so there is probably a way to set it up so it always connects through that.
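
The likely route is a systemd automount entry in fstab, which mounts the share on first access instead of hanging the boot when the NAS is unreachable (server and share names here are made up):

# Mount the NAS share on demand; unmount again after 60 seconds idle
//nas.local/share  /mnt/nas  cifs  credentials=/etc/nas-creds,x-systemd.automount,x-systemd.idle-timeout=60,_netdev  0  0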

Linux, Again

So, I am back on Linux again. On my laptop at least. I never really STOPPED using Linux; I use it on Raspberry Pis and my web server and in the Windows Subsystem for Linux all the time.

Just for the record, I just went with boring Linux Mint.

But I have moved back to using it full time on my laptop. I still have the Windows partition for now, but I seriously doubt I am ever going to go back to it. And FWIW, I have used only Linux on laptops in the past. The concept is not at all alien to me.

I have wanted to make the switch back for a while but was a bit hobbled. I had been using an app to tether connectivity off my phone over USB, and I could never get it to work reliably in Linux. I discovered recently that my cell plan now just includes regular WiFi-based hotspotting, so I can connect via WiFi when needed.

This was literally the only hurdle.

Another motivation though is the end of life for Windows 10 coming next year. I really feel like this is going to get pushed out, because Windows 10 is still going strong. But just in case, I need to get off of it.

Side note: Microsoft is really overestimating just how often people upgrade their PC hardware with their push from Windows 10 to 11. The blocking factors of the upgrade are extremely arbitrary, and most of the PCs that can’t be updated to Windows 11 still function just fine for 90% of use cases for “regular people.”

Anyway, I plan to keep using my laptop for the foreseeable future. So moving to Linux is the best option.

I also want to use it as a bit of a test bed for migrating my project PC to Linux as well; mostly I have questions relating to Docker, and the easiest solution will just be to test it.

Anyway, the migration itself was surprisingly smooth. Most of my workflow has been shifting to be very “floaty”. Almost all my writing, for example, is Joplin and local files, which are in a private GitHub repository.

Joplin just worked, and then I set up Git and pulled the repository down. Visual Studio Code has a Linux version; I think it was even pre-installed. I have already been using it as a text editor, so I am familiar with the best ways to set things up.

The real missing piece is OneDrive syncing, but it’s something I can work around, especially since these folders already sync via my Synology.

Almost everything else I use on the laptop was just a matter of making a list in Windows and then downloading it all in Linux. Mostly I just use the thing for writing and for sorting image files off my phone.

Dead Memory Cards and Using Docker

More often than it feels like it should, something in technology breaks or fails. I find this can be frustrating, but often ultimately good, especially for learning something new and forcing myself to clean up something I’ve been meaning to clean up. I have a Raspberry Pi I’ve been using for a while as a little web server for several things. It’s been running for years, probably, but something gave out on it. I’m not entirely sure whether it’s the SD card or the Pi itself, honestly, because I’ve been having a bit of trouble trying to recover through both. It’s sort of pushed me to try a different approach.

But first I needed a new SD card. I have quite a few, and most are “in use”. I say “in use” because many are less in use and more, underused. This has resulted in a bit of a rebuild on some other projects to make better use of my Micro SD cards. The starting point was an 8 GB card with just a basic Raspbian setup on it.

So, for starters, I found that the one I have in my recently set up music station Raspberry Pi is a whopping 128 GB. Contrary to what one might think, I don’t need a 128 GB card in my music station; the music is stored on the NAS and played over the network. It also has some old residual projects on it that should really be cleaned out.

So I stuck the 8 GB card in that device and did the minor setup needed for the music station. Specifically, configure VLC for remote control over the network, then add the network share. Once I plugged it back into my little mixer and verified I could remote-play music, I moved on.

This ended up being an unrelated side project though, because I had been planning on getting a large, speedy Micro SD card to stick in my Retroid Pocket. So I stuck that 128 GB card in the Retroid and formatted it. This freed up a smaller 32 GB card.

I also have a 64 GB card, basically unused in my PiGrrl project, that I decided to recover for use. The project was fun, but the Retroid does the same thing 1000x better, so now it’s mostly just a display piece on a shelf. Literally an overpriced paperweight. I don’t want to lose the PiGrrl configuration though, because it’s been programmed to work with the small display and IO control inputs. So I imaged that card off.
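
Imaging a card off is a one-liner (the device name is a placeholder; lsblk shows the real one):

# Clone the whole card, byte for byte, into an image file
sudo dd if=/dev/sdX of=pigrrl-backup.img bs=4M status=progress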

In the end though, I didn’t need those Micro SD cards; I opted for an alternative way to replace the Pi, with Docker on my secondary PC. I’ve been meaning to better learn Docker, though I still find it to be a weird and obtuse bit of software. There are a handful of things I care about restoring that I used the Pi for.

  • Youtube DL – There seem to be quite a few nice web interfaces for this that will work much better than my old custom system.
  • WordPress Blog Archives – I have exported data files from this, but I would like to have it as a WordPress instance again.
  • FreshRSS – My RSS reader. I already miss my daily news feeds.

YoutubeDL was simple; they provide a nice basic command sequence to get things working.
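
I won’t pretend to remember the exact incantation, but for a yt-dlp web interface like MeTube it boils down to a single docker run (image name, port, and paths from memory, so treat them as assumptions):

# Web UI on port 8081, downloads land in the mounted folder
docker run -d --name metube -p 8081:8081 \
  -v /path/to/downloads:/downloads \
  ghcr.io/alexta69/metube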

The others were a bit trickier. Because the old setup died unexpectedly, the data isn’t easily exported for import, which means digging out and recovering off of the raw database files. This isn’t the first time this has happened, but it’s a lot bigger pain this time, which isn’t helped by not being entirely confident in how to manipulate Docker.

I still have not gotten the WordPress archive working, actually. I was getting “Connection Reset” errors, and now I am getting “Cannot establish Database connection” issues. It may all be for nothing after the troubles I have had recovering FreshRSS.
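
For context, the setup is basically the stock two-container arrangement, something like this (names and passwords are placeholders):

# A MySQL container and a WordPress container sharing a network
docker network create wp-net
docker run -d --name wp-db --network wp-net \
  -e MYSQL_ROOT_PASSWORD=changeme -e MYSQL_DATABASE=wordpress \
  -e MYSQL_USER=wpuser -e MYSQL_PASSWORD=changeme \
  -v wp-db-data:/var/lib/mysql mysql:5.7
docker run -d --name wp --network wp-net -p 8080:80 \
  -e WORDPRESS_DB_HOST=wp-db -e WORDPRESS_DB_NAME=wordpress \
  -e WORDPRESS_DB_USER=wpuser -e WORDPRESS_DB_PASSWORD=changeme \
  wordpress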

I have gotten FreshRSS fixed though. Getting it running in Docker was easy peasy. Getting my data back was… considerably less so. It’s been plaguing me for a few weeks now, but I have a solution. It’s not the BEST solution, but it’s… a solution. The core thing I needed was the feeds themselves. Lesson learned, I suppose, but I’m going to find a way to automate a regular dump of the feeds once everything is reloaded. I don’t need or care about favorited articles or the article contents. These were all stored in a MySQL database, and MySQL specifically seems to be what was corrupted and crashed out on the old Pi/instance, because I get a failed message on boot and I can’t get it to reinstall or load anymore.

Well, more accurately, I am pretty sure the root cause is that the SD card died, but it affected the DB files.
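
As for that automated dump of the feeds I mentioned, what I have in mind is just a scheduled job along these lines (paths, schedule, and database name are all placeholders):

# Weekly crontab entry; % must be escaped as \% inside a crontab
0 3 * * 0  mysqldump --single-transaction freshrss > /backups/freshrss-$(date +\%F).sql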

My struggle now is recovering data from these raw files. I’ve actually done this before, after a server crash years ago, but this round has led to many, many hurdles. One: 90% of the results when looking up how to do it are littered with unhelpful replies about using a proper SQL dump instead. If I could open MySQL, I sure as hell would do that. Another issue seems to be that the SQL server running on the Pi was woefully out of date, so there have been file compatibility problems.

There is also the issue that the data may just flat out BE CORRUPTED.

So I’ve spun up and tried to manually move the data to probably a dozen instances of MySQL and MariaDB of various versions, on Pis, in Docker, on WSL, in a Linux install. Nothing, and I mean NOTHING, has worked.

I did get the raw data pulled out though.

So I’ve been brute-forcing a fix. Opening the .ibd file in a text editor gives a really ugly chunk of funny characters. But strewn throughout this is a bunch of URLs for feeds and websites and, well, mostly that. I did a “Replace” in Notepad++ that stripped out a lot of the characters. Then I opened up PyCharm and did a find-and-replace with blanks on a ton of other ugly characters. Then I wrote up this quick and dirty Python script:

# Control F in Notepad++, replace, extended mode "\x00"
# Replace "   " with " "
# replace "https:" with " https:"
# rename to fresh.txt

## Debug and skip asking each time
file = "fresh.txt"
## Open and read the log file supplied
with open(file, encoding="UTF-8") as logfile:
    log = logfile.read()

# Split on spaces and keep anything that looks like a URL
datasplit = log.split(" ")
links = []

for each in datasplit:
    if "http" in each:
        links.append(each)

# Write the candidate URLs out, one per line
with open("output.txt", mode="w", encoding="UTF-8") as writefile:
    for i in links:
        writefile.write(i+"\n")

This splits everything up into an array and skims through it for anything with “http” in it, to pull out anything that is a URL. That has left me with a text file that is full of duplicates and has regular URLs next to feed URLs, though not in EVERY case, because that would be too damn easy. I could probably add a bunch of conditionals to the script to sort out anything with “feed”, “rss”, “atom”, or “xml” in it and get a lot of the cruft removed, but FreshRSS does not seem to have a way to bulk import a text list, so I still get to manually cut and paste each URL in and re-sort everything into categories.

It’s tedious, but it’s mindless, and it will get done.

Afterwards I will need to re-set up my WordPress autoposter script for those little news digests I’ve been sharing that no one cares about.

Slight update: I added some filtering and sorting to the code:

# Control F in Notepad++, replace, extended mode "\x00"
# Replace "   " with " "
# replace "https:" with " https:"
# rename to fresh.txt


## Debug and skip asking each time
file = "fresh.txt"
## Open and read the log file supplied
with open(file, encoding="UTF-8") as logfile:
    log = logfile.read()

datasplit = log.split(" ")
links = []

# Keep only likely feed URLs, trimmed to start at "http" and de-duplicated
for each in datasplit:
    if "http" in each:
        if "feed" in each or "rss" in each or "default" in each or "atom" in each or "xml" in each:
            # Trim leading junk before the URL, then check duplicates against the trimmed value
            trimmed = each[each.find("http"):]
            if trimmed not in links:
                links.append(trimmed)

links.sort()

with open("output.txt", mode="w", encoding="UTF-8") as writefile:
    for i in links:
        writefile.write(i+"\n")

SQL Woes

For the most part, managing my web server is pretty straightforward, especially because I don’t really get a ton of traffic. It’s mostly just keeping things up to date through standard channels.

Occasionally I have a bit of a brain-fart moment. I was recently doing regular Linux updates on the server and noticed a message I had seen before about some packages being held back. Occasionally I will go through and update these, because I am not really sure why they are being held back, and don’t really see any reason they should be.
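
For the record, poking at those held-back packages is simple enough, which is exactly how I got into trouble:

# See what is being kept back, then explicitly install one to force the upgrade
apt list --upgradable
sudo apt install mysql-server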

Then MySQL broke.

So I went digging in some logs and searching for solutions, and decided I needed to roll back the version. Following a guide I found, I discovered… I had done this before, which I now vaguely remembered. Because the old .deb file was still there from the last time I broke it.

Anyway, this didn’t fix it; MySQL still was not launching.

I decided that maybe it was time to just switch to MariaDB, which is essentially the community fork and spiritual successor to MySQL. And the process was simple enough; I would not even need to dump my databases. So I uninstalled MySQL, installed MariaDB and… It worked!

Then it stopped working.

I restarted the SQL service and it worked!

Then it…. Stopped working… Again…

So I checked the logs again and corrected some issues there, and again it worked; then a half hour or so later it stopped working.

One thing I had come across in troubleshooting the original MySQL issue was that there is a command, mysql_upgrade, that needed to be run to change how some tables are configured. I couldn’t do that before because I couldn’t even get MySQL to run. But I could get MariaDB to run, at least for a bit, and had successfully gotten this upgrade command to run.
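
The command itself is a one-liner, run while the server is up:

# Rebuild the system tables to match the running server version
sudo mysql_upgrade -u root -p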

So I decided to, once again, try MySQL. I uninstalled MariaDB and purged everything out, rebooting a few times to be sure. And MySQL would not even install anymore, so more purging, this time including the databases.

One thing I was glad I had decided to do, “just in case,” while MariaDB was “working,” was dump the databases out as backups. I was glad I did at this point. So with absolutely everything purged, MySQL installed and was working.
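
The dump-and-restore cycle is the standard one, roughly:

# Dump everything while the server is still up...
mysqldump -u root -p --all-databases > all-databases.sql
# ...then feed it back in on the fresh install
mysql -u root -p < all-databases.sql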

I set about recreating the databases from the dumps, and while I was at it I updated all the passwords, since I had to recreate the user accounts used by WordPress anyway.
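
Recreating the WordPress account is just a few statements at the SQL prompt (names and passwords here are placeholders):

# Recreate the application user and give it rights on its database
sudo mysql <<'SQL'
CREATE USER 'wpuser'@'localhost' IDENTIFIED BY 'new-password';
GRANT ALL PRIVILEGES ON wordpress.* TO 'wpuser'@'localhost';
FLUSH PRIVILEGES;
SQL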

And now everything is working smoothly again.

A couple of links that were actually helpful in solving my problem:

https://stackoverflow.com/questions/67564215/problems-installing-mysql-on-ubuntu-20-04

https://learnubuntu.com/install-mysql/

Fixing Cron Not Executing

Recently I encountered an issue I hadn’t run into before: my cron jobs were not running. Everything seemed correct, and I could manually run the commands at the CLI. I’ve had some issues before with getting things to run because I wasn’t using the complete path for programs, but this seemed to be something different.

The problem I found was that the root password needed to be changed. Running the following:

sudo grep CRON /var/log/syslog

Would output a long list of the same issue repeating over and over.

May 27 10:30:01 Webserver CRON[12943]: Authentication token is no longer valid; new one required
May 27 10:39:01 Webserver CRON[12978]: Authentication token is no longer valid; new one required
May 27 10:39:01 Webserver CRON[12977]: Authentication token is no longer valid; new one required
May 27 10:40:01 Webserver CRON[13049]: Authentication token is no longer valid; new one required

Running the following command:

sudo chage -l root

Would output something like:

Password expires               : never
Password inactive              : never
Account expires                : never

Which suggests the root password was never set to expire. So I ran the following command:

sudo passwd root

And set a new root password (which was the same as the old root password), and suddenly everything started working again. It felt like a really odd issue, especially considering I didn’t actually change the password, and as far as I could tell I had a root password. Plus the password wasn’t set to expire at all.

Anyway, I wrapped it up by doing an (optional) truncation of the system log, since the file had become unwieldy, with the following:

sudo truncate -s 0 /var/log/syslog