Linux & Open Source

Windows Removed

A while back I mentioned a return to Linux in a Dual Boot situation on my Laptop. The motivation here, primarily, is that it’s a perfectly good laptop, and Windows 10 is “end of life” and going away. Also, frankly, I miss using Linux more. I ran Linux on my last laptop before it was eventually replaced.

I’ve taken the next step and completely removed Windows.

I’ve done this for one reason, which it may not fix, but it might. I have, occasionally, been getting random system freezes. I never got this in Windows. My “theory” here is as follows:

  • My laptop has two physical drives. One is one of the newer M2 drives that looks like a long RAM stick, the other is an old school 2.5″ SATA SSD. It seems really fucking weird to call an SSD “old school” like that, but whatever.
  • The SSD probably isn’t loose, but I have always felt like it fits a little loose inside the bay. I even have a piece of folded paper crammed inside to give it some cushion and keep it in place.
  • I am worried that the SSD is slipping out a bit and causing issues.
  • Also it’s possible the SSD is failing, but I doubt it.

Whatever the case, my ultimate goal was to move Linux to the original M2 drive and replace Windows, and I’ve completed that task. It was surprisingly painless, but not without hiccups. I was also pretty nervous at first that I would hose up the laptop’s ability to boot. As such, I wanted to give a general rundown of the process.

Just for a baseline of relevant points, with Windows my laptop:

  • Had the original M2 drive with Windows on it; in Linux, this shows as “sdb”
  • An added SSD with two partitions, one with the Linux Mint file system, “sda3”
  • One partition mapped as /home, labeled “sda2”
  • A handful of other smaller partitions on both drives
  • Boot was set up for UEFI Enabled, Secure Boot Disabled
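Before touching anything, it’s worth confirming which disk is which from a terminal; something like this shows the layout at a glance (the device names and sizes below are from my setup and will differ on yours):

```shell
# List disks and partitions with size, filesystem, and mount point.
# On my machine, sdb was the 256GB M2 and sda the 240GB SSD.
lsblk -o NAME,SIZE,FSTYPE,MOUNTPOINT
```

Matching the sizes against what you know about the physical drives is the safest way to be sure before deleting anything.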

My first step was getting rid of the Windows file system. I simply opened “Disks” in Mint, and deleted the Windows partition and the recovery partition on sdb. This ended up being unnecessary, but I did it anyway. Just for the sake of my sanity, I also rebooted to make sure I could still boot to Linux Mint. I also made a note that “sda” is 240GB and “sdb” is 256GB.

I then downloaded Clonezilla and wrote it to a USB drive to boot from. This ended up being the most complicated step. All of the tools I usually use to make bootable USB drives from ISO files (Yumi, Rufus, Ventoy) seem to only work from Windows. I also didn’t have my Yumi USB drive handy to just put the ISO there and boot.

I came across this command, but it didn’t seem to actually boot:

sudo dd bs=4M if=/path/to/file.iso of=/dev/sdX status=progress oflag=sync

It’s possible this did work, but I’ll touch on that later.
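One way I could have checked whether the dd write actually took would be to compare the ISO against the start of the stick byte for byte; a sketch, with /dev/sdX standing in for the real device:

```shell
# Compare the first N bytes of the USB device against the ISO,
# where N is the size of the ISO file. A clean dd write should match.
ISO=/path/to/file.iso
DEV=/dev/sdX   # replace with the actual USB device
sudo cmp -n "$(stat -c%s "$ISO")" "$ISO" "$DEV" && echo "Write verified"
```

If that matches and the stick still won’t boot, the problem is boot configuration rather than the write itself.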

Ultimately, I discovered that simply right clicking the .iso file gave a menu option to “Burn to Bootable USB”.

So I did that.

I also found that in order to boot from USB, I had to enter the BIOS, and turn off UEFI boot in favor of Legacy boot.

I booted to Clonezilla. I started to do a partition to partition clone, but there didn’t seem to be an obvious way to clone sda3 (the file system) into the free/empty space of “sdb”. So instead I just did a full disk clone of sda into sdb.

On reboot, I discovered that I needed to reenable UEFI in order for things to work. After changing it back, I was in; Linux Mint booted just fine.

Next step was to clean up the duplicate partitions and resize each remaining partition to consume its respective drive. Well, my actual next step was to verify which copy of Mint I was running. I was going to drop some placeholder test files, then reboot the machine and putz with the BIOS boot order. It turns out I got lucky, and I was already in the “new” copy on sdb. I was able to verify this because when I opened Disks, the “old/original filesystem” partition was not active. I simply deleted it.

I also deleted the extra copy of “Home” that I had created on sdb.

Just for my own sanity, I did another reboot. Everything worked fine, and my home files mounted properly as expected. I now had the file system on the M2 drive sdb, and my home folder on the second drive of sda.

So now I was ready to resize the partitions. Resizing the home drive was easy: I, once again, opened Disks, then did a resize, and now it’s 240GB total.

The file system was a bit less easy. Weirdly, I could, in the active file system, expand it to consume the 16GB or so at the “end” of the disk where the extra space now was (the new drive is 256GB vs the old SSD’s 240GB). What I couldn’t do was “pull it forward” to consume the 140GB or so that used to be the “home” partition.

So I went out and downloaded a copy of GParted this time, burned it to my USB stick, and rebooted. Now I was able to resize the file system to consume the entire remaining drive space.
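For reference, the grow operation GParted performed can also be done from a live terminal; a rough sketch, assuming an ext4 file system on partition 3 of /dev/sdb (your device and partition numbers will differ):

```shell
# From a live environment, with the filesystem NOT mounted:
# grow partition 3 to the end of the disk, check the filesystem,
# then grow ext4 to fill the enlarged partition.
sudo parted /dev/sdb resizepart 3 100%
sudo e2fsck -f /dev/sdb3
sudo resize2fs /dev/sdb3
```

GParted is doing essentially this under the hood, just with guard rails.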

This also meant toggling UEFI off and back on again.

Just for shits and giggles, I also decided to see what would happen if I enabled secure boot, which just entered into a weird boot loop. So I disabled that again, and finally, booted back into my now fully set up file system.

I can’t vouch for whether it actually fixed my freeze-up issue yet; that will just take more use to confirm. I did also pick up some new memory for it, bumping it up to 16GB from 8GB, and at a slightly faster clock speed.

Linux Post Install Clean-Up

So, now that I’ve returned to Linux again, I’ve come across several sort of, clean-up tasks that needed to be completed to get things working fully.  A lot of my activities are, by design, machine agnostic.  That is to say, they run off “the cloud”, either through a service or something I am hosting.

One big one I use is OneDrive.  I don’t NEED OneDrive running locally, but it’s convenient and nice to have.  Aside from just syncing and backing up all my writing through it, I also use it to do things like sync blog graphics files and screenshots.  I’ve found this OneDrive Linux client, which seems promising.  I’ve gotten it set up easily enough, but I have not quite worked out how to get it fully working with a selective sync.  I don’t need everything off my OneDrive, and don’t have the drive space for that anyway.  So this one is pending a bit.
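If the client in question is the widely used abraunegg “onedrive” client (an assumption on my part), selective sync is handled through a sync_list file rather than any GUI, which may be the piece I was missing:

```shell
# ~/.config/onedrive/sync_list -- only paths listed here get synced.
# The folder names below are placeholders for whatever you actually keep.
mkdir -p ~/.config/onedrive
cat > ~/.config/onedrive/sync_list <<'EOF'
Documents/Writing
Pictures/Blog
EOF
# After editing sync_list, a full resync is required, e.g.:
#   onedrive --synchronize --resync
# (newer versions of the client use --sync instead of --synchronize)
```

Worth double-checking against whichever client actually got installed, since flags vary between versions.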

That hasn’t really slowed me down, I already also use GitHub for a lot of my writing as a secondary place with versioning, etc.  I made sure everything was up to date in Windows, then did a pull from the three remote repositories I care about, my Journal, my Digital Notes library, and my Web Clips library.  I made a few updates and made sure I had the workflow down for keeping things synced.  This also prompted the creation of a simple script to push everything at once.

#!/bin/bash  
git add -A  
git commit -m "Updated via Simple CLI Push"  
git push

I thought about adding the option to add a custom commit message, but these are all private repositories so I don’t really care about what the commit messages are. I also added this to the shell so I can just run it with “gitpush” from anywhere.
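For what it’s worth, supporting an optional custom message only takes one small change, falling back to the default when no argument is given; a sketch:

```shell
#!/bin/bash
# gitpush -- stage everything, commit, and push in one shot.
# Usage: gitpush ["optional commit message"]
git add -A
git commit -m "${1:-Updated via Simple CLI Push}"
git push
```

Dropping it in ~/bin (or anywhere on $PATH) and marking it executable keeps the “run gitpush from anywhere” behavior.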

This also meant properly setting up SSH keys in Github, so I could actually pull the libraries. I also realized I would need to set up my SSH Keypairs for my web server space, which wasn’t hard but was mildly inconvenient because account based SSH is disabled. The simple solution was to reenable it using the Digital Ocean console, add the keys, then disable it again.
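The keypair setup itself is the standard routine; a sketch for reference (the comment string is just a placeholder label):

```shell
# Generate a modern ed25519 keypair (accept the default path, set a passphrase)
ssh-keygen -t ed25519 -C "laptop-linux"

# Print the public key, to paste into GitHub under Settings -> SSH keys
cat ~/.ssh/id_ed25519.pub

# Verify GitHub accepts the key
ssh -T git@github.com
```

The same public key is what gets added to the server's authorized_keys via the console trick described above.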

Probably the biggest hassle I had was getting the two NTFS partitions mounted: one on the old primary Windows drive, and a second on the same physical secondary drive as the system. I mostly use this drive for “working files”: ebooks to read, monthly file dumps off my phone, programming projects, etc.

It’s just files.

I could manually mount both drives when I started, but any reboot would unmount them. I went out and looked up the fstab settings to use, and had no luck. In fact, I had the opposite of luck because at one point, I couldn’t mount the secondary storage drive at all in Linux. Only in Windows. I tried many options in both OSes, and finally just, backed everything up and wiped the partition in favor of a native ext4 format.
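For anyone trying the fstab route before giving up like I did, the general shape that is supposed to work for NTFS looks like this (the UUID is a placeholder; the real one comes from `blkid`):

```shell
# /etc/fstab entry for an NTFS partition -- UUID is a placeholder,
# find the real one with: sudo blkid
# uid/gid 1000 is typically the first desktop user.
UUID=XXXX-XXXX  /mnt/working  ntfs-3g  defaults,uid=1000,gid=1000,umask=022  0  0
```

No promises; in my case even entries like this never stuck, which is part of why I went ext4.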

Since I had all this space now anyway, I remapped my /home/ folder to it, which is kind of good practice anyway, then copied everything from the old working files drive into a folder in my own home folder.

This ended up being a weird hassle too, because at one point I had “pre-copied” the working files, before the migration, only to discover they had vanished when the /home/ folder was moved. I think what was happening was that they were not part of the encrypted blob, so the system simply ignored them. So I had to unmount everything and reboot, which failed because now there were no user settings, then drop to a recovery console, move the files OUT of the personal home folder, remount it all, and copy the files, again, from inside the OS, so they would receive the proper encryption and show up properly.

What a hassle, but it’s done.

The only real missing element here is that my copy of Affinity Photo is only licensed for Windows, so I’ll need to buy the Linux version. I don’t mind, I have been meaning to upgrade to version 2 anyway. I think Version 2 even has a new style license that is OS agnostic.

Another last one I’d like to do is automount the network shares from my NAS and file server on boot, if present. I don’t always use the laptop at home, though, which means this could be weird when it can’t access them. But I also have an OpenVPN tunnel to get to my home network, so there is probably a way to set it up so it always connects through that.
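The usual trick for “mount if present” is systemd’s automount option in fstab, which only attempts the mount on first access and times out quietly when the share is unreachable; a sketch with placeholder hostnames and paths:

```shell
# /etc/fstab -- mount the NAS share on first access rather than at boot,
# and give up quickly when it isn't reachable. Hostname, share name,
# and credentials path are all placeholders.
//nas.local/share  /mnt/nas  cifs  credentials=/etc/nas-creds,noauto,x-systemd.automount,x-systemd.idle-timeout=60,_netdev  0  0
```

That should degrade gracefully away from home, with or without the VPN up.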

Linux, Again

So, I am back on Linux again. On my laptop at least. I never really STOPPED using Linux, I use it on Raspberry Pis and my Webserver and in the Windows Subsystem for Linux all the time.

Just for the record, I just went with boring Mint Linux.

But I have moved back to using it full time again on my laptop. I still have the Windows partition for now, but I seriously doubt I am ever going to go back to it again. And FWIW, I have previously used only Linux on laptops in the past. The concept is not at all alien to me.

I have wanted to make the switch back for a while but was a bit hobbled. I had been using an app to tether connectivity off my phone over USB. I could never get it to reliably work in Linux. I discovered recently that my cell plan now just includes regular WiFi based hotspotting, so I can just connect via WiFi when needed.

This was literally the only hurdle.

Another motivation though is the end of life for Windows 10 coming next year. I really feel like this is going to get pushed out because Windows 10 is really going strong still. But just in case, I need to get off of it.

Side note: Microsoft is really overestimating just how often people upgrade their PC hardware with their push from Windows 10 to 11. The blocking factors of the upgrade are extremely arbitrary, and most of the PCs that can’t be updated to Windows 11 still function just fine for 90% of use cases for “regular people.”

Anyway, I plan to keep using my laptop for the foreseeable future. So moving to Linux is the best option.

I also want to use it as a bit of a test bed for migrating my project PC to Linux as well, mostly I have questions relating to Docker and the easiest solution will just be to test it.

Anyway, the migration itself was surprisingly smooth. Most of my workflow has been shifting to be very “floaty”. Almost all my writing for example, is Joplin and local files, which are in a private Github repository.

Joplin just worked, and then I set up Git and pulled the repository down. Visual Studio Code has a Linux version; I think it was even pre-installed. I have already been using it as a text editor, so I am familiar with the best ways to set things up.

The real missing piece is OneDrive syncing, but it’s something I can work around, especially since these folders already sync via my Synology.

Most everything else I use on the Laptop was just a matter of making a list in Windows and then downloading them in Linux. Mostly I just use the thing for writing and for sorting image files off my phone.

Dead Memory Cards and Using Docker

More often than it feels like it should, something in technology breaks or fails. I find that this can be frustrating, but often ultimately good, especially for learning something new, and forcing myself to clean up something I’ve been meaning to clean up. I have a Raspberry Pi I’ve been using for a while for several things as a little web server. It’s been running probably for years, but something gave out on it. I’m not entirely sure if it’s the SD card or the Pi itself, honestly, because I’ve been having a bit of trouble trying to recover through both. It’s sort of pushed me to try a different approach.

But first I needed a new SD card. I have quite a few, most are “in use”. I say “in use” because many are less in use and more, underused. This has resulted in doing a bit of a rebuild on some other projects to make better use of my Micro SD cards. The starting point was an 8GB card with just a basic Raspbian setup on it.

So, for starters, I found that the one I have in my recently set up music station Raspberry Pi is a whopping 128GB. Contrary to what one might think, I don’t need a 128GB card in my music station; the music is stored on the NAS over the network. It also has some old residual projects on it that should really be cleaned out.

So I stuck the 8GB card in that device and did the minor setup needed for the music station. Specifically, configure VLC for remote control over the network, then add the network share. Once I plugged it back into my little mixer and verified I could remote play music, I moved on.
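For the curious, the remote control piece boils down to starting VLC with one of its network interfaces enabled; a sketch from memory (flags and port are worth double-checking against your VLC version, and the password is obviously a placeholder):

```shell
# Start VLC with its web interface listening on the network, so a
# phone or another PC on the LAN can drive playback.
vlc --extraintf http --http-host 0.0.0.0 --http-port 8080 --http-password secret
```

Most of the VLC remote apps for phones talk to exactly this HTTP interface.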

This ended up being an unrelated side project though, because I had been planning on getting a large, speedy Micro SD card to stick in my Retroid Pocket. So I stuck that 128GB card in the Retroid and formatted it. This freed up a smaller, 32GB card.

I also have a 64GB card that was basically not being used in my PiGrrl project, which I decided to recover for use. The project was fun, but the Retroid does the same thing 1000x better. So now it’s mostly just a display piece on a shelf. Literally an overpriced paperweight. I don’t want to lose the PiGrrl configuration though, because it’s been programmed to work with the small display and IO control inputs. So I imaged that card off.

In the end, though, I didn’t need those Micro SD cards; I opted for an alternative option to replace the Pi: Docker on my secondary PC. I’ve been meaning to better learn Docker, though I still find it to be a weird and obtuse bit of software. There are a handful of things I care about restoring that I used the Pi for.

  • Youtube DL – There seem to be quite a few nice Web Interfaces for this that will work much better than my old custom system.
  • WordPress Blog Archives – I have exported data files from this but I would like to have it as a WordPress Instance again
  • FreshRSS – My RSS Reader. I already miss my daily news feeds.

YoutubeDL was simple, they provided a nice basic command sequence to get things working.
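For the record, the web interface route really is a one-liner with Docker; this sketch uses MeTube as an example front-end (the image name and port are from memory, so double-check them against the project's README):

```shell
# Run a yt-dlp web UI in Docker; downloads land in ./downloads on the host
docker run -d \
  --name metube \
  -p 8081:8081 \
  -v "$PWD/downloads:/downloads" \
  ghcr.io/alexta69/metube
```

After that it's just a browser tab at port 8081, which beats my old custom system handily.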

The others were a bit trickier. Because the old setup died unexpectedly, the data isn’t easily exported for import, which means digging out and recovering off of the raw database files. This isn’t the first time this has happened, but it’s a much bigger pain this time, which isn’t helped by not being entirely confident in how to manipulate Docker.

I still have not gotten the WordPress archive working actually. I was getting “Connection Reset” errors and now I am getting “Cannot establish Database connection” issues. It may be for nothing after the troubles I have had dealing with recovering FreshRSS.

I have gotten FreshRSS fixed though. Getting it running in Docker was easy peasy. Getting my data back was… considerably less so. It’s been plaguing me for a few weeks now, but I have a solution. It’s not the BEST solution, but it’s… a solution. So, the core thing I needed were the feeds themselves. Lesson learned, I suppose, but I’m going to find a way to automate a regular dump of the feeds once everything is reloaded. I don’t need or care about favorited articles or the articles’ contents. These were stored in a MySQL database. MySQL, specifically, seems to be what was corrupted and crashed out on the old Pi/instance, because I get a failed message on boot and I can’t get it to reinstall or load anymore.

Well, more, I am pretty sure the root cause is the SD card died, but it affected the DB files.

My struggle now is recovering data from these raw files. I’ve actually done this before after a server crash years ago, but this round has led to many, many hurdles. One, 90% of the results looking up how to do it are littered with unhelpful replies about using a proper SQL dump instead. If I could open MySQL, I sure as hell would do that. Another issue seems to be that the SQL server running on the Pi was woefully out of date, so there have been file compatibility problems.

There is also the issue that the data may just flat out BE CORRUPTED.

So I’ve spun up and tried to manually move the data to probably a dozen instances of MySQL and MariaDB of various versions, on Pis, in Docker, on WSL, in a Linux install. Nothing, and I mean NOTHING has worked.

I did get the raw data pulled out though.

So I’ve been brute forcing a fix. Opening the .ibd file in a text editor gives a really ugly chunk of funny characters. But, strewn throughout this is a bunch of URLs for feeds and websites and, well, mostly that. I did a “Replace” in Notepad++ that stripped out a lot of the characters. Then I opened up PyCharm and did a find and replace with blanks on a ton of other ugly characters. Then I wrote up this quick and dirty Python script:

# Control F in Notepad++, replace, extended mode "\x00"
# Replace "   " with " "
# replace "https:" with " https:"
# rename to fresh.txt

## Debug and skip asking each time
file = "fresh.txt"
## Open and read the log file supplied
with open(file, encoding="UTF-8") as logfile:
    log = logfile.read()

datasplit = log.split(" ")
links = []

for each in datasplit:
    if "http" in each:
        links.append(each)

with open("output.txt", mode="w", encoding="UTF-8") as writefile:
    for i in links:
        writefile.write(i+"\n")

Which splits everything up into an array, and skims through the array for anything with “http” in it, to pull out anything that is a URL. This has left me with a text file that is full of duplicates and has regular URLs next to feed URLs, though not in EVERY case, because that would be too damn easy. I could probably add a bunch of conditionals to the script to sort out anything with the word “feed”, “rss”, “atom” or “xml” and get a lot of the cruft removed, but FreshRSS does not seem to have a way to bulk import a text list, so I still get to manually cut and paste each URL in and resort everything into categories.
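One possible way around the lack of a bulk text import: FreshRSS does import OPML, so the URL list could be wrapped in a minimal OPML file with a few lines of shell (this assumes the URLs contain no characters that need XML escaping, like a bare &):

```shell
# Wrap each URL in output.txt as an OPML outline that FreshRSS can import
{
  echo '<?xml version="1.0" encoding="UTF-8"?>'
  echo '<opml version="2.0"><head><title>Recovered feeds</title></head><body>'
  while read -r url; do
    printf '  <outline type="rss" text="%s" xmlUrl="%s"/>\n' "$url" "$url"
  done < output.txt
  echo '</body></opml>'
} > feeds.opml
```

Categories would still need sorting by hand afterward, but it would at least skip the one-at-a-time pasting.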

It’s tedious, but it’s mindless, and it will get done.
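As for the regular feed dump I want to automate, FreshRSS ships CLI scripts that should cover it; a sketch, assuming the container is named freshrss and the user is admin (both assumptions, and worth checking against the FreshRSS docs):

```shell
# Nightly OPML export of one user's feed list, kept outside the container.
# Could live in a script called from cron, e.g.:
#   0 3 * * * /usr/local/bin/freshrss-export.sh
docker exec freshrss cli/export-opml-for-user.php --user admin \
  > "/backups/freshrss-feeds-$(date +%F).opml"
```

That would have turned this whole recovery saga into restoring a single small file.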

Afterwards I will need to reset up my WordPress Autoposter script for those little news digests I’ve been sharing that no one cares about.

Slight update, I added some filtering and sorting to the code:

# Control F in Notepad++, replace, extended mode "\x00"
# Replace "   " with " "
# replace "https:" with " https:"
# rename to fresh.txt


## Debug and skip asking each time
file = "fresh.txt"
## Open and read the log file supplied
with open(file, encoding="UTF-8") as logfile:
    log = logfile.read()

datasplit = log.split(" ")
links = []

for each in datasplit:
    if "http" in each:
        if "feed" in each or "rss" in each or "default" in each or "atom" in each or "xml" in each:
            if each not in links:
                links.append(each[each.find("http"):])

links.sort()

with open("output.txt", mode="w", encoding="UTF-8") as writefile:
    for i in links:
        writefile.write(i+"\n")

SQL Woes

For the most part, managing my web server is pretty straightforward, especially because I don’t really get a ton of traffic. It’s mostly just keeping things up to date through standard channels.

Occasionally I have a bit of a brain fart moment. I recently was doing regular Linux updates on the server. I noticed a message I had seen before about some packages being held back. Occasionally I will go through and update these, because I am not real sure why they are being held back, but don’t really see any reason they should be.

Then MySQL broke.

So I went digging in some logs and searching for solutions, and decided I needed to roll back the version. Following a guide I found, I discovered… I had done this before, which I now vaguely remembered, because the old .deb file was still there from the last time I broke it.

Anyway, this didn’t fix it, MySQL still was not launching.

I decided that maybe it was time to just switch to MariaDB, which I believe is the spiritual successor to MySQL. And the process was simple enough, I would not even need to dump my Databases. So I uninstalled MySQL, installed MariaDB and… It worked!

Then it stopped working.

I restarted the SQL service and it worked!

Then it…. Stopped working… Again…

So I checked logs again and corrected some issues there and again it worked, then a half hour or so later it stopped working.

One thing I had come across in troubleshooting the original MySQL issue was a command, mysql_upgrade, that needed to be run to change how some tables are configured. I couldn’t do that before because I couldn’t even get MySQL to run. But I could get MariaDB to run, at least for a bit, and had successfully gotten this upgrade command to run.

So I decided to once again try MySQL, so I uninstalled MariaDB and purged everything out, rebooting a few times to be sure. And MySQL would not even install anymore, so more purging, this time including the databases.

One thing I was glad I had decided to do, “just in case,” while MariaDB was “working,” was dump the databases out as backups. I was glad I did at this point. So with absolutely everything purged, MySQL installed and was working.
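The “just in case” dump is a single command, and it’s the habit this whole saga argues for; a sketch (credentials are placeholders):

```shell
# Dump every database, including stored routines and events,
# to a dated file that can be restored into a fresh install.
mysqldump -u root -p --all-databases --routines --events \
  > "all-databases-$(date +%F).sql"
```

A dated filename makes it easy to keep a few generations around without thinking about it.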

I set about recreating the databases from the dumps, and while I was at it updated all the passwords, since I had to recreate the user accounts used by WordPress anyway.
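Restoring was roughly the reverse, plus recreating the application account; a sketch of the shape of it (user, password, and database names here are placeholders, not my real ones):

```shell
# Restore everything from the dump file
mysql -u root -p < all-databases-2024-01-01.sql

# Recreate the user WordPress connects with, with a fresh password
mysql -u root -p <<'SQL'
CREATE USER 'wpuser'@'localhost' IDENTIFIED BY 'new-strong-password';
GRANT ALL PRIVILEGES ON wordpress_db.* TO 'wpuser'@'localhost';
FLUSH PRIVILEGES;
SQL
```

The new password then just needs to match what’s in wp-config.php.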

And now everything is working smoothly again.

A couple of links that were actually helpful in solving my problem.

https://stackoverflow.com/questions/67564215/problems-installing-mysql-on-ubuntu-20-04

https://learnubuntu.com/install-mysql/