Technology

Windows Removed

A while back I mentioned a return to Linux in a Dual Boot situation on my Laptop. The motivation here, primarily, is that it’s a perfectly good laptop, and Windows 10 is “end of life” and going away. Also, frankly, I miss using Linux more. I ran Linux on my last laptop before it was eventually replaced.

I’ve taken the next step and completely removed Windows.

I’ve done this for one reason, which it may not fix, but it might. I have, occasionally, been getting random system freezes. I never got these in Windows. My “theory” here is as follows:

  • My laptop has two physical drives, one is one of the newer M2 whatever drives that looks like a long RAM stick chip, the other is an old school 2.5″ SSD. It seems really fucking weird to call an SSD “old school” like that, but whatever.
  • The SSD probably isn’t loose, but I have always felt like it fits a little loosely inside the bay. I even have a piece of folded paper crammed inside to help give it some cushion to stay in place.
  • I am worried that the SSD is slipping out a bit and causing issues.
  • Also it’s possible the SSD is failing, but I doubt it.

Whatever the case, my ultimate goal was to move Linux to the original M2 drive and replace Windows, and I’ve completed that task. It was surprisingly painless, but not without hiccups. I was also pretty nervous at first that I would hose up the laptop’s boot ability. As such, I wanted to give a general rundown of the process.

Just for a baseline of relevant points, with Windows my laptop:

  • Had the original M2 drive with Windows on it; in Linux, this shows as “sdb”
  • Had an added SSD with 2 partitions, one with the Linux Mint file system, “sda3”
  • Had one partition mapped as /home, labeled “sda2”
  • Had a handful of other smaller partitions on both drives
  • Had boot set up with UEFI enabled, Secure Boot disabled

My first step was getting rid of the Windows file system. I simply opened “Disks” in Mint, and deleted the Windows partition and the Recovery partition on sdb. This ended up being unnecessary, but I did it anyway. Just for the sake of my sanity, I also rebooted to make sure I could still boot to Linux Mint. I also made a note that “sda” is 240GB and “sdb” is 256GB.

I then downloaded Clonezilla and wrote it to a USB drive to boot from. This ended up being the most complicated step. All of the tools I usually use to make bootable USB drives from ISO files (Yumi, Rufus, Ventoy) seem to only work from Windows. I also didn’t have my Yumi USB drive handy to just put the ISO there and boot.

I came across this command, but it didn’t seem to actually boot:

sudo dd bs=4M if=/path/to/file.iso of=/dev/sdX status=progress oflag=sync

It’s possible this did work, but I’ll touch on that later.

Ultimately, I discovered that simply “right clicking” the .iso file had a menu option to “Burn to Bootable USB”.

So I did that.

I also found that in order to boot from USB, I had to enter the BIOS, and turn off UEFI boot in favor of Legacy boot.

I booted to Clonezilla. I started to do a partition to partition clone, but there didn’t seem to be an obvious way to clone sda3 (the file system) into the free/empty space of “sdb”. So instead I just did a full disk clone of sda onto sdb.

On reboot, I discovered that I needed to re-enable UEFI in order for things to work. After changing it back, I was in; Linux Mint booted just fine.

The next step was to clean up the duplicate partitions and resize each remaining partition to consume its respective drive. Actually, my actual next step was to verify which copy of Mint I was running. I was going to drop some placeholder test files, then reboot the machine and putz with the BIOS boot order. It turns out I got lucky, and I was already in the “new” copy on sdb. I was able to verify this because when I opened Disks, the “old/original file system” partition was not active. I simply deleted it.

I also deleted the extra copy of “Home” that I had created on sdb.

Just for my own sanity, I did another reboot. Everything worked fine, and my home files mounted properly as expected. I now had the file system on the M2 drive, sdb, and my home folder on the second drive, sda.
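As an aside, there’s a quicker way to double check this from the terminal than poking around in Disks. These are standard util-linux tools, nothing Mint-specific:

```shell
#!/bin/bash
# Show which device the root file system is actually mounted from
findmnt -n -o SOURCE /
# And an overview of all drives, their sizes, and mount points
lsblk -o NAME,SIZE,MOUNTPOINT
```

If the first command prints something on sdb, you know you’re running from the new copy.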

So now, I was ready to resize the partitions. Resizing the Home drive was easy: I, once again, opened Disks, then did a resize, and now it’s the full 240GB.

The file system was a bit less easy. Weirdly, I could, in the active file system, expand it to consume the 16GB or so at the “end” of the disk where the extra space now was (the new drive is 256GB vs the old SSD’s 240GB). What I couldn’t do was “pull it forward” to consume the 140GB or so that used to be the “home” partition. That makes sense in hindsight: growing a file system into space after it can be done while it’s mounted, but moving the start of a partition can’t, which is why this needed a boot disk.

So I went out and downloaded a copy of GParted this time, burned it to my USB stick, and rebooted. Now I was able to resize the file system to consume the entire remaining drive space.

This also meant toggling UEFI off and back on again.

Just for shits and giggles, I also decided to see what would happen if I enabled secure boot, which just entered into a weird boot loop. So I disabled that again, and finally, booted back into my now fully set up file system.

I can’t vouch yet for whether it actually fixed my freeze-up issue; that will just take more use to find out. I did also pick up some new memory for it, bumping it up to 16GB from 8GB, and at a slightly faster clock speed.

Windows 10 End of Life

I got a notice today on my secondary Desktop about Windows 10 going out of support later this year. Microsoft really wants people to upgrade to Windows 11.

Like, a lot.

I mean, I get it, and I have no problem with updating. I use Windows 11 at work, I use it on my main desktop. Aside from the annoyance that the Task Bar can’t be docked on the side or top, I don’t really notice.

But I can’t, not on this PC, due to…. Reasons…? Windows 11 is essentially just Windows 10 under the hood, so it seems really weird that I can’t update this machine. It’s most likely due to age. But this kind of leads me to another point.

This PC works just fine.

It’s just my previous desktop, off to the side. It does everything I need it to do, just fine. It could even do more than I need it to do, just fine. I mostly use it to run Docker containers and to host files. I occasionally use it to run a second Fortnite instance to play Bot Matches. It has gobs of memory for whatever task I throw at it, and I have considered getting a better GPU to do AI stuff with it (it already has a very nice GPU, just not AI nice).

Probably, at some point, I will just blow it out and load Ubuntu on it. I will probably lose my Fortnite ability, but I am kind of done with that anyway. I have already been slowly winding down my Windows-dependent use needs there. I need to figure out the process of transferring my Docker containers over first. I may test things out using my laptop, which already runs Linux.

Which is another point of contention here. I went ahead and just replaced Windows 10 on my 10 year old laptop. It still ran just fine. I primarily use it for writing and coding, but I do play some games on it too. Thankfully, Steam has made great strides in getting Linux support in the gaming world.

But it’s not just my laptop. All 3 of my kids and my wife have laptops. My son has a desktop as well. Only one of these laptops is Windows 11 compatible. I have no idea about my son’s desktop. But all of these PCs work plenty fine.

I know I keep pushing this, “It works fine” point, but part of that is because the Windows 11 “requirements” really feel like a weird appeasement to PC makers to try to “encourage people to upgrade hardware.” I feel like these people are greatly overestimating just how often people buy new hardware. I had a neighbor at my old place with a Windows XP machine he would ask me to work on sometimes. The reality is, an XP machine would work just fine for what he needed.

PC power basically just plateaued in usefulness a decade or so ago. It kind of feels like why there is such a big push for AI crap as well. “Get the new PC with an NPU! Get AI locally so it can make stuff up without the cloud!”

I could put Linux on some of these machines, but I already get grief over having to occasionally fix things on my family’s laptops as it is; I don’t really need that extra layer of grief AND a learning curve. I know it’s all much easier now, but like the upgrade cycle, it’s a “regular people” thing. It’s a “why doesn’t this scanner software work” thing, or a “why can’t I install my SIMS game” thing.

Though I will say this for my case: Linux Mint runs 1000x better than Windows 10 did. I do get occasional weird lock-ups though, which is annoying. It seems to be some sort of memory issue with Firefox, because it happens if I get too ambitious with tabs, which I very often do. It’s fine, fine, then suddenly the whole system is unresponsive.

But it also helps that I know how to use Linux already. I have been using it on some level now for like 25 years. I failed to install it the first time while at college in the early 2000s. But I have used it since. And prefer it.

Linux Post Install Clean-Up

So, now that I’ve returned to Linux again, I’ve come across several sort of, clean-up tasks that needed to be completed to get things working fully.  A lot of my activities are, by design, machine agnostic.  That is to say, they run off “the cloud”, either through a service or something I am hosting.

One big one I use is One Drive.  I don’t NEED One Drive running locally, but it’s convenient and nice to have.  Aside from just syncing and backing up all my writing through it, I also use it to do things like sync blog graphics files and screen shots.  I’ve found this One Drive Linux Client, which seems promising.  I’ve gotten it set up easily enough, but I have not quite worked out how to get it fully working with a selective sync.  I don’t need everything off my One Drive, and don’t have the drive space for that anyway.  So this one is pending a bit.
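For what it’s worth, the client I’m poking at (I believe it’s the abraunegg onedrive client; if you’re using a different one this may not apply) handles selective sync through a sync_list file, something like this (the folder names here are made-up examples, not my actual layout):

```
# ~/.config/onedrive/sync_list
# One include path per line; anything not listed here gets skipped.
Documents/Writing
Pictures/Blog Graphics
```

The catch, as I understand the docs, is that the client has to be run with a resync after this file changes, which may be where I’m getting tripped up.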

That hasn’t really slowed me down; I already also use GitHub for a lot of my writing as a secondary place with versioning, etc.  I made sure everything was up to date in Windows, then did a pull from the three remote repositories I care about: my Journal, my Digital Notes library, and my Web Clips library.  I made a few updates and made sure I had the workflow down for keeping things synced.  This also prompted the creation of a simple script to push everything at once.

#!/bin/bash  
git add -A  
git commit -m "Updated via Simple CLI Push"  
git push

I thought about adding the option to pass a custom commit message, but these are all private repositories, so I don’t really care what the commit messages are. I also added this to the shell so I can just run it with “gitpush” from anywhere.
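The “added this to the shell” part can be as simple as dropping the script into a folder on your PATH. The paths here are just my setup; ~/bin works nicely on Mint because the stock ~/.bashrc adds it to PATH when the folder exists:

```shell
#!/bin/bash
# Install the push script as a "gitpush" command in ~/bin
mkdir -p "$HOME/bin"
cat > "$HOME/bin/gitpush" <<'EOF'
#!/bin/bash
git add -A
git commit -m "Updated via Simple CLI Push"
git push
EOF
chmod +x "$HOME/bin/gitpush"
```

An alias in ~/.bashrc (alias gitpush='~/bin/gitpush') works just as well if you prefer that route.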

This also meant properly setting up SSH keys in GitHub, so I could actually pull the libraries. I also realized I would need to set up my SSH keypairs for my web server space, which wasn’t hard but was mildly inconvenient because account-based SSH is disabled. The simple solution was to re-enable it using the Digital Ocean console, add the keys, then disable it again.
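The GitHub side of this is just the standard keypair dance (the key file name and comment below are arbitrary choices of mine, not anything required):

```shell
#!/bin/bash
# Generate an ed25519 keypair with no passphrase (-N "") for GitHub
mkdir -p "$HOME/.ssh"
ssh-keygen -t ed25519 -C "laptop-github" -f "$HOME/.ssh/id_ed25519_github" -N "" -q
# This is the part you paste into GitHub under Settings -> SSH and GPG keys
cat "$HOME/.ssh/id_ed25519_github.pub"
```

After the public key is added, ssh -T git@github.com should greet you by username if everything took.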

Probably the biggest hassle I had was getting the two NTFS partitions mounted: one on the old primary Windows drive, and a second on the same physical secondary drive as the system. I mostly use this drive for “working files”. Ebooks to read, monthly file dumps off my phone, programming projects, etc.

It’s just files.

I could manually mount both drives when I started, but any reboot would unmount them. I went out and looked up the fstab settings to use, and had no luck. In fact, I had the opposite of luck, because at one point I couldn’t mount the secondary storage drive at all in Linux, only in Windows. I tried many options in both OSes, and finally just backed everything up and wiped the partition in favor of a native ext4 format.

Since I had all this space now, I remapped my /home/ folder to it, which is kind of good practice anyway, then copied everything from the old working files drive into a folder in my own home folder.
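For reference, once the partition is native ext4 the fstab line that makes the mount stick across reboots is pretty plain. The UUID below is a placeholder; sudo blkid prints the real one for your partition:

```
# /etc/fstab - mount the reformatted SSD partition as /home at boot
UUID=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx  /home  ext4  defaults  0  2
```

The NTFS version of this fight is exactly what I never got working reliably, which is part of why I gave up and reformatted.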

This ended up being a weird hassle too, because at one point I had “pre-copied” the working files before the migration, only to discover they had vanished when the /home/ folder was moved. I think what was happening was that they were not part of the encrypted blob, so the system simply ignored them. So I had to unmount everything, reboot (which failed, because now there were no user settings), drop to a recovery console, move the files OUT of the personal home folder, remount it all, then copy the files again from inside the OS, so they would receive the proper encryption and show up properly.

What a hassle, but it’s done.

The only real missing element here is that my copy of Affinity Photo is only licensed for Windows, so I’ll need to buy the Linux version. I don’t mind, I have been meaning to upgrade to Version 2 anyway. I think Version 2 even has a new style license that is OS agnostic.

One last thing I’d like to do is automount the network shares from my NAS and file server on boot, if present. I don’t always use the laptop at home though, which means this could be weird when it can’t access them. But I also have an OpenVPN tunnel to get to my home network, so there is probably a way to set it up so it always connects through that.
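A sketch of what I have in mind, as an fstab entry (the server name, share, and credentials file are all placeholders). The systemd automount options mean the share only gets mounted on first access rather than at boot, so starting the laptop away from home, or off the VPN, shouldn’t hang anything:

```
# /etc/fstab - NAS share mounted on demand instead of at boot
//nas.example.lan/files  /mnt/nas  cifs  credentials=/etc/samba/nas-creds,noauto,x-systemd.automount,x-systemd.idle-timeout=60,_netdev  0  0
```

This assumes an SMB share and the cifs-utils package installed; an NFS share would be a similar line with different options.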