PC Hardware

Hard Drive Woes Part 2

This post is a follow up to my previous Dead Hard Drive post.

I used to hassle with PC hardware a LOT more than I currently do. I’ve kind of worked my way out of that gig, honestly. I am at a point where I can mostly afford shit, for starters, so I’m not trying to cobble together workable machines from random parts. I also got tired of doing tech support for people, so I basically hide that I can, because when people find out you can “fix computers”, suddenly you’re vacuuming 50 years of dust out of a Pentium 1 in your backyard for a neighbor who refuses to just buy literally the cheapest machine at Wal-Mart for an infinite performance boost.

“Back in my day!” (fist shaking), you could pretty much just slap any drive with an operating system into any machine and it would boot. Sometimes it would boot into an ugly driverless environment because the drive was ripped from another machine, but that was fixable. Things seem more complicated these days. I’m not blaming UEFI and all that more secure BIOS stuff, exactly, but it’s a likely culprit. I think better security is good; it’s just also part of why I can’t more conveniently fix my damn PC.

I say conveniently, because that’s the core issue. I can still EASILY do it. It’s just… not convenient.

Shortly after messing with Linux for troubleshooting, I did a bit of setup to use it as the main driver, but decided to just go back to Windows. I downloaded a fresh recovery image, sliced the Linux partition down to 500GB, and reinstalled Windows.

I like Linux. I use Linux almost daily, if not daily. It’s great for automation tasks and running server software and all that. It kind of really sucks as a desktop OS. Don’t get me wrong, it’s usable, especially for simpler needs (literally anything that isn’t gaming or video/photo editing). I have run Linux as the sole OS on many machines, mostly laptops, and lots of Pis and servers. I’ve used Linux off and on for over 20 years now. The problem here is that the main use case for my “kick ass gaming rig” is, well, gaming. Half the games I had slated “to play” from Steam are not available on Linux. I set up Hero Launcher for GOG and Epic, but my cloud saves didn’t work, Fortnite doesn’t work, and the whole thing felt a little off. Graphics also felt a little off, even though I did switch to the official proprietary NVidia drivers.

Anyway, I went back to Windows. I spent an eternity downloading drivers and software and getting things set up properly. Unfortunately, the secondary drive I was now using as my primary is just too slow to handle the needs of a lot of games. I had to roll Fortnite back to DirectX 11, for example, because it would take like 10 minutes to drop into a match while it loaded shaders or some shit. For anyone not aware of how Fortnite works, it’s online, in an arena of players. If you drop in 10 minutes late, your character will have already landed in the map and probably be dead or dying.

So I bit the bullet and bought a new NVMe drive. I planned to eventually; I just did it sooner.

I went and downloaded Clonezilla to just mirror the hard drive to the NVMe drive, which worked, but things would not boot.

There are plenty of possible solutions online involving recovery mode. I tried a few of them, but in the end I opted to just reinstall Windows, again.

Which means redownloading drivers and shit… again….

I might be able to pull the Steam downloads over before wiping the secondary drive, but I am not sure Epic will let me do that. Unfortunately, the larger games are from Epic, with Fortnite, Death Stranding, and Final Fantasy 7R on that list.

It’s all, very easy.

It’s just all, very inconvenient.

Also, just because, and maybe for future reference, the install needs:

  • Network Driver – For some reason it doesn’t work with the generic driver.
  • TUF Gaming Armoury Crate – The motherboard seems to load this, and it finds and installs all the drivers, which is nice, despite the cheesy name.
  • Windows Update
  • Color Scheme to Dark, no transparency
  • Firefox – Browser of choice, then log into sync and let it pull all my stuff in.
  • Steam
  • Epic
  • Visual Studio Code
  • Change One Drive settings to sync only some things instead of everything.
  • Log into the Microsoft Account so One Drive and Office work, since having no network driver means local account login only at first
  • ShareX – For screenshots to folders
  • DisplayFusion – For rotating desktop wallpaper
  • Synergy KVM – So I can connect to my other PC
  • EVGA Flow Control – For the cooler
  • Remove all the cruft from the start menu, remove the apps list and recent files
  • Add a dozen network drives to File Explorer
  • Discord
  • Firestorm Viewer

A Dead Hard Drive

I came down last night to drop some stuff off in the basement and shut the curtains, and sat down to check on something at my desktop PC, I don’t even remember what. I was slightly surprised to see that it was sitting at the BIOS screen and not the Windows lock screen. My first assumption was that it did an update or something and the cat sat on the keyboard, causing it to enter the BIOS. They don’t usually sit on the keyboard, but it’s possible.

I rebooted the PC, and… it just loaded the BIOS again.

Clearly something more than a cat issue.

Both the 1TB M.2 NVMe drive and the 2TB add-on drive were showing in the BIOS menu, but no boot options were available. In fact, it even specifically said something along the lines of “No boot options.” I tried resetting the BIOS settings back to factory default (I had toggled a few things to enable virtualization), but it was no help.

I dug out a USB key with Linux and booted to that. I mostly wanted to see if I could access the drive at all. This had to be done through an extremely weird and annoying bright yellow screen where everything was washed out. The Live OS would boot fine and look fine until it actually got to the point of letting me do anything, when it suddenly seemed to give up its video driver, causing everything to go wonky.

I managed to squint my way through it, and the drive showed up, but it was not accessible at all.

So I swapped out Ubuntu for a Windows Recovery USB key. The recovery options (restore, recovery, etc.) all failed. These gave a bit more information: the drive was “locked”. I tried a few more options at the command line that I found:

  • bootrec /fixMBR
  • bootrec /fixBoot
  • bootrec /rebuildBCD

But none of these changed anything. I could probably download and run a Windows ISO, but for the moment, I’ve decided on a different route. I booted back into Ubuntu and just installed that on my 2TB spare drive. It would not take on the 1TB NVMe drive, and the 2TB secondary drive was just all games anyway, so nothing of value would be lost by wiping it clean.
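For future reference, since bootrec’s MBR-era fixes don’t do much on a UEFI install, the other commonly suggested approach is to mount the EFI partition from the recovery command prompt and rebuild the boot files with bcdboot. I can’t say it would have worked here given the “locked” drive, and the volume number below is just an example:

```
diskpart
list volume          (find the small FAT32 EFI volume)
select volume 3      (whichever number it actually is)
assign letter=S
exit
bcdboot C:\Windows /s S: /f UEFI
```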

I might, MIGHT, just try running this way for a while, though it does have some disadvantages. Mostly, games. Almost everything I’ve been actively playing lately was through the Epic store. And a lot of the games I planned to get to in Steam don’t work in Linux. There are ways to get them to work, though, which I want to look into, but I have not had time yet. I do know Fortnite is flat out not going to work; it has some strict anti-cheat which won’t run in any sort of emulated environment. Not a huge loss, I am kind of getting tired of it again anyway.

I also still have my old desktop I can use. So while Fortnite may not be out completely, it just won’t run quite as nicely. In fact, I can probably run most of the stuff that won’t work directly in Ubuntu on that machine.

Another thing worth mentioning: I am not really out anything file-wise. A handful of downloaded files for “TODO” projects that I could download again. I basically never work with files directly on any particular system anymore; it’s always files on the NAS or files in One Drive. The only things I really lost were the handful of custom Stable Diffusion embeddings I had created, and I have been meaning to rebuild better versions of those anyway.

It will be interesting to see how performance compares to Windows though. This PC is a pre-built gaming PC, so I am sure it’s been somewhat optimized for use with Windows. I have not had a chance to really test it out in a Linux environment at all yet, but I’m interested to see the results. I’d already been toying with the idea of running Linux on this machine, but I was worried about how it would handle things like the water cooler. I already don’t have the ability to control the lights on my keyboard and mouse, but there may be software available for that if I look into it.

All in all, I am irritated that the drive died, but I’ve taken it much more in stride than one might expect. I will probably poke at the Windows system some more as well though. The drive doesn’t really act like it’s dead, more like, it’s got some sort of software glitch going on.

New Desktop Upgrade

My new desktop I mentioned at the end of my last post arrived. I am fairly good about adjusting my workflow when changes or upgrades happen, but this particular change, by its nature, is incredibly disruptive. My whole process kind of stems out from a source, which for the most part is either my laptop or my desktop. If I were to get a new laptop, not much would change. Most of that workflow runs off of shared drives or cloud files. On the base level, the desktop works the same way; the real trick I get to deal with is the ripple effect downstream, which I will get into more in a bit.

The machine itself runs fine. It’s much, much quieter than I expected it to be, which is nice. The crazy neon lights aren’t as annoying as I expected, especially once I figured out how to adjust them to a more toned down, non-pulsating, single, cooler color. The day after ordering, I realized that I might need new monitor cables. I checked, and sure enough, all of the outputs save one are Display Port. My old setup consisted of a Display Port adapter, a DVI port, and an HDMI out, all three to DVI on the monitor end. I may update the monitors later, but it’s not necessary now.

The inside is crazy empty and clean. I’m not quite sure yet where I would even put additional hard drives, though I suspect they mount to the sides.

It certainly handles every game I’ve thrown at it incredibly smoothly, even with maxed out graphical settings. My current plan is to keep this machine relegated mostly to gaming (a bit more on this later), so I will keep it fairly clean and free from excess software. So far I’ve tried it out with Forza Horizon 4, Overwatch, Black Desert Online, Minecraft, Control, Quake 2, and Quake 2 RTX.

The RTX is really quite remarkable. The reflections are really neat and the shadows work very well. I look forward to finding more titles that take advantage of the ray tracing capabilities.

The Ripple Effect

Where the real change is happening is down the line. For the sake of maybe alleviating some confusion, I am going to go ahead and use the network names for my computers. The old workflow consisted of my Windows 10 desktop, Squall, that I originally put together back in 2012. It still runs everything perfectly fine; since 2012 I’ve bumped up the RAM quite a bit, updated to an SSD, and updated the video card to a 1050ti. Sitting under the desk next to that was Rinoa, an even older box that I am pretty sure I got second hand somewhere, running Xubuntu Linux. Xubuntu because it’s only 32-bit hardware. Its primary function was being a web server, for my Dashboard, and a file server. It’s got several old drives in it, all shared on the network to dump less important files to, because I’m a digital packrat.

The new machine is Cloud. See a pattern here yet? If it helps, my laptop is Selphie, my old laptop was Rikku, my old project server years ago was Quistis and before that Yuna, and my family’s laptops are Irvine, Barret, and RedXIII. They are all Final Fantasy characters.

Rinoa running 32-bit hardware has been a problem for a while. Several interesting projects I have found needed a 64-bit system to get up and running. It’s also woefully underpowered for anything robust, like running a Minecraft server. Rinoa has been desperately in need of replacing for a while. Which is where Squall comes in now. Squall will become the “new Rinoa”. Squall will become the new project server.

Making this change isn’t easy; it’s still not done, and I’ve been working on it for the last week. Squall is also a much more capable machine, so it changes the workflow a bit. Where Rinoa ran headless, I’ve decided to keep Squall on one of my three monitors for now and work with it using Synergy. I can offload things like running the web browser from Cloud to Squall if I want. I can also use Squall for Discord and IRC.

The first thing I decided to do was to move the web server aspect to a Raspberry Pi. I already had a Pi running a LAMP stack to host my WordPress archive blog. Moving the basic Dashboard was easy. Copy the files, import/export the database, and it worked, no problem. The harder part was moving the backend processes. I’ve started doing a lot of combination projects that often consist of some sort of Python or Bash script that dumps data to a database, plus a web based GUI. Like the Network Map, or my rudimentary Twitter Scheduler, or the web based download queue system for a particular web video downloading software that shall not be named. Getting these to work on the Pi is trickier, partially because I’d forgotten some steps. For example, I created some environment variables to open the database with scripts, so I didn’t have to put raw login credentials in them. I forgot how I had done that, so I converted them back to raw login credentials for now. I have others that are looking for commands from packages that need to be installed, which I’m not sure are available on the Pi.
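Since I clearly didn’t write it down last time, here’s roughly what the environment-variable trick looks like, as a sketch; the variable names are hypothetical, since I don’t remember the real ones:

```python
import os

# Hypothetical names; the real export lines would live in ~/.profile
# or /etc/environment, e.g.:
#   export DASH_DB_USER=dashboard
#   export DASH_DB_PASS=not-a-real-password
os.environ.setdefault("DASH_DB_USER", "dashboard")            # fallbacks so this
os.environ.setdefault("DASH_DB_PASS", "not-a-real-password")  # sketch runs standalone

# The script reads credentials from the environment instead of
# hard-coding them in the file itself.
db_user = os.environ["DASH_DB_USER"]
db_pass = os.environ["DASH_DB_PASS"]

# These then get handed to the database connector, e.g.
# (illustrative, not a tested call):
# conn = mysql.connector.connect(user=db_user, password=db_pass, database="dashboard")
```

The upside is the scripts can get copied between machines, or even posted somewhere, without ever containing a password.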

I’ll get it worked out, I just need a bit more time.

I’m also not real sure I want to move TT-RSS to the same Raspberry Pi, just because it’s constantly polling. I am not sure I want that level of read/write on an SD card and risk losing my other files. I will probably just set up a second Pi JUST to run the TT-RSS server.

The other major thing to move is the files. I started off by consolidating everything, for simplicity’s sake. I converted and consolidated my video files on my NAS recently, which freed up a lot of space. I’ve been meaning to re-allocate some files off of Rinoa back to the NAS, and I used the move as an excuse to do just that. At the same time, I consolidated the remaining files onto the largest of the various drives in Rinoa, so I can start off by just moving one drive to Squall to reshare on the network. Moving the web server to the Pi also meant giving the Pi an extra USB drive for more storage. The Video Downloader That Shall Not Be Named pulls video files, which are larger. I set up a new network share from the Pi for working files and moved all of the “working files” folders from Rinoa to the Pi.

The process overall isn’t complicated, it’s just time consuming with large data moves and some configuration changes.

Physical Set Up

Another aspect to adjust was the physical setup. Initially I just pulled Rinoa out and stuck Cloud in its place on the floor, partly because that was the best solution for the cable lengths available until the new Display Port cables arrived. I’ve got some pretty good cable management going on and I am not a fan of cables just hanging all over, a side effect of 15 years of working around equipment racks with impeccable cable management standards. Once the monitor cables arrived, I undid everything and reran all the cables, putting Squall on the floor and Cloud up on the little floor shelf under my desk. The main downside in the end is that the cleaner cable solution puts the large side window on the new PC against the backside, so I don’t get to see inside my PC all the time. Thankfully, I’m not a big fan of that aspect of the machine to start with.

Rinoa is going to just sit behind the monitors on a shelf for a bit until I finish with her, but in the end, the plan is to retire that machine out.

Going Forward

I am actually almost more excited about the prospect of molding Squall into the new project server than I am about the flashy new gaming rig. The 64-bit hardware and 24GB of RAM mean I can do a whole hell of a lot more than I ever could with Rinoa. I can set up a massive Minecraft server with a whole heap of worlds available. I can run Docker and everything that goes along with that. I can set up a robust and speedy OpenSIM world if I want. I also plan to continue to use Squall as my video edit machine; no need to bog down Cloud with all that extra overhead in software and disk space. I also can much more easily start playing around with VMs.

I could have done a lot of this before of course, but I find keeping all of that up and going on a machine you are also using for day to day use and gaming gets distracting, and you start running into resource use issues much more quickly.

A Second Hard Drive in My Aspire E 15

Recently I purchased an SSD for my wife’s Thinkpad. It wasn’t a big one, 256 gig, but her laptop is a little slow all around and the bottleneck seems to be mostly in the drive, which I am pretty sure is still old school spinning platters.

Unfortunately, the drive in her laptop is 320GB, so I couldn’t straight clone the drives. I could have done some partition size adjustments and made it work, but she was already fussing and worrying that I was going to lose some of her files, so I decided I’d just wait and get a larger one later.

I’d already planned to pick up a second one of these drives to add to my Laptop. The main drive is one of those funky newer styles that’s basically a circuit board, but it has an empty bay for a laptop drive. I stuck the new SSD in and went about using it. Nothing hard here at all.

To my surprise, the drive vanished a week or so later. Thankfully I didn’t stick it in my wife’s laptop, it was apparently bad. Or was it?

Turns out that because the drive is a little on the small side, even for a 2.5″ drive, and there isn’t any mechanism inside the laptop to secure the drive itself, it ended up coming loose and losing its connection.

It’s probably not the cleanest fix, but I stripped off half a sheet of paper, accordion folded it, and slipped it in between the drive and the laptop chassis. This applies pressure to the drive, holding it in place.

I haven’t had any trouble with the drive since. Still it’s kind of a crummy design.

A Tale of Two PCs

As a bit of a change of pace, I recently did a bit of work on the two actual PCs I am currently running. I’ve gone through a lot of desktops over the years, some getting more use than others. For a while I had like 5 or 6 old ones I had picked up here and there just sitting around collecting dust, but I’ve purged a lot of that out. Most of what I used to do with those extra PCs I can now do with Raspberry Pis or on my VPS. Everyone in the family uses a laptop, so no more need for a “Family Desktop”. I am down to two boxes now. OK, technically 3, but the third is an old PowerMac G4 that I mostly keep around because I think the case is cool.

First off, my personal desktop. At the moment it just runs Windows 10, sitting on a handful of drives for a total storage of 4TB, mostly filled with games. I built this machine almost 7 years ago. It’s nothing particularly special, and I have bumped up the RAM considerably since then. PC computing power really hasn’t gotten much better in the past few years, and what it mostly needed was a bump up in graphics power. So I swapped out the Radeon 6950 for an NVidia GTX 1050ti card. It’s not a top of the line super card, but it was within my price range and the performance boost is reasonably noticeable.

The biggest change is that I can run pretty much everything at maxed out graphics settings.  So far I’ve tested it on Overwatch, World of Warcraft, Grand Theft Auto V, and Battlefield 1.  Battlefield 1 in particular used to throw out an error about my GPU not being supported and GTA V had some screwy artifacting when it rained in game.  Also, maybe it’s a placebo effect, but I have noticed that I do better in Overwatch with heroes like Hanzo and Widowmaker who both require more precise long distance aiming.

That work was pretty easy, though I was sort of worried that the newer card wouldn’t work with my older Chip and Board.

On my other desktop tower, which is primarily used as a file storage server to supplement my Synology, I replaced a couple of dying hard drives. I don’t really remember where this tower came from, but it’s at least the same vintage as my main PC. It’s set up running Xubuntu with a collection of drives I’ve accumulated over time from various places and discarded PCs. It’s been complaining for a while on boot that one of the drives was bad, and another would give read errors occasionally. I copied everything off the read error drive; that one was easy. The other bad drive turned out to be the main drive, which finally gave up the ghost and stopped booting on me. I ended up making this problem worse when trying to clone the drive, because I apparently accidentally overwrote the drive with a ZFS pool file system. This is mostly notable because I’m not sure how it even happened. I have used ZFS briefly in the past when I was testing FreeNAS, but that system was a way bigger chore to use than just Ubuntu with Samba shares, so I scrapped it. So I’m not sure what was even cloned to create a 500GB ZFS partition.

Fortunately, there wasn’t any important data actually on the main filesystem drive. I think at worst I may have lost an unused Minecraft server setup and maybe a few webpages I had set up messing around with webdev stuff.

So after a ton of reboots on a live CD to determine which physical drive was which in the machine, I pulled out the two bad drives and replaced them with two “mostly good” drives. I then reloaded Xubuntu. I then reloaded Xubuntu again, because an encrypted file system seemed like a good idea, but I don’t want the hassle of entering a password every time the machine boots.

The real hassle here is getting everything configured. A quick rundown of the steps needed to get things to a basic level of use:

  • Set up the proprietary drivers for the GPU and motherboard, easy
  • Set a static IP that puts the machine where it’s supposed to be on the network, mostly easy.
  • Reinstall Synergy. Mostly easy, though I still need to get it to start on boot.
  • Install and set up SSH, easy
  • Reinstall Samba, easy
  • Get the system to auto mount the other hard drives on boot, mostly easy
  • Configure Samba to share those drives, mostly easy
  • Reinstall the LAMP stack

Fortunately, everything went pretty smoothly, other than I haven’t quite figured out the right method to get Synergy to start on boot. This is actually pretty critical, since unless the machine just boots up to a desktop with Synergy, I have to keep a keyboard and mouse attached. Part of the point here is that this box can just be squirreled away behind the desk and hooked to a monitor. It may already be set up, but I’ll probably set up Python on it as well. I still like to be able to putz around with scripts and web stuff, so it’s handy to have.
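On the Synergy-on-boot front, the approach I will probably try is an XDG autostart entry, which launches the client once the desktop session starts. The catch is that it only runs after login, so auto-login would also need to be enabled for a truly keyboard-free boot. The server address is a placeholder:

```
# ~/.config/autostart/synergy.desktop
[Desktop Entry]
Type=Application
Name=Synergy Client
Comment=Connect to the Synergy server at session start
Exec=synergyc 192.168.1.10
```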

PS, feel free to judge the dusty ass inside of that tower up there.