Server updates

I thought this would be the last update for this server this year, but then I remembered I’ll be updating the SSD storage soon.

I don’t remember if I’ve mentioned this, but I bought two cheap sticks of ECC RAM. Annoyingly, the cheaper one turned out to be rather defective, so I went and got another cheap stick. Thankfully my second gamble worked, and the system is now running ECC across most of the stack. The total cost is about 24k, including the loss from the defective stick.

In the process I realized the system had been running with just two fans – one intake and one exhaust. I fixed that by adding three more fans, so the total is now three intakes (two at the front, one at the bottom) and two exhausts (back and side).

I’ll probably move the side exhaust to the top sometime later because it’s a bit too close to the bottom intake fan. Moving it to the top may cause another problem though, as it’ll blow warm air at my router, which sits on top of the case. The top of the case is already rather crowded, so I can’t move the router anywhere else.

Now that I think about it, I’ll probably keep it like this for now.

The other upgrade coming soon (a pair of 500GB SSDs) should fix the lack of space for my home directory (which is shared with the root filesystem, whoops).

Server upgrade (part 2)

So FreeBSD has run stable on my latest Ryzen setup for at least 18 hours. I guess it’s safe to say it’s stable now.

The x1 graphics card has arrived, and it works without problems. It sure is nice having a tiny graphics card. Maybe I should get more of these.

There was a small hitch after I changed the SATA cabling back to the HBA. Some of the drives weren’t detected properly for some reason. I fiddled with the cables a bit, and thankfully everything came back up normally.

With this, the upgrade is mostly done. Apart from the ethernet card, that is – I’m still waiting for the ones with the correct bracket to arrive – and the ECC RAM, which will take a while until I save enough money.

There’s also the SMR drive situation. I recently learned that manufacturers have started switching to SMR drives, which have relatively low random write speed. That explains why the resilver time was so horrible back when I upgraded the pool. Thankfully they can buffer some burst load, although it’s not always enough for ZFS operations. Read speed should be mostly fine, so I’m thinking of keeping these drives until they die and then buying regular PMR drives as replacements. Unfortunately, those will be a bit expensive.
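For a rough sense of why SMR hurts resilvering so much, here’s a toy estimate. The throughput figures are made-up illustrative assumptions, not measurements from my drives, and real resilver times also depend on pool layout and record sizes:

```python
def resilver_hours(pool_used_tb: float, write_mbps: float) -> float:
    """Rough resilver time: data to rewrite divided by sustained write speed."""
    seconds = pool_used_tb * 1e12 / (write_mbps * 1e6)
    return seconds / 3600

# Hypothetical numbers: a PMR drive sustaining ~150 MB/s vs an SMR drive
# falling to ~40 MB/s once its CMR cache fills up under scattered writes.
print(round(resilver_hours(4, 150), 1))  # → 7.4
print(round(resilver_hours(4, 40), 1))   # → 27.8
```

Even with generous assumptions, the drop in sustained write speed turns an overnight resilver into a multi-day one.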

Oh, I almost forgot one more upgrade coming whenever this pandemic situation is over. I’m thinking of taking home a pair of 500GB-ish SSDs currently sitting idle in my work dev server. I can use them for my home partition, as the (NVMe) disk is getting full at 70%, and my home directory somehow accounts for 100GB (20%) of it. A slightly slower home directory will be a bit sad, but it’s better than a full disk.
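As a side note, keeping an eye on how full a filesystem is doesn’t need anything fancy; here’s a minimal sketch with Python’s standard library (the 70% threshold is just the number from this post, not a recommendation):

```python
import shutil

def fullness(path: str) -> float:
    """Fraction of the filesystem containing `path` that is in use."""
    usage = shutil.disk_usage(path)
    return usage.used / usage.total

# Warn once the disk crosses the 70% mark mentioned above.
if fullness("/") > 0.70:
    print("time to offload the home directory")
```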

Unrelated, but looking at it again, I should have considered an X470 motherboard instead of X570. It’s a bit cheaper and doesn’t need a fan. It also has a better PCIe configuration at x8/x8 instead of my current X570’s x16/x4. Too bad one of the M.2 NVMe slots is only PCIe Gen 2 x2 (it’s a cheap board). That’s a bit on the slow side for an NVMe disk. Still, I probably should’ve gone with that one. I might even have been able to fit an x16 graphics card in the x1 slot. It’s too late now. RIP me.

There’s also the ASRock Ryzen server motherboard, which is priced quite a lot lower than I thought at 30k. But that one is, well, still more expensive. And involves buying directly from overseas.

Server upgrade

Not the final form

I’ve been considering this on and off for quite a long time, as I noticed Intel parts post-Ivy Bridge aren’t going to get much cheaper. And then during my 10Gbit upgrade a while back, I learned my server could barely handle half of the available 10Gbit. There’s also the problem that I need a bit more RAM, but I don’t want to buy any more DDR3 sticks as it’s a dead platform by now.

Thankfully Ryzen continued AMD’s tradition (?) of not locking the ECC feature on most systems, so I upgraded to it three years ago. And my server crashed. A lot. It was unstable. I tried again two years ago, but it was still crashing. I ended up selling the system and bought a cheap Ivy Bridge server board from ebay last year. It has held up pretty well. It even got an NVMe upgrade earlier this year.

The thing is, just like with the desktop, I sure could use a faster CPU. The Ryzen 3000 series brought a big per-core performance (IPC) increase – much bigger than the 1000 series. And it has gotten pretty cheap, at least in the 6-core realm.

I upgraded my desktop at the end of last year, and now it’s the server’s turn. Except unlike the desktop, there’s no good deal this time around. It didn’t help that I need more PCIe slots than are usually available on cheap motherboards. And I actually wondered if I should wait for B550 and see how it goes, especially considering X570 requires a fan for its southbridge.

But I ended up getting X570 anyway because I didn’t want to wait any longer 🙂 I’ve resumed doing some hobby dev work recently and could sure use an upgrade, especially as my plan for a VM on the desktop system with NFS-backed storage didn’t go quite so well.

Anyway, I upgraded the motherboard (ASRock X570 Pro4) and CPU (AMD Ryzen 5 3500). For RAM I took two sticks from my desktop, which currently has a bit too many. Those will need to be upgraded to ECC sometime later when the budget permits.

For the graphics card, as this isn’t a server board and the CPU has no onboard GPU, I’m getting a cheap PCIe x1 GT 710 1GiB from Zotac. It cost a bit under 5k on Amazon Outlet. It’s second hand but should be fine. I hope. It hasn’t arrived yet, so I’m currently using another fanless GPU I have, but that one takes up an x16 slot: even though the PCIe x1 slots are open-ended, there isn’t enough clearance for an x16 card.

With one of the only two x16 slots used by the graphics card, I’m stuck with my HBA in an x1 slot. Thankfully the motherboard has loads of SATA ports (8), so I only need two from the HBA. There’s no cable management though, because 1) it’s temporary; and 2) it’s way more annoying with 10 SATA cables in total instead of just two SFF-8087 and two SATA. That should be fixed this weekend.

There’s also the network card bracket problem. The brackets I mentioned weeks ago finally arrived, but the size didn’t match. Good job, Fujitsu. I couldn’t find brackets for those cards either, so I’m getting another pair of cards – assuming they actually arrive, as they’ve been stuck in China for a bit over two weeks now. I just hope they actually arrive. And that they actually work. That would be nice.

The unused board, CPU, and RAM will be repurposed for my work dev server. My current one is pretty similar, just one generation behind (E3-1235 vs E3-1230v2). I do need a new case and PSU, but those shouldn’t be too expensive. Combined, the server will have plenty of RAM (32GiB).

That said, I don’t know if this 3000 series of Ryzen is finally stable enough for FreeBSD. That’s actually the most important thing, as otherwise I’ll be forced back to the old system and will have to figure out what to do with this board and CPU. I’ll report back when I get the correct graphics card, I guess. Or earlier if it still crashes.

Bonus photo:

This is definitely not how to install a card. It does work though

Realtek and Hyper-V weirdness

Earlier this morning I finally bothered to install an OpenBSD VM on my main desktop to use as my main dev system, with NFS from the main server as its main storage. Except it was kind of slow, so that project has been scrapped and is now being replaced with Debian. Or maybe I’ll try FreeBSD later.

That’s not the point of this post though. The problem is that soon after I booted the new VMs, my network started acting up. It randomly disconnects every once in a while. It disconnects so often it’s not even funny.

Here’s the error message in Windows Event Log:

The network interface “Realtek PCIe GbE Family Controller” has begun a reset. Network connectivity will be temporarily interrupted during the hardware reset. Reason: The network driver detected that its hardware has stopped responding to commands. This network interface has been reset 57 times since it was last initialized.

I tried various drivers and none of them seemed to help. It’s mostly fine if I’m only running a single Windows VM.

The only other ethernet card I have is yet another Realtek, but I figured I might as well try it and see how it goes.

And so far it’s going pretty well. No disconnects yet. Hopefully this will continue to work until my 10GbE cards arrive. And then it’ll be FUN TIME 🌈

I’m getting them from a US seller through ebay though, so it’ll take a while until they arrive.

Realtek GbE performance

I tried the RTL8111C – the one that ran at half speed under FreeBSD – in Windows.

Using Windows’ built-in driver (on Windows 8) resulted in similar performance (900 Mbps send, 600 Mbps receive), but installing the driver from Realtek’s website resulted in much better performance: 900 Mbps both ways.

I got a similar result under Windows 10 on a different system with an onboard Realtek chip. The lesson, it seems, is to make sure to install the manufacturer’s driver.

Also interesting: the driver name says TP-Link gigabit card instead of Realtek. The card is indeed a TP-Link card, but I don’t often see a brand-specific name come up from a generic chip driver.

Now I wonder how it performs under other operating systems. Maybe I’ll try it sometime later.

Even more SSDs

Earlier this week I noticed the two mirrored SSDs in my main server are close to dying.

One of them has started reallocating sectors even though it only has a bit over 150 TB written. It’s specced for a 150 TBW warranty though, so I guess it’s just about time.

That said, the other one has over 220 TB written yet still has no problems. This one is rated for 204 TBW, so it has also already passed its specification.
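Comparing the two drives against their ratings is simple arithmetic; a quick sketch using the numbers above:

```python
def endurance_used_pct(tb_written: float, rated_tbw: float) -> float:
    """Percent of the drive's rated write endurance consumed."""
    return 100 * tb_written / rated_tbw

print(round(endurance_used_pct(150, 150)))  # → 100 (reallocating sectors)
print(round(endurance_used_pct(220, 204)))  # → 108 (somehow still fine)
```

So one drive died right on schedule while the other is coasting past its rating, which is about what you’d expect from endurance specs being conservative averages.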

The reason for those stupid write numbers is probably that I used them as ZFS cache drives. I think I killed two other drives in similar fashion last year. I stopped doing that a while back, but it was a bit too late.

I haven’t installed them yet, as the server doesn’t have NVMe slots and the PCIe-to-NVMe adapters I ordered have not arrived. Thanks, Amazon 😐

I don’t actually know if they will work at all. I don’t think the BIOS supports NVMe boot, and I’m not sure if the commonly suggested workaround of booting through Clover will work for FreeBSD. It theoretically should be fine, but who knows…

Upgrade Log 3

The last one for this batch! Everything arrived, got assembled, and came together without much problem.

Windows 10 is more annoying than ever. Disabling Cortana must now be done using Group Policy. Great. I’ll have to slowly live with it because this is the future of Windows, and I don’t see myself using another operating system for the desktop for the foreseeable future.

Also, don’t disable universal app background process if you want a functional start menu search.

<insert a bunch of other tweaks here>

Up next

The closest upgrade I can think of is getting an extra 6+To drive so I have a 6-drive raidz2 instead of the current 5-drive one, which is quite a waste. I’m not sure how to migrate the data though. That’ll cost about 25k?
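To see why 5 drives is wasteful, here’s the back-of-the-envelope capacity math. This ignores ZFS metadata and allocation padding overhead and treats each drive as a flat 6 TB, so it’s an approximation:

```python
def raidz2_usable_tb(n_drives: int, drive_tb: float) -> float:
    """Usable space in a raidz2 vdev: two drives' worth goes to parity."""
    return (n_drives - 2) * drive_tb

for n in (5, 6):
    usable = raidz2_usable_tb(n, 6)
    print(f"{n} drives: {usable} TB usable ({usable / (n * 6):.0%})")
```

Going from 5 to 6 drives moves the usable fraction from 60% to about 67%, and the extra drive contributes its full capacity as usable space since the parity cost stays fixed at two drives.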

And I remembered that my netbook only has 2Gio of RAM. It can surely be upgraded to 8Gio for maximum lulz. Or just to be more useful. I remember it was much more usable when it ran on 4Gio of RAM; I don’t exactly remember when and why it’s down to 2 now. It already has an SSD, so the RAM upgrade would pretty much max out this system. I’m not counting a higher capacity/performance SSD because I don’t think it would make much difference apart from having more storage – a faster SSD won’t help the slow CPU much. 5k for the RAM.

After that, I could certainly use more storage for my main desktop. A 1To SSD would be nice. A bit expensive at 33k.

With storage out of the way (and the 525Go drive moved to the office desktop), I think my office server could also use a storage upgrade. Just like the current home server, it could certainly use two more drives for an optimal raidz2. That means a controller, an HDD cage, and one extra HDD (because I already have one spare 3To HDD). The total would be about 26k.

There’s also a VGA card upgrade for the main desktop, but I’m still not sure about that. I don’t really need it, but it certainly would be nice! Let’s pretend it’ll cost 40k for whatever card fits that budget whenever the upgrade happens.

Talking about VGA cards, there’s also a would-be-nice upgrade for my office desktop’s VGA. It’s currently running a GT730, which is not exactly fast. With the slot limited to 45W, the current choice is limited to a GT1030 at 10k.

At this point there isn’t much left to upgrade. So let’s upgrade the server RAM from the currently pitiful 12Gio to 32Gio. I would like to pretend it’s cheap, but it really isn’t, even now. I got pretty lucky last time, getting two sticks of 8Gio for just 10k, but that won’t happen often. So about 25k is what I’d be willing to spend.

I think there’s nothing more after this. I probably won’t get this far until at least next year or even later anyway, and something may break in the meantime, requiring a change of plan.

  1. (5k) RAM: 8Gio PC3-12800S
  2. (25k+) Storage: 6+To HDD
  3. (33k) Storage: 1To SSD
  4. Storage:
    • (4k) Controller: LSI SAS 9212
    • (7k) Misc: HDD cage, 2× 5.25″ to 3× 3.5″
    • (15k?) Storage: 3+To HDD
  5. (40k) VGA card: ???
  6. (10k) VGA card: GT1030 (or better)
  7. (25k) RAM: 32Gio PC3-12800E

Total: 164k.
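The total is easy to sanity-check by summing the list:

```python
# Prices (in k) from the upgrade list above.
items = {
    "8Gio PC3-12800S RAM": 5,
    "6+To HDD": 25,
    "1To SSD": 33,
    "LSI SAS 9212 controller": 4,
    "HDD cage": 7,
    "3+To HDD": 15,
    "main desktop VGA card": 40,
    "GT1030": 10,
    "32Gio PC3-12800E RAM": 25,
}
print(sum(items.values()))  # → 164
```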

…maybe this will happen sooner than expected ( ゚◡゚)

Upgrade Log 2

The “new” “server” has arrived. So have the SATA/SAS controller and HDD backplane.

Unfortunately the 5.25″ bay separator is a bit too big so I had to “fix” it.

The cage works, complete with hot swap.

So does the SAS card. It flashed to P20 IT mode without much problem. Someone mentioned it might fail on a UEFI motherboard booted into DOS in BIOS mode, but I didn’t encounter such a problem.

Updated the system BIOS as well.

The processor installed without much problem – finally another server with an Ivy Bridge processor. The SAS card seems to be a bit problematic when system boot support is enabled; I just disabled it and everything seems fine. The ethernet card also went in without problem. The SSD was thankfully detected without a hitch, and the OS from the previous server boots fine.

There’s still quite a lot of restructuring needed thanks to the two servers being merged, but there’s nothing else to do on the hardware side (unless I decide to buy an extra drive to round the data pool up to a 6-drive raidz2).

The office server has also finished its rearrangement and now has more threads but much less memory.

The office desktop is currently gimped a bit with just an E3-1225, but that will be fixed once the DDR4 memory arrives. And then its graphics card will get a downgrade from a GTX660 to a GT730.

Now waiting for the memory. I hope it arrives this month so I can say goodbye to this memory-starved system as soon as possible.

That reminds me, I should put up old stuff for auction…

HP Z210 SFF review

I got this last year for a total of 6026 yen including shipping. It didn’t come with a CPU though. And I don’t remember it coming with RAM either. Thankfully everything works.

First, the motherboard. I don’t remember it having standard fan headers. At all. Well, it only has a total of one fan anyway. And that one doubles as the CPU fan, no less. Okay, I lied: there’s another one inside the PSU, which has a weird form factor.

At least they’re relatively silent. I have no complaints in the noise department.

CPU support is limited to the Sandy Bridge series. No v2 CPUs or the i3 3xxx series. Don’t bother trying. I did.

RAM support is okay. It only supports up to PC3-10600. There’s ECC support though, and with 4 slots it maxes out at 32Gio.

GPU support is the worst thing about this PC. The PCIe slot being limited to low profile isn’t too bad nowadays with the proliferation of small cards needing no external power, a trend started by the GTX750. But the problem is that none of them will work – at least not specification-wise. The reason is that the motherboard only officially supports providing the PCIe slot with 45W at most. That’s 30W less than what those GPUs need. The fastest consumer-level GPU I can find is the GT730. It sucks and it’s noisy. The noise part can be mitigated by getting the MSI card and adjusting the fan speed curve accordingly, but I generally don’t like installing crapware on the already-crap Windows.

The absolute fastest one you can get for this is the Quadro K1200, by the way. At over US$300, I’m not sure it’s worth buying.

The USB ports also occasionally stop working on Windows 10. I vaguely remember them being fine on Windows 7, but some change in Windows 8 caused them to occasionally not work on boot. The only fix I have is to remote-desktop into the machine and reinstall the USB driver. Installing an additional USB card didn’t help either, from what I can remember.

Oh, and SmartOS doesn’t support the SATA controller because it’s in RAID mode. Switching it to IDE emulation works, but no one should do that. I’d just buy a PCIe SATA card instead if I really wanted to use SmartOS on it. Which I haven’t yet.

It’s got two SATA3 ports and two SATA2 ports. Note that it only has one external 5.25″ bay, one external 3.5″ bay, and one internal 3.5″ bay. I used a 5.25″ HDD converter to get one extra bay and dropped in two SSDs to max out the storage (2 HDD, 2 SSD). Just remember that one SATA power splitter cable is needed.

Mine didn’t come with the screws needed for mounting an HDD on the back side, but there are plenty of them going cheap on eBay. They’re of doubtful quality, but as they’re just screws, I don’t really care that much. Just search for “hp sff hdd screws.”

Conclusion

In terms of purpose, it goes like this:

  • best as personal non-storage server
  • don’t bother for desktop if using large monitors (1080p+?)
  • with additional SATA card, can also be used as SmartOS box

And yeah, it’s where some of my sites are currently hosted.