0x15MMXXIII build

It's been about two years now since I've done a full-fledged build.
I used to do new builds every year. With 2021 being the most expensive build I'd ever made,
I decided I'd ride it out for about two years.

Here we are. It's 2022, almost 2023 now, the AM5 socket just released.

Let's reel it in for a second and talk about my prior builds. All of my builds are for
hardcore number crunching. Yeah, they can game, they can edit photos, do all the fancy shit
that most people would spend 2x my budget to do, but the Sweets-tops are for absolute throughput of numbers.

And since it's about that time, I wanted to talk about the parts I'm choosing, and why I'm choosing them.

First and foremost, I'm loyal to the Mini ITX form factor. Coincidentally the newest FormD T1 just dropped,
so of course I'll be using the T1. Inside will be the Asus ROG Strix B650E-i. What a mouthful now. The motherboard itself
hasn't dropped yet, so pricing isn't known, but it has been announced at the very least.

A lot of people may question the B650 chipset choice, especially because X670 exists, but while this PC
is built to think, it still needs to be small. Since it's mini-ITX, I don't need full PCIe 5.0 on every slot.
Only the one x16 slot and one NVMe slot need full 5.0 speeds. If I went for an X670 ITX board, I'd just be throwing money away for little return.
Though you'll see soon that it's weird for me to be upgrading in the first place.

Next, the mitochondria of the build is of course going to be the Ryzen 9 7950x. 16 cores, 32 threads, but this time clocking in
at 4.5GHz base, 5.7GHz boost. Compare that to my original 3950x, clocking at 3.5 and 4.7, or even my 5950x at 3.4 and 4.9.
That's quick mafs. Fast as hell, not to mention I'll be overclocking it as I do.
Unlike my last official build though, only my CPU will be watercooled. She'll be brought back down to Earth
by the EK-AIO Basic 240, with the AIO's fans replaced with Noctuas. Full custom loops are fun and all,
and by now I've definitely had my fair share, but they're expensive as fuck for not that much performance increase
when you're already running crazy specs to begin with.

My current memory kit of choice for this new build is the Kingston Fury 2x16 DDR5 6000MHz kit.
I really wanted to go for T-Force XTREEMs again; the kit is incredibly attractive to the eye, and
on my current build I paired the 4500MHz kit with my Ryzen by pulling the frequency all the way down
to 3600MHz and using the freed-up voltage to tighten timings as far down as they would go.
Yeah, Ryzen loves fast memory, but people often mistake that to mean just the clock speed. Ladies and gentlemen, latency
is what you should be looking at. And yeah, the Furys aren't the lowest latency, and I may end up switching them out for a
higher-clocked kit later to tighten timings, but one of my biggest rules nowadays is minimal to no RGB.
Again, I might play games, but I'm not a gamer really. RGB has no appeal to me, and I actually want the exact opposite
that all the lights provide. My main PC lives in my room with me, and I've got sleep to catch.
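
Since I'm preaching latency anyways: first-word latency is just CAS latency divided by the real clock, which works out to 2000 × CL / (MT/s). Here's a quick sketch — the CL numbers below are typical bins for illustration, not my exact timings:

```python
# First-word latency in nanoseconds. DDR transfers twice per clock, so the
# real clock is half the MT/s rating, giving ns = 2000 * CL / (MT/s).
def cas_ns(cl: int, mts: int) -> float:
    return 2000 * cl / mts

# Illustrative bins: a "slower" kit with tight timings can beat a
# faster kit with loose ones on actual latency.
print(round(cas_ns(14, 3600), 2))  # DDR4-3600 CL14 -> 7.78 ns
print(round(cas_ns(40, 6000), 2))  # DDR5-6000 CL40 -> 13.33 ns
```

That's why I pulled my 4500MHz kit down to 3600 and spent the headroom on timings.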

For my storage, I've looked at the newest Rockets from Sabrent, but I think I'll just stick with my current Rocket.
My build right now has a 4TB Rocket Q4 4.0 in it. It gets amazing speeds, plus all my data is already on there.
I used to go crazy with trying to get the highest storage capacity I could. I'm known to fill up a terabyte or two.
Since jumping to 4TB though, I've not even come close. Not to mention, nowadays I offload most of the files I don't
use or need every day onto my NAS, which now houses the 8TB HDD I've had for a long while. So really, I'd just be throwing money
away if I got a new storage drive. Even though I'm also known to do that. And no, it's not PCIe 5.0, but I highly doubt I'll
ever come close to saturating the 5.0 bus. I never did with 4.0. Again though, this isn't going to be the weirdest choice you see
today.

Deliberately skipping the GPU for now, saving the weirdest for last, of course I'm going back to Corsair's SF line-up to power
this temper-tantrum thrower of a desktop. This time the SF750. I've currently got a 650, but with the new 7950x paired with
the GPU I've got planned, this desktop just drinks power. Absolutely 7-Eleven big gulps it down.

Finally, for the GPU. I did mention it in a previous post, but my choice for graphics card is none other than
a Titan RTX. The astute among you are ripping your hair out at the fact that I'm using a PCIe 3.0 card, with
a PCIe 4.0 NVMe, in a PCIe 5.0 motherboard.
And to that I say get fucked. It's my computer.
Above all else though, compared to current offerings from NVIDIA, the 3090s, 4090s, the Titan RTX
is still keeping pace with those behemoth cards. On top of that, on the second-hand market, you can get them for
$1.5k. Here's the biggest thing though. The Titan RTX is a 2-slot card. The 3090, and even 4090? Get fucked.

It's not necessarily impossible to fit them into a FormD. But I will die by 2-slots. And again, I'll remind you that
this is a build for compiling Gentoo yesterday, and for AI to determine the likelihood of me stubbing my toe in the
next 3 hours, not for gaming.

So that's all she wrote. That's the 2023 build. Expensive as fuck, coming in at near $6,000 for just the parts on the
PC alone. 2021's build was about $4,000 all-in on parts.

But this PC will be used again for as long as feasible. And honestly, that'll probably be a long fucking time.
AM5 was just released. AM4 lived from 2016 to 2022.
And hell, I'll probably ride this out past AM5+ or AM6, or whatever comes next. I myself am starting to hit a ceiling
in processing power. And my poor 2070 Super has lived a good life.
With my recent eBay obsession, I can probably part the current PC out and sell the parts individually. Previously
I'd just sold my old builds entirely as-is.

Alright, I'll shut the fuck up now.


0x14irresponsible with money

My employer really needs to stop paying me. Just this week alone, I've bought a new graphics card,
a new DAC/AMP stack, a new motherboard, a new NAS... Not to mention I just
finished my network rack build for full gigabit speeds, and on top of that, I'm now eyeballing a Titan RTX for my main computer.

To be fair, I can kind of justify most of those purchases. The Vega 64 and the B450-F motherboard I bought are for a project
I'm doing to build my friend a new computer. Long story short, he went to Australia for a year, and before he left he
sold his old computer. Since coming back he's done nothing but talk about how much he missed gaming, his old computer,
and how his biggest regret was selling it. I used to build new PCs every year, and would you look at the time, it's
new build time. Only I'm not actually making this build for myself. He thinks I am, partly because he knows me,
and partly because that's what I told him when he came over and saw all the parts I've got lying around. It'll be a great surprise for him. I hope.
Don't tell him, guys.

That aside though, the NAS was the final major part for my network rack. I had an 8TB hard drive just collecting dust for years,
and I've needed a place to put my files anyways, so bam. It's a Synology DiskStation 118. Not the greatest, but it serves
its purpose well.

I can't think of any bullshit reason for the DAC/AMP or the Titan RTX. Those are just ooh-shiny purchases. I haven't yet
pulled the trigger on the Titan card, but that's because it's minimum a $1.5k purchase if I go used, almost $3k if I go new.

I can't help myself sometimes. In other news, my finished network rack gives me the gigabit speeds I've been paying for.
It's not the fastest rack you can build for networking, but for my purposes, it more than exceeds expectations.

It was actually an extremely fun build. The router is pfSense and Pi-hole installations virtualized in Proxmox.
I've got a TP-Link SG1024DE network switch, fully managed, which is great. I've got 3 main VLANs set up.
The first is for hardwired devices that explicitly have access to the entire network; right now that's only my PC,
but potentially more if ever necessary. The second is for IoT devices (my smart television, my game consoles), which get no access
to the internal network at all; only outgoing traffic is permitted. The third is for network-attached devices (so far just
my printer and my NAS), which are only allowed access to the internal network; nothing from the outside can reach them,
and they can't reach out.

The router was an interesting little build. It uses a Gigabyte GA-Q87TN motherboard, an i5-4460, and a fancy little Intel/Dell 0h092p card.
She pushes out anywhere from 950 Mbps to 1.1 Gbps depending on my network activity. All within a 1u enclosure.

My next major upgrades to the rack are Noctua 40mm fans, because man are the stock 1U fans loud as shit. Other than that though, she's my baby.

That's really it for all of my recent projects. I could go down a whole rabbit hole for my network, it's pretty layered.
I could also go down a hardware rabbit hole, and talk about my current eBay obsessions, but that's for 8 months later when I decide
to update my blog again.

Same time next year sound good?


0x13the adventures of water cooling

Look, I'm no stranger to water cooling. Most PCs I've built in the last 5 years have had water cooling.
Now granted, some of those were AIOs. The others weren't though.

This time, however, I'm missing one vital part of the process. A reservoir.
There is a reason for that. As previously mentioned, my main PC is inside of a lovely
FormD T1. A sub-10 liter case with some customizability, and if you're creative enough,
even more than advertised.

Anyways, filling a loop without a reservoir is hard. Well, more specifically,
filling a loop without a reservoir, with a super weird loop setup on top, is hard.

My GPU is mounted upside down in my case. That's by design, and by my choice for where I wanted my
radiator. Top mounted, as God himself intended.

Anyways, this means that bubbles like to congregate in my GPU block. And it pisses me off.
After about an hour of filling and turning my computer every which way, I settled on "That's good enough."
An hour later: bubbles, again. Really large ones, at that. If it were smaller ones, whatever.

Large bubbles are not optimal though. So rinse and repeat.

One of these unfortunate times though, among my brigade of paper towels to soak up any rogue clear XL8,
a single drop got through. It left its evidence on the side of my CPU block. And it left its evidence in my
fucking motherboard, as my beautiful B550-I would no longer turn on.

I didn't even need to troubleshoot, I knew what had happened immediately. I did, for safety, test the PSU.
All good.

So a quick run to Microcenter later, and I've got a new X570-I. What I didn't know was that same rogue
drop reaped my CPU with it. Well, not entirely. It stole some of its functionality.

My aggressively overclocked 3950x can no longer clock memory. At all. This problem did take a lot of troubleshooting.

I thought I had a defective X570-I at first. Then I tested my memory, multiple kits, but then it dawned on me.

I took my old 2700x, swapped it in, and in combination with my T-Force Xtreem kit... would you look at that,
it clocks memory perfectly fine.

Great. Now I'm down a motherboard AND a CPU. And one of them was vastly more expensive than the other, unfortunately.
Another quick trip to the local Microcenter 45 miles away, and now I'm the owner of a fully functional 5950x.

Man, I wish my towel brigade hadn't failed me. In 2 days I spent over $1k.

At least my GPU block doesn't have any large bubbles in it. For now.


0x12xbox controller connection issues with xbox adapter

It's been a minute since I've made a tech related post, so it's about that time.
A long while ago I bought my Xbox Elite Series 2 controller. Love it. One of my better purchases in life.
Unfortunately though, it was constantly disconnecting, and I had no idea why or how.

Let's get the most prominent solutions you'll find on the internet out of the way:
Check your drivers, disable Steam's dumbass big picture mode controller related settings,
check your power management settings.

After all was said and done, plus some, I was still having issues.

It took a lot of searching online, some self testing, etc., but I eventually found the problem to be
my graphics card. Now, I know that seems out there, but let's evaluate my specific situation a little bit.

I use a small form factor case, sub-10L in fact. My GPU is directly against my motherboard.
After some testing, and only one mention of it online that I can no longer find, it turns out that
if the USB adapter gets just barely over spicy in temperature, you'll suffer from the disconnecting issue.

There are several routes you can take to alleviate this problem.
I chose the more expensive option, using this as an excuse to shell out $650 in new water cooling parts.
With an overclocked 3950x and 2070 Super, I'm generating a metric ass ton of heat.

The other solutions include not using small form factor, or using a case that isn't sandwich style.
But the simplest and cheapest option: just buy a USB 3.0 hub that lets you move your USB connections
away from the source of the heat (a decent length, not one so short that the cables conduct enough
heat to still affect the adapter).

I say all this in hopes that if anyone ever has a problem like I did, you'll find this blog entry.
It wasn't necessarily an easy thing to diagnose, but it made my gaming life better by several magnitudes
once all was said and done. And thankfully, it didn't make me feel like I wasted $200 on my controller.


0x11why I've been gone

I said in the last blog post you wouldn't see me in a year. Well, surprise, it's 2022, and I haven't seen y'all since last year.

I can feel you cringing from that joke.

I've been taking a break from programming lately. It's kind of been depressing me quite a bit.
I've mentioned this here before, but every now and then I get burnt out on programming, and when I'm like that I get pretty depressed.
Sometimes it just feels like I have an obligation to my GitHub, which just isn't the case. So every now and then I force myself off.

I'm still not quite back on the grind yet, but this entry is more of an "I'm alive, stop messaging me on Discord you daft bastards" entry.
I mean, I don't particularly mind messaging over Discord, but I do know that some people read this blog. It escapes me as to why.
Anyways, I'm alive.

I did say I would disappear for quite a bit to play some new games, and I absolutely have.
This week and all last week when I wasn't at work, I put my sigma grindset dedication into Pokemon Brilliant Diamond.
How much of that grindset, you ask? Well, let me put it this way. There are 488 obtainable pokemon in the game, and of those 488,
I've caught and boxed a whopping 476. I'm completing my national dex this weekend.
Yeah, I had to catch them all.

I've also, like I've said I would, been playing the absolute fuck out of Forza Horizon 5.
You know, some people don't like racing games, and that's fine. One of my friends didn't understand--
he asked me why I would want to play a game where I drive a bunch of cars I can't in real life. And that's the point.
I will say, it's probably good that it isn't real life. My poor Porsche 911 GT3 in FH5 would have easily been turned inside out by now.

Other major life updates I suppose, I did recently catch the Kung Flu-19. That's right ladies and gentlemen, I had the cheese touch.
I'm fine now, did my little quarantine shindig, got better, what have you. I knew I had it when I woke up at 3:30 in the morning one day
with a pounding headache. Unfortunate because the very day prior I was drinking 805s with friends as we were all gathered
around my television watching anime. Luckily of the five of us goons that were at my place, only one other got covid.

Also, another lovely development: I went over to pslate customs and bought $230 worth of custom cables for my computer.
Your immediate reaction to this is probably "What the fuck?" or "What a waste of money."
To those I would respond that pslate customs makes extremely high quality sleeved cables, AND they're fitted to your case (which is perfect if you use small form factor),
and fuck you, it's my money and this is capitalism at its finest, and if you don't like me spending the money I earn, then you can give me a minimum of 50k a year and tell me how to spend it.

Eventually I'm doing a full cooling upgrade on my computer. I say "eventually I'm doing" and not "I'm planning to" because I already have everything I need.
Got my radiator, my tubing, got my new CPU block, hell, even got my GPU block from EK. I just haven't... done it yet. I'll get to it. It's not that I don't have time, I'm just playing Pokemon still.

Speaking of, the new Legends Arceus game is coming out, and I've been hawkeying the shit out of the leaks. It looks amazing. There's some new pokemon forms I'm not entirely kosher with, but
I'll just get the fuck over it. I already pre-ordered the game anyways. Funny because in my whole lifetime I've pre-ordered only 3 games. Call of Duty Black Ops 2, and Pokemon BDSP and Legends Arceus.
Maybe I'm a fucking nerd.

Anyways, you've been on this blog entry long enough, it's time you started paying rent. That or move onto the next entry. Smell you later, nerd.
And yes, that is a Pokemon reference.


0x10you won't see me for a year

Just as a quick reminder if you've been following my blog,
this is a personal blog. I've been writing a lot about programming
lately, but it's time to switch gears a little bit.

I'm going to be a huge fucking nerd here, but Q4 2021 and then all of 2022 looks
amazing for video games.

Right now I've got a huge list of games that I am going to buy and play.
Not in any particular order, Pokemon Legends Arceus, Pokemon Brilliant Diamond/Shining Pearl,
Dead Space (that's right ladies, Dead Space is finally getting a mother fucking remake), Forza Horizon 5,
Callisto Protocol, Elden Ring, Splatoon 3, Breath of the Wild 2 electric boogaloo, and A Plague Tale: The Squeaquel.

As far as announced games, that's what I've got on my list so far. I'm a huge fan of the Pokemon series,
generation 4 in particular. Dead Space is my favorite video game series of all time.
I'm also a huge Forza Horizon fan. The other games that are continuations are games I've thoroughly enjoyed--
my reasons go on and on. But it's been years since I've been this excited about video games.
In recent years gaming has been largely disappointing.

Yeah, sure, some games I've hawkeyed until release, like Nier Replicant and Luigi's Mansion 3.
But nothing has really grabbed me, if you know what I mean.

I just bought a Nintendo Switch, again. Solely for Pokemon Legends. Yeah, BDSP is enticing, but it's not my main
attraction. I still have Platinum on an actual DS (currently shiny hunting Cresselia, by the way. Have been since December 2020).
I sold my old Switch because I barely played it. Now look at me.

TL;DR you're going to see milk cartons with my face on them captioned with "Missing Person."
Gaming is absolutely going to be a hobby I pick back up. Soon, too, since the Pokemon Diamond and Pearl remakes are
a whopping 13 days away now.

Also, it's yet another year that's supposedly the year of Sonic the Hedgehog. It never really is, but a new Sonic
game is coming out. If it has a Chao garden I'll pick it up. If not, I won't. That's basically
how I've been with the blue blur games ever since Unleashed. With the exception of Mania.
The Chao garden in the Adventure series I still play to this day, and the newer Sonic games have not
been good enough for me to even think about picking up, unfortunately.


0x0fsometimes being unprepared prepares you the best

I've been programming for 10 years now. In fact, I just hit the 10 year mark.
In my experience, I've found several means of planning.

Sometimes I like to board things, sometimes I modularize functionality of something,
hell, even comments to my future self helps.

One thing I've come to realize though is sometimes the best way to prepare yourself to write
something is to not prepare at all.

Hear me out.
My most recent project is mxfw. The program is entirely just a bridge between a user and macOS' window server,
and users take advantage of the API that it provides.

Here's an example of responding to window events for only one specific window type.

mx.window { bundle = 'com.apple.Terminal' }
	:on("created", function(window)
		print("Hello, World!")
	end)
	:on("closed", function(window)
		print("Goodbye, World.")
	end)


Now, I spent quite a few days trying to figure out how I would implement that sort of API.
It was actually quite a difficult thing to do because it provides several challenges.
Firstly, the function mx.window(). What should it return?
The obvious answer is some sort of event handler, or something that contains :on().
And yes, that's an obvious one, but look at the actual function call for :on().
It takes two parameters, the event string, and a callback.
Mind you, this has to be implemented in the Objective C++ part of the program.

More than that, what if no rule is present?
And since it's in Objective C++, where the Lua API it uses can't explicitly carry transient data
that originated from the parent call, then what?

In addition to this, how do I store these rules? How do I map them, such that the event emitter
in mxfw knows to only emit an event for this window if the rule applies?

These are all very good questions. And ultimately, the solution I ended up implementing
was far simpler and more elegant than what I had boarded out prior.

The solution I came up with was that mx.window() was actually just a method that would
create rules and store them in a map. This map mapped window rules to a vector of structures.
Said structures contained two children: an event and a callback. This was, in essence, a map, without the
super large overhead of C++'s STL.

When an event for a window is emitted, it checks to see if any rules apply. If they do, then
it looks through the vector of event-callback mappings and looks for the event that was emitted.
If one is found, then it calls the mapped callback. If not, who gives a shit. Discard. Move on.
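
If that reads dense, here's the shape of it sketched in Python rather than Objective C++ — the names and structure here are mine for illustration, not mxfw's actual internals:

```python
# Sketch of rule -> (event, callback) dispatch. A rule key (here, a bundle
# id) maps to a list of (event, callback) pairs, like the vector of structs.
rules = {}

def on(bundle, event, callback):
    """Register a callback for `event` on windows matching `bundle`."""
    rules.setdefault(bundle, []).append((event, callback))

def emit(bundle, event, window):
    """Fire every callback whose rule and event both match; else discard."""
    for ev, cb in rules.get(bundle, []):
        if ev == event:
            cb(window)

on("com.apple.Terminal", "created", lambda w: print(f"Hello, {w}!"))
emit("com.apple.Terminal", "created", "Terminal")  # rule + event match
emit("com.apple.Finder", "created", "Finder")      # no rule: discarded
```

Same idea, minus the STL overhead the real thing was avoiding.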

See, the problem wasn't that I wasn't in the right mindset. The problem was that
I was overanalyzing the fuck out of the problem. I was too wrapped around thinking that
the mx.window() function should return some sort of grand object that contained all window data,
or that I needed to have a bitwise AND of all the events that a window responds to and then check a large
lookup table and what have you.

Moral of the story: the bigger picture was there. And I was thinking too large, too complex.
Even breaking the problem down didn't help me at all.

And maybe, just maybe, you're going through something similar to me in concept.
Maybe there's a problem that you just can't figure out cognitively.

I'm here to tell you that you don't need to.
Just start writing. The implementation will fall into place.
And if you don't like it, that's fine. At the end of the day, you can always refactor.


0x0ei am bad at naming things.

I am bad at naming things.
I am.
It's just a fact of life.
The speed of light is 299,792,458 m/s, the sun will inevitably die, Earth will soon crumble due to human pollution,
California has shitty gas prices, and I... am bad at naming things.

To cut to the chase, I've had several projects I've had to rename in the past.
Some more than once. But my most recent project, mxfw, has been renamed five times.
It started out as Opal, then Moonstone, Carbon, Sodium--eventually I just quit trying.

mxfw as it is referred to right now, is a framework window manager for macOS. If you live in *nix land or use
X11, think awesome wm but for macOS.
Five times. That's ridiculous. It started out as Opal simply because that was the first thing I thought of.
It moved to Moonstone because it uses Lua as its scripting language. I thought it made sense.
Eventually I wasn't a fan of that name because it seemed long and to some degree unmemorable.
Then came Carbon. Users complained about that name because Apple already has a library they call Carbon.
For some that was too confusing. Then came Sodium. Turns out, there's a Minecraft mod named Sodium.
The elements theme was because it was supposed to be the building block for a window manager.

In the past I've named projects after a quality they have, and if not that, then the first thing that came to mind.
hummingbird was named because hummingbirds are fast, and its
entire design philosophy is to be a fast init for Linux.
tiramisu was named because I was eating tiramisu at the time I created it.

I'm a simple man.
But alas, a simple man cannot name things. Not that you would expect me to be able to--I'm named after a color.


0x0dtranspiling is the answer to your problem

It's [insert year here].
You're a brand new developer. A little baby man.
You're special. You're unique. You bring to the table something that literally nobody else does.
You're trying to make your own programming language.

Look, I'm not going to be that guy, but I'm going to be that guy. You're not doing anything new by doing this.
"But it will have X! And Y!" Yeah, Z has X and Y, and it does it fifteen times more efficiently than you do, to the point where your
code takes at least 3 business days to execute. News flash, buddy, it's a Friday.
Meanwhile, Z is sending you your executable same day shipping, and it's gift-wrapped.

I'm not trying to say that you're not a special snowflake, but at least half of the population has done what you're trying before.
Sometimes it doesn't make sense to reinvent the wheel.

Sometimes, instead of compiling directly to unoptimized machine code because you don't know any better,
you may just want to transpile. The wheel has already been made for you, so just make the damn axles already.

Now what is transpiling? Well, put simply, instead of going from your code to machine code, you go from your code
to the code of another language, then to machine code.
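
As a toy illustration of the idea — the node shapes and names here are invented for this example, not from any real project — "transpiling" a tiny expression tree to C is only a few lines:

```python
# Toy "transpiler": lowers a tiny nested-tuple expression AST to C source.
def emit(node):
    """Recursively turn a nested-tuple AST into a C expression string."""
    if isinstance(node, int):
        return str(node)
    op, lhs, rhs = node
    return f"({emit(lhs)} {op} {emit(rhs)})"

def transpile(name, ast):
    """Wrap the expression in a C function definition."""
    return f"int {name}(void) {{ return {emit(ast)}; }}"

# (1 + 2) * 3 becomes C source a real C compiler then optimizes for you.
print(transpile("answer", ("*", ("+", 1, 2), 3)))
# -> int answer(void) { return ((1 + 2) * 3); }
```

The point being: the hard parts (optimization, codegen, portability) are now the C compiler's problem, not yours.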

There are benefits to this, there are also drawbacks.

First and foremost, your compilation will take much longer. That's just the nature of the beast.
Parallelization is your best friend here.
Next, the code is for the most part optimized already. Someone else already did the heavy lifting. Including carrying you and millions of others
on their backs.
The code immediately becomes much more portable. Especially if you're transpiling to say, C. Everyone and their mother's dog has the ability to compile C programs.
And the last major thing I want to highlight here is that transpiling might be a little bit more difficult to implement.

To this, I hear you say, "But Sweets, I'm writing an interpreter. Not a compiler."
Don't worry, you're not doing that correctly either.
Interpreting a language is a much bigger beast than a lot of people take it to be. A lot of people will start writing one, and
just after they've written their lexical analyzer and tokenizer, they'll realize they are up to their knees in shit.
They're doing a handstand right now.

At this point, I feel it's important for me to say that the tone of this entry isn't to condescend upon you, the reader, specifically.
I've gone through these same woes in my time, and I've seen plenty of others go through the same ones.

So look, at the end of the day, maybe transpiling is the answer to your problem.
What I will say, is maybe it's not though. Maybe LLVM is better for you.
There's a lot of different solutions to making a programming language. My intention in this post is to say that maybe just writing a simple
interpreter or trying to compile to machine code directly might not be your best choice of action.


0x0cbuilding blocks

It's been a while, hasn't it? Not that any new user would know, I don't date these things.
I was pretty burnt out on programming for a long while there. Now I'm not so much.

Every now and then I become enamored by something new. Or in this case, new to me.
The current obsession I've got is with meson.

Now, I've never been one to write good Makefiles. Anyone who knows me knows I don't know the Make syntax.
Not that it's ever bothered me any--but Makefiles are one of those things where everyone does it differently and everyone else is wrong.

Conveniently though, there are... better build systems. I mean, Make is nice. But it's old.
Nothing wrong with being crusty and old, but really what this means is newer things are out. And meson is one of them.
I've been trying to learn meson lately. Days of old have passed; no longer will pull requests be made to my GitHub regarding Makefiles.
Only pull requests about the code. And about my incorrectly made meson files.
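
For the unfamiliar, a minimal meson.build is about this much — the project name and source path here are placeholders, not one of my actual projects:

```meson
# Hypothetical minimal project: one C executable, warnings turned up.
project('example', 'c', default_options: ['warning_level=3'])
executable('example', 'src/main.c', install: true)
```

Compare that to the average hand-rolled Makefile and you can see the appeal.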

Anyways, meson seems to be the new in and hip thing. And I've dedicated myself to trying to learn it. One roadblock I've found is that it's very...
lean on language support. Which for me may be a weird hurdle, considering I write projects in languages like Objective C...++.

Yes, that's a very specific language to be writing in. It's not without reason, I needed the Objective C layer over just regular C because it's much nicer to use
with a macOS environment, and on top of that, C++'s object oriented nature helps that particular project I use Objective C++ for.
Oh, and let's not forget Vala. I've recently rewritten Tiramisu in Vala. I enjoy the project much more now, but it's a niche language in the grand scheme of things.

Look, the programming language gene pool is vast, and C is the Genghis Khan of the pool. Python is the lifeguard, Java is drowning, and Rust and Go are... building sand castles, and kicking each other's over every five minutes.
And Meson supports only 4 of these beach-goers. I don't know where this analogy went, but I somehow brought it back.

I guess this was a long winded way of me saying--I'm going to figure out a way to build my projects with Meson.
It's cleaner syntactically, faster (by their own metrics at least), and for the love of a God that I don't believe in, people won't have to PR as much for my projects.
Until someone makes another standard. Maybe that'll be my next blog entry. Bullshit standards made by bullshit organizations.

Anyways, something something automated build systems something something.


0x0bnothing but passion

As many have observed before, it sometimes takes me months to hammer out specific issues or bugs on projects.
Anyone who knows me knows that I don't program for any monetary reasons. I mean, yes, I do have a sponsor button on GitHub,
but I'm completely open to saying that I proudly have 0 sponsors. So realistically, this is all just a hobby for me.

That being said, when I work on programs, I do so purely out of passion. I don't work on things I don't like or don't want to work on,
and moreover, if I have to force myself to work on something, I dread it even more.

I've said it before, but for as much as I love programming, I fucking hate it. This stems from working as a developer in the past;
I lost my love for it for a long time, got really burnt out on it.

A very long period of time later, I started getting back into programming.
From time to time though, you may notice that I am very slow to fix issues.
In particular, something like #20 on tiramisu.
Opened October of 2020, closed (for good) July of 2021.

The sad reality of it is that I hate tiramisu now, though. I hate working on it, I hate its implementation as I think it could be better, I hate that it has the attention it does,
I just hate the whole project.
So of course it took three-fourths of an eternity for me to work on it. I have to force myself to work on it.
Not to mention that it's a pain in the ass for me to work on it, since I don't have any actual machine running linux, and WSL and dbus don't exactly agree.
So realistically, it's a pain in the dick hole to even begin work on, and then it's a more metaphorical pain in the ass crevice because I don't even want to work on it.

Other projects, like hummingbird and Moonstone, however, I still have a burning passion for.
I will say, just because there's been no recent git history on one of my projects doesn't mean I'm sick of working on it and quitting on it. It could just mean it's actually done. Or, equally likely, that I just haven't pushed
my local commits recently.

Whatever the case may be though, just know: all of my projects are products of passion.


0x0aa secret scam

Every so often in society, people, businesses, companies even, get caught scamming people.
There's some things in life that are just completely unreasonable to pay for.
Maintenance on your car (I'm a firm believer that everyone should do their own car work, but I'm also a car guy so my opinion is worthless),
scalped computer hardware, the cost of Pokemon games for the Game Boy and Nintendo DS, but worst of all, beef fucking jerky.

Hear me out. Everywhere from San Diego, CA to Dallas, TX, Jack Link's beef jerky is a fat $5 for ~3oz. They're not the only offenders, though.
Other brands are worse. More money for less meat.

Now the more astute among you may pin the blame for the high price on smoking the beef (or meat in general, in the case of non-beef jerky),
on the spices, on quality assurance, but none of that matters.
It's all bullshit, because to make beef jerky, all you do is spice it and smoke it.

Meanwhile, you can get much higher quality beef for just $10 per pound, and a pound is 16oz of good meat.
That works out to roughly $1.67/oz for Jack Link's versus roughly 63 cents an ounce for real beef; you're paying nearly 3x more per ounce than you would just buying the meat yourself.

Wake up, sheeple. Big beef doesn't want you to know this.
Jerk your own beef.


0x09keyboards

Recently I've been wanting to get into making a custom keyboard. I love building things, and one of the things I've never gotten into before is clickity click clack machines.
But I'm having a hard time justifying it.

Now whenever I get into something, I get into it. I spend hours researching different parts and components, and the benefits of some versus others;
no choice I ever make is without hours of learning. Which I quite like.

For some peculiar reason though, after days, if not weeks now, of looking into different PCBs, layouts, switches, and keycaps,
I haven't been able to justify making one.

At the end of the day, a keyboard is just a means of input for my computer. More specifically for my desktop, since I use my laptop as an actual laptop.
Alright, so what's the hold up? Well, keyboard builds come out to over $200 typically. I've got a keyboard that I paid $40 for.

They both accomplish the same thing, but one for a drastically different price.
"One has a much better typing experience!" You say, and to that I say that I don't actually feel the difference... at all.
The nerves to my fingers may not be firing correctly or something, because despite what many scream, I feel no difference at all.
I mean, sure, the individual key caps may be different, but the actuation feels no different to me. Which is ideally what I want to feel different.

"But the sound! The custom one will sound so much better!" You begin to cry out.
I don't hear a difference. The only time I hear any difference at all is when someone drowns the stabilizers in lubricant.

Alright, well maybe the layout is what I would benefit from.
Nope, not one fucking bit. I use all of the keys on my keyboard. All of them.
I use the numpad regularly, I use my home/end and page up/down keys when programming, which I do regularly,
I use my function keys as media controls... I use every key on the keyboard. Even system request. Try me.

Anyways, I see no value in building a custom keyboard. I wish I did.
I really want to build one. But I can't justify to myself spending the amount of money one costs to build,
when I have a perfectly good one.

On the other hand, there are more things I really want to get into with keyboards.
In particular, I'm very interested in stenography.

If you aren't aware of what stenography is, I wasn't either. In essence, stenography is the use of a special keyboard
that has keys for phonetic sounds. So instead of pressing individual letters to spell out a word,
you press a chord--the beginning sound, the vowel, and the ending sound all at once--and software translates the chord into a word.

That explanation was a little simplified, in actuality it's a little more complex than just that,
but it's a really interesting means of typing. The actual purpose of stenography is for stenographers,
someone who transcribes speech, typically in a court of law.

Maybe one day I'll build a keyboard. And maybe it'll be a steno keyboard.
Who knows.


0x08misunderstood

For whatever reason, macOS is very badly misunderstood among *nix users.
I'm not quite sure why that is, honestly. In the same sense that Linux is *nix, macOS is very much also *nix.

I've got the 2021 MacBook Pro, which has the new M1. I really wanted to play around with the new CPU,
but also branch out a bit more. I've developed for macOS before, not to say that I had amazing projects for it,
but I'm not new to it. What I am new to though is owning a real Mac. Previously I would just use a Hackintosh install
when I had a Ryzen 3 2200G and an RX550.
Lovely hardware combination for macOS, I'd say.

Anyways, the M1 is actually extremely powerful I've found. Many others have found this news before me,
but to my surprise, it almost competes with my 3950x in single core performance.
I'm not quite sure whether that's the raw horsepower of the CPU at work, or if it's the amazing optimization Apple
is known for, or if it's the ARM architecture grinding. Regardless, at its core, it's still running macOS.
Which is, in fact, a certified UNIX.

So why in a community of UNIX-like systems does it receive so much hate?
I think it's just misunderstood. Very badly at that.
For what it's worth, macOS has amazing system level APIs.
Like, incredibly good. Security is not an afterthought on macOS, and it shows. I'm not going to pretend
there haven't been vulnerabilities before, but unlike a lot of Linux distributions, everything is built with
intent--something Linux suffers a lot from the lack of.

In macOS, the developer's convenience comes at the expense of neither the end-user's usability nor security.
The root file system, for example, is not mounted read/write; it's read-only. If it ever is mounted read/write, it's with intent.
Something many Linux distributions should take notes on.

On top of this, it's just as customizable as Linux distributions are. You can change out as much or as little as you want.
The options are limited, but that's because the same crowd of neckbeards writing window managers and compositors
for Linux are not the same crowd using macOS. There's very little overlap.

Another thing is that much of the system is event driven. I know the programmers out there are starting to lose their minds.
If you've ever had to program something that requires you to test a condition, but you don't have any direct API for it,
you probably had to use an infinite loop, which burns CPU for nothing. Maybe you could poll instead, but what if there are no file descriptors to poll?

Realistically, I can see money as an argument against Macs. But as previously brought up, you can do a Hackintosh.
If you intend on doing one, you'll find your hardware choices are restricted, because drivers don't exist for everything.

If you can do it though, it's definitely a fun project to take up, and not a bad daily driver.
If you don't like it, you don't like it. People will find every reason in the world to dislike something, except for
their own personal tastes. For whatever reason, they just can't say "there's nothing wrong with it, I just don't like it."

To summarize, I see no reason why macOS receives the hate it does. It's just badly misunderstood.


0x07systemdo or systemdon't

Oftentimes in the *nix community, people will tell you things like "systemd is bad" and that you shouldn't use it.
Then they will go on a tirade about the reasons why systemd is bad.
Now, to any normal or informed user, some of these arguments against systemd may be considered valid.
But to any other user, in particular users new to *nix, these are horribly misleading reasons to use something other than systemd.
The latter crowd is who this blog entry is for.

Systemd calls itself a "software suite." So, as such, it provides many tools--few of which are actually simple--that are useful in the runtime of the operating system.
None of systemd's components are inherently bad.
The problem people have with systemd is that the whole suite is tightly coupled to an init that runs as pid 1.
The first process is, on many systems, the init system. It is the highest parent process in the hierarchy, as it is executed directly by the kernel.
Now in some applications, this is what you may want. You may want a reliable and battle tested process running as the init, especially on servers.
You may even want it on a desktop as well, for ease of use.
The biggest problem with being told what init to use and what not to use is that what works for others may not work for you.

A lot of people have asked me what init they should use. The simplest answer: use the one that works. If systemd works for your system, why replace it?
On top of that, some Linux distributions almost hard-depend on systemd. For example, Arch Linux, where the entire
system is built on the premise of using systemd. Installed packages ship the appropriate unit files for systemd. Replacing it, while it can be done,
is a tedious and quite annoying process.
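For a sense of what those unit files look like, here's a minimal, made-up service unit; the service name and path are hypothetical, purely for illustration:

```ini
# /etc/systemd/system/example.service -- hypothetical unit, for illustration
[Unit]
Description=An example daemon
After=network.target

[Service]
ExecStart=/usr/bin/example-daemon
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Nearly every packaged daemon ships something like this, which is exactly why ripping systemd out means replacing all of them.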

So all of this is to say: if you're the user who is new to *nix, and you don't know what init to use, don't change your init at all.
To quote a friend of mine, "Linux users are all about freedom of choice, right up until someone's choice differs from theirs."

And on the opposite side of the spectrum, if you're the redpilled, based user preaching to the newer Linux users:
Shut the fuck up. Let them learn the pros or cons of the software themselves. After all, for as much as I dislike systemd myself, it just works.


0x06a love letter to forums

Time goes on, technology is innovated upon, but when I was younger, I was enthralled by forums.
I don't know the particulars of it, but there's something oddly satisfying about creating a post on a board, and having to sit and refresh the page several times until you get a response.

That's something that nowadays technology has taken from us.
Why would people want to sit on a forum, create a post asking how to fix a problem or spark some sort of conversation when they can go on Discord and get a response damn near instantly?
Call it what you want, patience, delayed gratification--whatever the case may be, forums have a certain charm to them.
And while I appreciate a good quick response from time to time, or a quick catch-up conversation with old friends over instant messaging,
I have a much larger appreciation for simple forums.

In the forum community, it's hard to draw people in. It's hard to entice people to join in on something so primitive and outdated.
Maybe I just live in the past, but even today I still slay bits and bytes on good old bulletin boards.
Hell, if you look hard enough, you might even find forums that I'm still an active keyboard warrior on.


0x05a fictional story about timestamp gore

A long time ago, in a server far far away existed a web developer and a really interesting codebase. You are our savior in this story.
When you typically think of timestamps, chances are you think of a UNIX timestamp:
a 32-bit integer counting the number of seconds that have passed since January 1st, 1970.

Storing timestamps seems like a relatively painless endeavor, just allocate enough space for the timestamp, and you're off to the race(condition)s.
Except maybe it isn't the best implementation. I mean, on January 19th, 2038, 32-bit UNIX timestamps will overflow, and we'll need to either switch to 64 bits or create a new epoch.

It'll basically be Y2K all over again. 0x7fffffff will be our demise! Or will it?
Realistically speaking, likely not. What is more likely is that developers will start implementing fixes for this overflow before it actually happens, and we won't be ill-prepared.

But what about another solution? What if we stored timestamps as a string?
Well, this can be an implementation, and depending on how scalable you want your timestamps to be, maybe the best way to deal with this kind of issue.
So, as a little bit of a thought experiment, if we did this, how much space would we need to allocate for our string?

To calculate something like this, we'll need to make some assumptions. Typically, information on a website is stored in a database, and oftentimes it's SQL.
In a UTF-8 encoded database, a single character can take up to 4 bytes. If we use a varchar(32) to store a timestamp as a string, that's up to 128 bytes per value.
128 bytes doesn't necessarily sound like a lot, but that's for a single row. A single record.
If we, say... store user data, to include when a user registered... Well, now we have N * 128 bytes of storage being taken up, where N is the number of users, assuming we store no other information of course.

You see where I'm going with this, right? So just 8 users deep, and all of a sudden we've taken up a kilobyte of storage for just timestamps.
By the way, do you like gore? I sure hope so.
Imagine for a moment that instead of storing a timestamp as a string in only a single column, we split up each part into its own dedicated column.
Let's be conservative and say that we give each field... 6 characters to work with at most. And we'll say we only need the seconds, minutes, hours, days, months, and years.
Alright, we've got 6 columns, each of them have 6 characters. Remember, one single character takes up 4 bytes. 4 * 6 * 6. Now our timestamps take up 144 bytes.

So given our algorithm of N * 144, we hit 1kb at just 7 users. Not that much of a difference, but we sure won't fucking scale well at all.
Now we're taking a metric ass-ton (abbreviated as AT) of space for only timestamps. We haven't even stored more than one, and we didn't calculate storage space for any other columns.
So by the time we add everything else back in, our user table is now taking up more than an AT. We're now taking kilo-ass-tons of storage space. kAT.

So we're taking several kATs of space, and we haven't even considered actual usage with this.
Time to query the database.

SELECT `seconds`,`minutes`,`hours`,`days`,`months`,`years`,`uid`,`username`,... FROM `pain_and_suffering.users` WHERE 1;

Could you imagine having to write that query several times over?
Don't worry, you don't have to, because in our fictional story, Bill did it for you.
Bill got fired by the way. Google didn't like how badly their database was running. And it's your lucky day, because site-wide, you get to fix this.
Oh, no, Google isn't running a query generator or anything. They haven't moved on to more modern technologies yet because they don't know how secure something like Laravel is or isn't.

This might be a good time to mention that if we just used the datetime column type--which only uses 8 bytes--we would only be at N * 8. We'd hit 125 user records before we hit a kilobyte.
Just some food for thought; we haven't even hit a single AT of taken up storage space yet by using the datetime type.

You've been tasked now with fixing every database query, as well as every bit of source code that deals with Bill's original database schema.
So what do you do? You start a transaction.
Time to write up a bit of code that iterates over every record in the database, parses each column related to time, concatenates it all, converts it to a proper UNIX timestamp,
and stores that in the database instead. Then you make sure everything is good and commit, or rollback if not.

Don't get me wrong; I'm not saying that storing a timestamp as a string is necessarily a bad thing, but don't be like Bill and assume it's going to be the best approach.

Thankfully this story is only fictional. I pity the person who once had to, or one day will have to, fix this completely fictional issue.
Remember, kids, sometimes the best approach is the one most used. We don't all need to be innovators.


0x04close the window, i'm programming

You know, being in quite a few *nix customization communities, and also developing programs primarily for Linux, people often ask me, "Why the fuck do you use Windows?"
Don't worry, the irony is not lost on me. I may be smoov brane, but I'm not senile. Not yet at least.

Now there's a lot of reasons I use it. The most common one I give is that I think linux is a waste of good hardware.
Take that how you will, but I've used linux for a very long time. That being the case though, as far back as I can remember using it, it was only ever prominently on old laptops or machines with weaker hardware.
Any average Linux user will tell you it's because "linux gives your computer a performance boost!", which itself is wrong. Linux just doesn't use as many resources as another operating system, like Windows.
There's no actual performance gain; it just seems that way to the end user because of the freed resources.

The next reason is simply that I don't like babysitting my system anymore. I think it may be because I'm getting older as the days go on, but I don't like doing package management anymore.
I don't like going on a witch hunt for a bug because I did a partial system upgrade and now I can't use my shell, and then having to burn it at the stake because bash depends on readline and can't use the version of readline on my system.
It's mostly beyond me now. I've done it before, I'm tired of doing it, and I don't want to do it again.

Next, I grew out of my customization phase. Don't get me wrong, I love doing desktop customization, and I do dabble in it every blue moon, but only in passing now.
I no longer sit down for hours on end trying to find the "optimal" setup. It's kind of like building a computer or working on a project car. You may get done with it, but it's never complete.

Now if any of these reasons apply to you, that's completely fine. There's nothing wrong with having old hardware, nothing wrong with babysitting a system,
and there's nothing wrong with wanting to customize your system unless you use i3. These are just my reasons.

This still raises the question though: "Sweets... you use Windows, yet you develop for Linux? What are you smoking, and where can I get some?"
To answer your second question, I'm not smoking no matter how hard that may be to believe, and then to answer your first, yes.
Alright, good talk. See you later.

In all seriousness, I still love Linux, despite not using it myself. I especially love developing for it because, simply, it's fun for me. There's no rhyme or reason.
I just enjoy it. And hopefully you or others enjoy using what I make. Even if you don't, at the end of the day, I program for fun (and thank fuck I don't program for a living now, I can tell some horror stories about that).

So yeah, next time someone calls me a retard in a certain "down" Discord server, I'll just refer them here.
And if you're that person; go fuck yourself, there's nothing wrong with using Windows, despite what the insects living in your neckbeard tell you.
Also take a shower.


0x03help, i'm lost in walmart

After bouncing domain ideas off of the forehead of a certain box, I've finally decided on what I want to call my blog.
A common tagline I've used across various social media is "Help, I'm lost in Walmart."
This mostly stems from my inability to think up bios and such, but it's stuck for quite some time.

So what better to call my blog than the lost in walmart blog? My blog isn't strictly tech related, just a blog for me to write whatever comes to mind.

So officially; help, I'm lost in Walmart.

On another note, you may notice in the top right a "previous" and "next" button.
The lost in walmart blog is part of a webcircle, the links go to various sites of my friends, and at some point, back here eventually.
Check out the other sites in the circle, I'm sure something that piques your interest can be found.


0x02__always_inline, linux, and musl

If you're trying to install KISS Linux (or any non-glibc distribution) with a Linux 5.x kernel, you may find that you run into an error trying to build the kernel.

/usr/include/linux/byteorder/little_endian.h:44:8: error: unknown type name '__always_inline'

On a system running glibc, this isn't particularly an issue; __always_inline is defined.
This isn't the case on systems using musl or other libcs, though.

So, as a quick fix, open up /usr/include/linux/swab.h in your flavor of text editor, and include linux/stddef.h just below linux/types.h.

#include <linux/types.h>
#include <linux/stddef.h>

With that, try to build again and you should be up and running.
Several PRs have been made in regards to this issue, so hopefully this will be fixed soon.


0x01pure of heart, dumb of ass

So this is my "blog" of sorts. A little about me, about the work I do, my system, etc.

I'm Gray, I go by Sweets on the internet. I'm 21 years old, born in Texas, I now live in California.
My background is in full stack web development, though I don't work as a developer anymore; now I'm just a lowly C hobbyist.
These days I work as a microminiature repair technician. I solder super tiny circuits and shit.

I believe in clean, readable code. I'm known for various projects in *nix communities--tiramisu, hummingbird, and custard, just to name a few.

Some information on my systems: my desktop is a custom build that I've named "mercury".
I've got a Ryzen 9 3950x @ 4.5GHz, an RTX 2070 Super, a Sabrent Rocket Q4 4TB, and an Asus Strix B550-I, all housed within the beautiful FormD T1.
My primary operating system is Windows 10, which is where I do all of my development. And yes, I see the irony in that.
My primary laptop is a 2021 MacBook Pro 13". The fancy M1 one because I have a bad habit of spending copious amounts of money on unnecessary hardware.

My other systems I use only for testing purposes: a Compaq Presario CQ57, a Toshiba Satellite C655, and an HP 14-dk0045nr,
running Debian, Chromium OS, and KISS Linux respectively.

Some communities that I frequent are r/unixporn and r/sffpc.