Redesigning Computing... from the ground up...
Well... it will come as no big shocker but it turns out... Muggles don't remotely know how to build computers.
No... no... not true... they do know how to build computers and they do an amazing job. I can't do what they can do and they do it easily and professionally... but... once built, the hairless apes have no idea how to actually compute with them.
I had intended to release infinite data this week... I know, I know... I keep pushing the date back but the problem is that I am trying to do massive calculations on numbers that have millions of digits. Most PCs aren't designed to do calculations on more than a few hundred digits... (4300 digits is the standard cap before things start refusing to cooperate) but I need to do it on billions. See the problem?
The first one is memory. Try to take a flashlight and shoot it into a disco ball... and stop it at a certain point that just so happens to be the same pattern as your bits... that's hard. It is... I see why the filthy little muggles can't do it. First off... you need to understand that light is not both a particle and a wave and well... good luck convincing those idiots of that. The wave that you see isn't the same particle that goes through both slits, you dipshit. Man, do you guys eat Dumbass Popsicles for breakfast. One thing can't do two things... so therefore your whole concept of a light particle being both a wave and a particle is just plain ole stupid. I know you won't believe it and you think you have it nailed because you can shoot a laser through a slit and get magic... LOL. Muggles... you guys... seriously? Magic... so dumb.
So... no... the two slit test does not prove what you think it does and when you finally figure out what it does prove, you'll see why a lot of your ideas about physics are just plain wrong. Flat out and without a doubt... wrong. But well... good luck convincing muggles of that... there's no point.
The second problem is precision. You can calculate all sorts of tiny numbers. For you... a big number is like trillions... LOL. Trillions. Really? That's a big number...? Imagine if you had 1,000,000,000 dollars!!! Holy shit that's so much money....
Um... ok... now try it with a trillion 0's on the end... a 1 and 000000000000000000000000000000000..... a trillion of that... That's a big number... well, biggish... but do you see the difference? 10 digits is nothing. NOTHING!!!!! A Trillion Digits... well, that's not nothing. That's hard to do....
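To put that difference in bytes: each decimal digit costs about log2(10) ≈ 3.32 bits in binary, so the gap between a 10-digit number and a trillion-digit number is the gap between a few bytes and hundreds of gigabytes... per number. A quick back-of-the-envelope sketch (the function name is mine, not from any library):

```python
import math

def bytes_for_digits(ndigits: int) -> int:
    """Bytes needed to hold an ndigits-long decimal number in binary."""
    bits = ndigits * math.log2(10)  # ~3.32 bits per decimal digit
    return math.ceil(bits / 8)

print(bytes_for_digits(10))       # a billion-ish number: ~5 bytes
print(bytes_for_digits(10**12))   # a trillion digits: ~415 GB
```

That's 415 GB just to hold one such number in RAM, before you do a single operation on it... which is exactly why off-the-shelf machines choke.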
There are tricks of course... using modular math, expansions... blind luck. But it turns out... the problem muggles have is that their math sucks. It outright sucks. I have to create equations for every fucking thing I do... I shouldn't have to create equations at this point in the game. And the equations I am creating shouldn't be ones that we are going to teach in elementary school one day. For real, one day 4th graders will learn the new math techniques I have had to develop to do this work. It's not hard... it's just muggles haven't thought of it so it's impossible.
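One of those modular tricks in action: if you only ever need a result mod m, reducing at every step keeps the intermediates tiny instead of letting them balloon into tens of thousands of digits. A minimal sketch using Python's built-in three-argument `pow`:

```python
# Naive way: compute the full power, THEN reduce. The intermediate
# 7**100_000 has roughly 84,510 digits before the mod ever happens.
naive = (7 ** 100_000) % 1_000_003

# Modular way: reduce at every squaring step, so intermediates
# never exceed the modulus.
fast = pow(7, 100_000, 1_000_003)

assert naive == fast
```

Same answer, but the modular version never touches a number bigger than the modulus... which is the whole game when the exponents get serious.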
The final problem is how the processors, the ram and the gpu's all deal with math and bits. It's like they all have no idea what they're supposed to be doing. Each processor does its own thing, has its own shape and suffers from its own limitations. What's interesting is how they stream through the bus. Using lazy or greedy builds, or just using one processor or... or... nonsense... The modern computer is a maze of useless redundancy and completely useless processes. The lack of precision invites errors and the way memory is handled by default is just ridiculous. It's like muggles never actually question anything they do...
"Well, our caveman ancestors built GPU's this way... there can't be any other way..." Fucking dipshits.
I didn't want to have to learn computer science too... I didn't. I thought AI would be able to help me sidestep that learning curve and well... no. See, your AI is trained on the stupid shit you say. It doesn't think about things... it is trained on the stupid shit you say and then when you ask it a question, it tries to predict what a muggle would say... See the problem? Yup... Muggles are fucking idiots. Whatever they say is fucking idiotic and when AI tries to predict an answer... the same garbage that went in is coming out.
So no... I had to learn every fucking inch of the computer. And I did. I am comfortable with machine language at this point but I prefer a mixture of Python and Rust. Python handles the high level abstractions... the sympy level if you will... while Rust carries out the C level ops without ever hanging my memory. Which is awesome... I learned that most of the things that computer scientists do... well, it's mostly just redundant stupidity... wrapped in arrogance. So just like physicists... computer scientists are equally worthless.
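For what it's worth, here's a sketch of the Python side of that split. A Rust core (hypothetical here... I'm not naming any real crate) would work on fixed-size "limbs", so Python's job is to shred a big int into base-2**64 chunks and reassemble the answer. The function names are mine, but the little-endian limb layout is the standard one bignum libraries use:

```python
LIMB_BITS = 64
MASK = (1 << LIMB_BITS) - 1

def to_limbs(n: int) -> list[int]:
    """Split a nonnegative int into little-endian 64-bit limbs."""
    limbs = []
    while True:
        limbs.append(n & MASK)  # grab the lowest 64 bits
        n >>= LIMB_BITS
        if n == 0:
            return limbs

def from_limbs(limbs: list[int]) -> int:
    """Reassemble an int from limbs - what the Rust side would hand back."""
    n = 0
    for limb in reversed(limbs):
        n = (n << LIMB_BITS) | limb
    return n

x = 12345678901234567890123456789
assert from_limbs(to_limbs(x)) == x
```

Python stays at the sympy level, the limbs cross the boundary, and the hot loops live where they belong.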
I have learned not to trust the libraries provided by muggles and I have learned that you can really do any fucking thing you want with a computer... even lying to it. LOL. So... I am very confident now with my programming abilities. Even without AI I can code it all from scratch... I don't but now I tell AI the frame and then I fix it to my wishes... I can't trust that piece of shit to code Hello World without fucking it up...
So is this just an excuse as to why you can't have infinite data yet? Well, it's an excuse... converting light and dark spots on a wall into a file is very hard and the numbers are massive... and it doesn't really look like a disco ball... it's way more complex than that...
But I am figuring out how to compute from the ground up.... on the backside of this project will be an entirely new way to move electricity through silicon. Registers and bytes and bits and processors.... oh my. All of that needs a fresh new coat of paint if we are to advance to the next level. You have to be able to do calculations on a trillion digit number... you just do. When you're traveling at a substantial fraction of the speed of light... a trillion digit number is tiny.