• 1 Post
  • 20 Comments
Joined 1 year ago
Cake day: January 16th, 2024

  • The attitude towards theoretical computer science re quantum is really weird. Some people act as if “I can’t run it now, therefore it’s garbage”, which is just a nonsense approach to any kind of theoretical work.

    Turing wrote his seminal paper in 1936, over 10 years before the transistor was invented. Most of CS theory was developed way before computers proliferated. A lot of research into ML was done way before we had enough data and computational power to actually run, e.g., neural networks.

    Theoretical CS doesn’t need to be recent, it doesn’t need to run, and it isn’t shackled to the current engineering state of the art; all of that is good and by design. Let the theoreticians write their fucking theorems. No one writing a theoretical paper makes any kind of promise that the described algorithm will EVER be run on anything. Quantum complexity theory, for example, was developed in the nineties, when there was NO quantum computer and no one was even envisioning a quantum computation happening in physical reality. Shor’s algorithm was devised in 1994, when the hardware to run even a toy instance of it didn’t exist.

    I find the line of argumentation “this is worthless because we don’t know whether a quantum computer is even feasible to engineer” to be:

    1. Insulting,
    2. Stupid,
    3. Lacking whimsy,
    4. Unscientific at its core.





  • but i still think that it’s a little suspect on the grounds that we have no idea how many times they had to restart training due to the model borking, other experiments and hidden cost

    Oh ye, I totally agree on this one. This entire genAI enterprise insults me on a fundamental level as a CS researcher: there’s zero transparency or reproducibility, no one reviews these claims, and it’s a complete shitshow, from terrible, terrible benchmarks, through shoddy methodology, up to untestable and bonkers claims.

    I have zero good faith in the press, though; they’re experts at painting any and all tech claims in the best light possible like their lives fucking depend on it. We wouldn’t be where we are right now if anyone at any “reputable” newspaper like the WSJ had asked Sam Altman one (1) question like 3 years ago.


  • Okay, I mean, I hate to somehow come to the defense of a slop company? But the WSJ saying nonsense is really not DeepSeek’s fault; even that particular quote clearly says “DeepSeek said training one” cost $5.6M. That’s just a true statement. No one in their right mind includes capital expenditure in that, the same way “it took us 100h to train a model” doesn’t include building a data center in those 100h.
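
    For scale, the arithmetic behind a headline number like that is dead simple. A sketch (the figures here are illustrative, chosen only to line up with the $5.6M headline): a reported “training cost” is basically rented GPU-hours times an hourly rate for the final run.

    ```c
    #include <stdio.h>

    int main(void) {
        /* Illustrative figures only, picked to match the $5.6M headline:
         * a headline "training cost" is rented GPU-hours times a rate. */
        double gpu_hours = 2.8e6; /* GPU-hours of the final training run */
        double rate_usd  = 2.0;   /* rental price per GPU-hour           */
        printf("training cost: $%.1fM\n", gpu_hours * rate_usd / 1e6);
        /* Capex (the data center, the GPUs themselves) is simply not in
         * this number, the same way "100h to train" doesn't include the
         * time it took to build the cluster. */
        return 0;
    }
    ```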

    Setting aside whether they actually lied or not, it’s still immensely funny to me that they could’ve just told a blatant lie nobody fact-checked, and it shook the market to its fucking core, wiping off like billions in valuation. Very real market based on very real fundamentals run by very serious adults.




  • don’t even get me started on “whole language learning” and “new math”

    I don’t know what “whole language learning” is, and I’m way too young to have experienced it, but wasn’t the curriculum before “new math” like arithmetic and nothing else? In other words, not math at all?

    I haven’t read much about it, but from what I have, it seems like they started teaching children actual math, like algebra and logic, and parents got frustrated because they were too stupid to help with homework anymore. Brings to mind the whole “math was cool before they involved letters” thing that makes me want to throw a book at someone.




  • This is a really weird comment. Assembly is not faster than C; that’s a nonsensical statement, because C compiles down to assembly. LLVM’s optimizations will most likely outperform or directly match whatever hand-crafted assembly you write. Why would BEQ 1000 be “considerably faster” than if (x == y) goto L_1000;? This collapses even further for any application larger than a few hundred lines of code: any sensible compiler is going to beat you on optimizations if you try to write hand-crafted assembly. Try loading up assembly code and manually performing intraprocedural optimizations, lol; there’s a reason every compiled language goes through an intermediate representation.
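
    To make that concrete, here’s a minimal sketch (my example, not the original commenter’s; assuming x86-64 and something like gcc -O2, and the exact instructions will vary):

    ```c
    #include <stdio.h>

    /* A compiler lowers the comparison straight to a machine-level
     * compare-and-branch, e.g. "cmp edi, esi; je .L_1000" on x86-64;
     * for a 0/1 result like this it may even go branchless with
     * "cmp; sete", which is exactly the point: the compiler picks
     * the lowering, and a hand-written BEQ buys you nothing. */
    static int same(int x, int y) {
        if (x == y) goto L_1000;
        return 0;
    L_1000:
        return 1;
    }

    int main(void) {
        printf("%d %d\n", same(3, 3), same(3, 4)); /* prints: 1 0 */
        return 0;
    }
    ```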

    Saying that C# is slower than C is also nonsensical, especially now that C# has built-in PGO; it’s very likely it could outperform an application written in C. C#'s JIT compiler is not somehow slower because it’s flexible in terms of hardware; if anything, that’s what makes it fast. For example, you can write a vectorized loop that will be JIT-compiled to the fastest instruction set available on the CPU running the program, whereas in C or assembly you’d have to manually write a version for each. There’s no reason to think the manual implementation would be faster than what the JIT comes up with at runtime, though, especially with PGO.
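
    A sketch of what “manually write a version for each” looks like on the C side (the function name and the AVX2 choice are mine; the point is that the ISA is fixed at build time, where a JIT like C#'s picks it on the user’s machine):

    ```c
    #include <stdio.h>
    #include <stddef.h>

    #if defined(__AVX2__)
    #include <immintrin.h>
    /* This version only exists if the build targeted AVX2 (e.g. -mavx2):
     * 8 floats per iteration. */
    void add(float *dst, const float *a, const float *b, size_t n) {
        size_t i = 0;
        for (; i + 8 <= n; i += 8) {
            __m256 va = _mm256_loadu_ps(a + i);
            __m256 vb = _mm256_loadu_ps(b + i);
            _mm256_storeu_ps(dst + i, _mm256_add_ps(va, vb));
        }
        for (; i < n; i++) dst[i] = a[i] + b[i]; /* leftovers */
    }
    #else
    /* Scalar fallback for every other build target. */
    void add(float *dst, const float *a, const float *b, size_t n) {
        for (size_t i = 0; i < n; i++) dst[i] = a[i] + b[i];
    }
    #endif

    int main(void) {
        float a[10] = {1,2,3,4,5,6,7,8,9,10};
        float b[10] = {1,1,1,1,1,1,1,1,1,1};
        float c[10];
        add(c, a, b, 10);
        printf("%g %g\n", c[0], c[9]); /* prints: 2 11 */
        return 0;
    }
    ```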

    It’s kinda like you’re saying that a V12 engine is faster than a Ferrari and that they are both faster than a spaceship because the spaceship doesn’t have wheels.

    I know you’re trying to explain this to a non-technical person, but what you said is so terribly misleading that I cannot see any educational value in it.



  • Also I’m sorry but

    Why the discrepancy? A footnote in the CE Delft report makes it clear: the price figures for macronutrients are largely based on a specific amino acid protein powder that sells for $400 a ton on the sprawling e-commerce marketplace Alibaba.com.

    This is exactly the sort of magical thinking I’m talking about: “it will scale because we can order tons of the stuff off Alibaba”. Just what the fuck are you smoking, mate? This can’t be good-faith analysis.


  • Very good read, but throughout I can’t help but say to myself “ye so the issue is scale. AS ALWAYS”

    This is a tale as old as time. Fusion energy is here! Quantum computers will revolutionise the world! Lab-grown meat! All based on actual scientific experiments and progress, but tiny, one-shot experiments under best-case conditions. There is no reason to think it brings us closer to a future where those are commonplace, except for a very nebulous technical meaning of “closer” as “yes, time has passed”. There is no reason to think any of this would ever scale! Like, there is a chance that e.g. fusion energy at any meaningful scale is just… impossible? Like, physically impossible to do. Or a stable quantum computer able to run Doom. Or lab-grown meat on a supermarket shelf. Every software engineer should understand this: we know there are ideas that work only in a limited setting (number of threads, connections, size of input, whatever).
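
    (A toy illustration of that last point, invented for this comment: a quadratic duplicate check is a perfectly fine “experiment” at tiny n and physically hopeless at large n; working small says nothing about working at scale.)

    ```c
    #include <stdio.h>
    #include <stdbool.h>
    #include <stddef.h>

    /* O(n^2): instant for a thousand elements, ~10^18 comparisons for
     * a billion. The small-scale demo "works"; the idea doesn't scale. */
    static bool has_duplicate(const int *xs, size_t n) {
        for (size_t i = 0; i < n; i++)
            for (size_t j = i + 1; j < n; j++)
                if (xs[i] == xs[j]) return true;
        return false;
    }

    int main(void) {
        int small[] = {1, 2, 3, 2};
        printf("%d\n", has_duplicate(small, 4)); /* prints: 1 */
        return 0;
    }
    ```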

    The media is always terrible at communicating this. Science isn’t fucking magic; the fact that scientists were able to put one more qubit into their quantum computer means literally nothing to you, because the answer to “when will we have personal quantum computers” is “what? how did you get into my lab?”. We have no idea. 50 years? 100 years? 1000 years? Likely never? Which number can I pull out of my ass for you to fuck off and let me do my research in peace? Of course science is amazing, and reading about those experiments is extremely interesting and cool as all fuck, but for some fucking reason the immediate reaction of the general public is “great, how quickly can we put a price tag on it”.

    And this leads to this zeitgeist where the next great “breakthrough” is just around the corner and is going to save us all. AI will fix the job market! Carbon capture will fix climate change! Terraforming Mars will solve everything! Sit the fuck down and grow up, this is not how anything works. I don’t even know where this idea of “breakthroughs” comes from; the scientific process isn’t an action movie with three acts and a climax, who told you that? What even was the last technological “breakthrough”? Transistors were invented like 70 years ago, but it wasn’t an immediate breakthrough; it required like 40 years of work on improving vacuum tubes to get there. And that was based on a shitton of work on electrical theory from the 19th century. It was a slow process of incremental scientific discoveries across nations and people, which culminated in you having an iPhone 200 years later. And that’s at least based on something we can actually easily observe in the natural world (and, funnily enough, we still don’t have a comprehensive theory of how lightning storms even form on Earth). With fusion you’re talking about replicating the heart of a star here on Earth; with lab-grown meat you’re talking about growing flesh in defiance of gods; and you think it’s an overnight thing where you’ll wake up tomorrow and suddenly bam, we just have cold fusion and hot artificial chicken?

    I hate how everyone seems to be addicted to, I don’t know, just speed as a concept? Things have to be now; news is only good if it arrives breaking within 5 minutes; science is only good if it’s just around the corner; a product is only good if it gets one billion users in a month. Just calm the fuck down. When was the last time you smelt the roses?

    If you keep running through life, all the roses are gonna burn down before you realise.



  • I had no idea so much of C++ and the Committee was so closely linked to the military industrial complex. Like people who design fucking murder drones just casually send their requests to them and they read them and care? And Bjarne Cplusplus, the inventor of C++, helped Lockheed Martin on the F22???

    No, seriously, sorry, I cannot put myself into a hypothetical headspace where someone sending me a letter “hello, we need this feature to kill civilians better, thanks” isn’t interpreted as a prank, since if it weren’t, the only acceptable response would be to return a pipe bomb to the sender.