60k isn’t that much; I frequently run scripts against hundreds of thousands of rows at work. Wtf is he doing? Did he duplicate the government database onto his 2015 MacBook Air?
60k is laughably, embarrassingly small. It’s still sqlite-sized.
Sqlite can easily handle millions of rows. Don’t sell it short
How about a 6.4TB sqlite database?
https://boyter.org/posts/searchcode-bigger-sqlite-than-you/
Should be enough to hold 60k rows
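For scale, here’s a throwaway sketch with Python’s built-in sqlite3 module (in-memory DB, made-up table name and row count, just to illustrate): a few million rows insert and aggregate in seconds on an ordinary laptop.

```python
# A quick scale check, not a benchmark: a few million rows in an in-memory
# SQLite database (table name and row count are made up for illustration).
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (id INTEGER PRIMARY KEY, amount REAL)")

start = time.perf_counter()
conn.executemany(
    "INSERT INTO payments (amount) VALUES (?)",
    ((i * 0.01,) for i in range(3_000_000)),
)
conn.commit()
print(f"inserted 3M rows in {time.perf_counter() - start:.1f}s")

start = time.perf_counter()
(total,) = conn.execute("SELECT SUM(amount) FROM payments").fetchone()
print(f"summed them in {time.perf_counter() - start:.2f}s: {total:,.2f}")
```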
I’m not
I have an sqlite db that is a few GB in size; a game uses the format for its saves. Sadly it’s almost all blob data. I’d love to play with it if it were a bit more readable.
I mean, it’s even Excel-sized, depending on how many columns. This is seriously sad and alarming.
Hey now that’s real close to the 65,535 16-bit limit (from 20 years ago)
Holy shit, if this is a 16-bit limit issue that’s too funny
60k is a single JSON file
A TI-86 can query 60k rows without breaking a sweat.
If his hard drive overheated from that, he is doing something very wrong, very unhygienic, or both.
He’s probably mining crypto on top of running his SQL queries.
What? You don’t run your hard drives in the oven while baking brownies? It makes them zesty.
There must be more join statements than column names
I’ve run searches over 60k lines of raw JSON on a 2015 MacBook air without any problems.
Don’t know what Elmo’s minions are doing, but I’ve written code at least as inefficient. It was quite a few years ago (the code was written in Perl) and I at least want to think that I’m better now (but I’m not paid to code anymore). The task was to pull in data from a CSV (or something like that; as I said, it’s been a while) and convert it to XML (or something similar).
The idea behind my code was that you could just configure which fields you want from arbitrary source data and where to place them in whatever supported destination format. I still think the basic idea behind that project is pretty neat: throw in whatever you happen to have and get something completely different out the other end. And it worked as it should. It was just stupidly hungry for memory. 20k entries would eat up several gigabytes of memory on a workstation (and back then it was a premium to have even 16 GB around), and it was also freaking slow to run (like 0.2 - 0.5 seconds per entry).
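If you squint, the core of it looked something like this rough Python sketch (the field names and config here are invented just to show the shape of the idea; the real thing was Perl and far more general):

```python
# Hypothetical sketch of the config-driven mapper idea (field names and config
# invented for illustration; the real thing was Perl and far more generic).
import csv
import xml.etree.ElementTree as ET

# Configure which source fields you want and which destination elements they become.
FIELD_MAP = {"name": "CustomerName", "email": "ContactEmail", "total": "OrderTotal"}

def csv_to_xml(csv_path: str, xml_path: str) -> None:
    root = ET.Element("Records")
    with open(csv_path, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):  # read one source row at a time
            record = ET.SubElement(root, "Record")
            for src, dst in FIELD_MAP.items():
                ET.SubElement(record, dst).text = row.get(src, "")
    ET.ElementTree(root).write(xml_path, encoding="utf-8", xml_declaration=True)

csv_to_xml("orders.csv", "orders.xml")
```

Written that way, even keeping the whole output tree in memory, 20k rows should be maybe tens of megabytes, so my gigabytes of RAM were clearly going somewhere much dumber.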
But even then I didn’t need to tweet that my hard drive was overheating. I understood full well that my code was just bad, and I even improved it a bit here and there, but it was still very slow and used ridiculous amounts of RAM. The project was pretty neat, and when you had a few hundred items to process at a time it was even pretty good; there were companies that relied on that code and paid for support. It just totally broke down with even slightly bigger datasets.
But, as I already mentioned, my hard drive didn’t overheat on that load.
No, it’s an external drive, apparently.
I mean, if we were to sort of steelman this thing, there sure can be database schemas and queries that hit only 60k rows but are still heavy as fuck.
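To illustrate with a contrived Python/sqlite3 sketch (made-up table and join condition, obviously not whatever they actually ran): an unindexable self-join over 60k rows turns into a couple of billion row comparisons and will happily peg a CPU core for minutes, with no big table in sight.

```python
# Contrived example: only 60k rows, but an unindexable self-join degenerates
# into roughly 60k * 60k row comparisons (expect this to run for minutes).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val INTEGER)")
conn.executemany(
    "INSERT INTO t (val) VALUES (?)",
    ((i * 7919 % 60_000,) for i in range(60_000)),
)
conn.commit()

# The join condition is an expression on both sides, so no index (not even an
# automatic one) can help; SQLite falls back to a nested full scan.
query = """
    SELECT COUNT(*)
    FROM t AS a
    JOIN t AS b ON (a.val % 977) = (b.val % 983) AND a.id < b.id
"""
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())  # SCAN a, SCAN b
print(conn.execute(query).fetchone())
```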