This. Check out the “Psychological horror” tag for some excellent examples of trolltagging.
For me, a good interview is a dialogue where the company representative shows me as much about the company as I show them about myself as a candidate. Take-home tasks are okay, I guess, but I suspect they’d balk at me asking them to handle a mock HR issue, or whatever, in return!
Yeah, that’s my reading as well.
If you’re optimizing that hard you should probably sort the data first anyway, but yeah, sometimes it’s absolutely called for. Not that I’ve actually needed that in my professional career, but then again I’ve never worked close enough to the metal for it to actually matter.
That said, all of these are implemented as functions, so you’re already paying the cost of a function call anyway…
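On the “sort the data first” point: I don’t know the exact context here, but as a rough sketch of the idea, paying one up-front sort can turn each later lookup from a linear scan into a binary search. The data and the `contains` helper below are made up for illustration.

```python
# Hypothetical sketch of "sort the data first": one O(n log n) sort
# up front lets every later membership probe use an O(log n) binary
# search via the standard-library bisect module.
import bisect

data = [42, 7, 19, 3, 88, 61]
data.sort()  # pay the sorting cost once

def contains(sorted_seq, value):
    """Binary-search membership test on an already-sorted sequence."""
    i = bisect.bisect_left(sorted_seq, value)
    return i < len(sorted_seq) and sorted_seq[i] == value

print(contains(data, 19))  # True
print(contains(data, 20))  # False
```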
This is why I think school and interviews are like a whole different universe from the one where actual work gets done.
“No, Vaas, that’s the definition of practice.”
…well, I do that, and enjoy it, so I guess that’s why I feel like an impostor who has their hobby for a job. “If they figure out how much I enjoy doing this, they’ll cut my pay…”
There is a “Not from a Jedi” joke in here somewhere. I can feel it.
The comments are not for the what; they are for the why.
The documentation is a summary of the code: a quick guide to the software that helps you find your way to whatever you need to work with.
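To make the what/why split concrete, here’s a hypothetical sketch; the constant and the comments are invented for illustration.

```python
# A "what" comment just restates the code:
MAX_RETRIES = 3  # set MAX_RETRIES to 3

# A "why" comment records reasoning the code itself can't express:
MAX_RETRIES = 3  # the upstream gateway drops connections under load;
                 # one retry is rarely enough, and more than three
                 # just delays the inevitable error page
```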
Are you saying that when you work with some random library, you skip their documentation and go directly to the source code? That’s absurd. If you do it that way, you’re wasting so much time!
“Water accused of being wet in lawsuit” next, I guess.
Hah, that’s part of what I want to do with z0rz.net.
Okay, I’ll take your word for it.
I’ve never ever, in many hours of playing with ChatGPT as a toy, had it make up a word. Hallucinate wildly, yes, but not stogulate a word out of nothing.
I’d love to know more, though. How does it combine new words? Do you have any examples of words ChatGPT has made up? This is fascinating to me, as it means the model is much less chained to the training data than I thought.
I used to do a lot of work in vim, over SSH. Five PuTTY windows, one of which was always showing cmatrix.
I shared an office with a Business Analyst; he was way more impressed with my “matrixing” than I was with his “spreadsheeting”.
Yep yep, statistical analysis of the frequency of tokens in the training text.
Brand new, never-before-seen Windows keys have a frequency of zero occurrences per billion words of training data.
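A minimal sketch of that point, with a made-up corpus and a made-up “key”: a model driven purely by training-text frequencies has nothing to say about strings it has never seen.

```python
# Count token frequencies in a (tiny, invented) training corpus.
from collections import Counter

corpus = "the cat sat on the mat so the cat ran".split()
freq = Counter(corpus)

print(freq["the"])                 # 3 -- common token
print(freq["NEVER-B4-SEEN-KEY"])   # 0 -- zero occurrences, as above
```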
I use tabs because I prefer my indentation displayed 4 columns wide, while others might prefer 2-wide or the gross and unacceptable 6-wide indentation.
If more than one person is working on a code base, there will likely be more than one preference, and with tabs everyone gets to just set their own tab width (see the sketch below).
Yes, even the 3-space savages.
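One way to pin that down in a shared repo is a hypothetical .editorconfig like this: indentation is tabs, and how wide a tab looks stays each reader’s own editor setting.

```ini
# .editorconfig (sketch) -- enforce tabs for indentation,
# deliberately NOT setting tab_width, so every editor renders
# tabs at whatever width its user prefers.
root = true

[*]
indent_style = tab
```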
Trolling is a art.