I swear I read an article about treaps, but instead of using the weights to balance the tree, they used them to Huffman-encode the search depth, reducing the average access time for heterogeneous fetch frequencies.
I did not bookmark it and about twice a year I go searching for it again. Some say he’s still searching to this day.
Huffman coding assumes your corpus is a string of discrete elements (symbol strings) without any continuous structure (e.g. topology/geometry). With that fairly mild assumption, it gives a recipe to reorganize (transform/encode) your data as a prefix tree that minimizes the bits needed to communicate the contents of your corpus, i.e. reducing (on average) the bits of information you need to identify a specific item. To go back to the analogy from my previous comment above: if the function you are inverting via search has long plateaus, you could simply front-load those as guesses; that's roughly the spirit of Huffman coding, except it eschews monotonicity.
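To make the "frequent items get short codes" idea concrete, here's a minimal Huffman sketch in Python over made-up frequencies (symbol names and numbers are purely illustrative). The same intuition transfers to the treap idea above: weight by fetch frequency, and hot items end up shallow.

```python
# Toy Huffman coder: frequent symbols receive shorter prefix-free codes.
import heapq
from itertools import count

def huffman_codes(freqs):
    """Build a prefix code as a dict: symbol -> bitstring."""
    # Heap entries are (weight, tiebreaker, tree); a tree is either a
    # symbol or a (left, right) pair. The tiebreaker keeps comparisons
    # from ever reaching the unorderable tree payloads.
    tie = count()
    heap = [(w, next(tie), s) for s, w in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        w1, _, a = heapq.heappop(heap)  # two lightest subtrees...
        w2, _, b = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, next(tie), (a, b)))  # ...merge
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix or "0"  # degenerate 1-symbol corpus
    walk(heap[0][2], "")
    return codes

# Illustrative frequencies: 'a' is fetched half the time.
codes = huffman_codes({"a": 50, "b": 25, "c": 15, "d": 10})
# 'a' sits at depth 1; the rare symbols sit deeper.
assert len(codes["a"]) < len(codes["d"])
```

Expected code depth here mirrors expected search depth in the weighted tree: the average number of bits (or comparisons) per lookup drops when lookups are skewed.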
And real farmers have bad days and have difficulty maintaining the prescribed application conditions.
For one, you can't control what the weather does in the afternoon after you've applied it in the morning (and it might take all morning, because farms are huge and you have to tank up again).
Real farmers, all 3,500 of them in the local coop, take careful measures to control everything on large farms: spraying is generally done at night for the cooler temps, rates are watched since overspray costs money, etc. Seed volumes are manually run through the air seeder to calibrate seed weight per acre, and so on.
The trend today is toward AgBot/SwarmBot-type boom sprayers with onboard weather stations for wind speed and air temp, coupled with computer vision to limit spray to actual weeds rather than a broad, even spray across weed and non-weed alike.
Or, third option, it could mean you're one of the few scientifically literate critical thinkers who doesn't jump to one or another extreme, looks for evidence, and understands nuance.
I admit that's probably like 1 out of every 100,000 people, but they're out there
When we dockerized a service, the p95 time in testing notched up a noticeable amount. I was already juggling so much other work at that point that, for shits and giggles, I tried vertically scaling: halving the cluster size and doubling the cores per server. Zeroed out the p95 delta.
OPS gave me shit about it, and I was like kiss my ass, the cluster costs EXACTLY the same and deployments are 25% faster.
I think people forget that in the cloud, bigger servers don't really cost more per core until you get crazy about it.
Because it reads like permission not to think and for a group of supposed intellectuals we spend a lot of fucking time trying not to think.
Even 'grug brained' isn't about not thinking, it's about keeping capacity in reserve for when the shit hits the fan. Proper Grug Brain is fully compatible with Kernighan's Law.
Eeeeeh. You can never trust the client, but the server is far away, so most games have speculative execution of actions and then the server can tell them to fuck off and revert/ignore the action.
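That speculate-then-reconcile loop can be sketched in a few lines. This is a toy model, not any real engine's API; all the names and the 1-D "position" are illustrative. The client applies inputs immediately, keeps un-acked inputs in flight, and when the authoritative server state arrives it snaps to it and replays whatever is still pending.

```python
# Toy client-side prediction with server reconciliation.
# (Names are illustrative; real engines track full input/state snapshots.)
from dataclasses import dataclass, field

@dataclass
class Client:
    position: int = 0
    seq: int = 0
    pending: list = field(default_factory=list)  # inputs not yet acked

    def apply_input(self, move):
        # Predict locally right away so the player sees zero latency.
        self.seq += 1
        self.position += move
        self.pending.append((self.seq, move))
        return (self.seq, move)  # what gets sent to the server

    def on_server_state(self, acked_seq, server_position):
        # Server is authoritative: drop acked inputs, snap to its state,
        # then replay anything still in flight on top of it.
        self.pending = [(s, m) for s, m in self.pending if s > acked_seq]
        self.position = server_position
        for _, move in self.pending:
            self.position += move

client = Client()
client.apply_input(+1)  # predicted: position 1
client.apply_input(+1)  # predicted: position 2
# Server rejects the first move (say, a wall) and acks seq 1 at position 0.
client.on_server_state(acked_seq=1, server_position=0)
assert client.position == 1  # second input replayed on authoritative state
```

The "tell them to fuck off" part is exactly the `on_server_state` snap: the client's optimistic guess gets overwritten, and only the inputs the server hasn't judged yet survive the replay.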
I've seen the video where all these methods were tried to no avail, so I don't have much faith in them. The safest solution is to put the animal down, but of course you have to have something on hand to do that. A 4x2 to the temple should do it. That'll end the aggressor and save the victim.
"Should", maybe, but I've seen a pretty disturbing video where a pit bull took a lot more than one hit... it was multiple minutes of hits. And it only let go after it died, I've never seen anything like it.