I’m surprisingly level-headed for being a walking knot of anxiety.
Ask me anything.
I also develop Tesseract UI for Lemmy/Sublinks
But seriously, this is pretty great. Thanks for putting that together.
That sounds like the one I’m recalling.
They’re usually rated for 20-25 years, but I think I read recently that some are still producing useful power after that.
Yeah, `if`’s are weird in Nginx. The rule of thumb I’ve always gone by is that you shouldn’t try to `if` on variables directly unless they’re basically pre-processed to a boolean via a `map` (which is what the user agent map does).
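For anyone who hasn’t seen that pattern, a minimal sketch of what I mean (the bot names are illustrative, not my actual list):

```nginx
# http context: pre-process the user agent into a boolean
map $http_user_agent $ua_disallowed {
    default        0;
    ~*semrushbot   1;  # illustrative entries
    ~*imagesift    1;
}

server {
    listen      80;
    server_name example.com;

    # `if` on a pre-processed boolean is one of the safe uses
    if ($ua_disallowed) {
        return 444;
    }
}
```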
They wouldn’t even have to sell it. Just make it available for EV charging, let customers swipe their card when they park, and top up while they shop.
I don’t know if that’s any less costly than selling the power to the grid, but it might help recoup the costs quicker.
You can only get back what you paid for after 10 years.
Another way to look at it: It used to be 20-25 years, so 10 is probably the best it’s ever been for ROI.
Yeah, that’s pretty recent. All my chargers are 100W or less. I think the 240W is achieved by increasing the voltage to 48V while keeping the maximum current at 5A (48V × 5A = 240W).
Have never seen it, but will take your word on it. lol
There are some hardware tools on Amazon that will test the power capacity, but I’m not sure about speed. Some check the characteristics of the cable, so it stands to reason that if those characteristics are within spec, it should perform as expected.
Something like this does USB-A and USB-C: https://www.amazon.com/Eversame-Multimeter-Voltmeter-Indicator-DC3-6-30V/dp/B07JYVPLLJ
Not sure how good / reliable / accurate they are. I’ve seen some homemade projects that aim to do something similar, but if you’re like me and not great with a soldering iron, you might need to make do with something from Amazon lol.
The sad fact is that USB-C cables are just a confusing mess of optional features. I tend to just buy ones that are rated for 100W power delivery and have video support; those tend to cover all my bases.
And, naturally, we’d be hauling boomboxes blasting gangsta rap in the baskets of our mobility scooters. lol.
Our generation’s old-folks home gonna be lit
Less apparent with this one (I just used a clock face generator), but if the hour hand were longer, as is the case with some clocks, it would look a LOT like 5 minutes till 4:00. I’m assuming OP’s shift ends at 4:00 PM.
Those are also the “mall rat” generations, so it’d be pretty fitting lol.
Fingertips for me, lol.
Ah, the old school electric car cigarette lighter. Also known as "The curious child’s first learning experience with the concept of ‘hot’."
I already block all the LLM scraper bots via user agent.
I’ve been toying with the idea of, instead of returning 404 for those requests, returning LLM-generated drivel to poison the well.
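I haven’t built that part yet, but in principle it’s just a different action on the same `$ua_disallowed` boolean. A rough sketch, where the poison page and its location are hypothetical:

```nginx
# Hypothetical: reroute disallowed bots to pre-generated junk
# instead of closing the connection
if ($ua_disallowed) {
    rewrite ^ /poison.html last;
}

location = /poison.html {
    # a page of LLM-generated drivel, regenerated out-of-band
    root /var/www/poison;
    internal;  # only reachable via the rewrite above
}
```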
Good. Those models flooded the internet with shit, so they can eat it.
“Don’t shit where you eat” is solid advice no matter the venue.
How much of your bandwidth do those bots consume?
Pretty negligible per bot per request, but I’m not here to feed them. They also travel in packs, so the bandwidth does multiply. It also costs me money when I exceed my monthly bandwidth quota. I’ve blocked them for so long, I no longer have data I can tally to get an aggregate total (I only keep 90 days of logs). SemrushBot alone, before I blocked it, was averaging about 15 GB a month. That one is fairly aggressive, though. Imagesift Bot, which pulls down any images it can find, would also use quite a bit, I imagine, if it were allowed.
With Lemmy, especially earlier versions, the queries were a lot more expensive, and bots hitting endpoints that triggered a heavy query (such as a post with a lot of comments) would put unwanted load on my DB server. That’s when I started blocking bot crawlers much more aggressively.
Bot traffic on static sites is a lot less impactful, and I usually allow it there. I’ve got a different rule set for those sites which blocks the known AI scrapers but allows search indexers (though that distinction is slowly disappearing).
And by blocking search robots, do you disappear from search results entirely, or are you still listed but without the content in question?
I block bots by default, and that prevents them from being indexed since they can’t be crawled at all. Searching “dubvee” (my instance name / url) in Google returns no relevant results. I’m okay with that, lol, but some people would be appalled.
However, I can search for things I’ve posted from my instance if they’ve federated to another instance that is crawled; the link will just be to the copy on that instance.
For the few static sites I run (mostly local business sites, since they’d be on Facebook otherwise), I don’t enforce the bot blocking, and Google, etc. are able to index them normally.
I just include the `map-bot-user-agents.conf` in my base `nginx.conf` so it’s available to all of my virtual hosts. When I want to enforce the bot blocking on one or more virtual hosts (some I want to leave open to bots, others I don’t), I just include a `deny-disallowed.conf` in the `server` block of those.
```nginx
# Deny disallowed user agents
if ($ua_disallowed) {
    return 444;
}
```
```nginx
server {
    server_name example.com;
    ...

    include conf.d/includes/deny-disallowed.conf;

    location / {
        ...
    }
}
```
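The map file itself is just a big user-agent-to-boolean `map`. A trimmed sketch of what `map-bot-user-agents.conf` might contain (the UA patterns here are examples; the real list is much longer):

```nginx
# conf.d/includes/map-bot-user-agents.conf (trimmed sketch)
# Included at the http level so $ua_disallowed is set for every vhost
map $http_user_agent $ua_disallowed {
    default         0;
    ~*gptbot        1;
    ~*claudebot     1;
    ~*semrushbot    1;
    ~*imagesiftbot  1;
}
```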
So could my TI-89 back in the 90s. And four AAA batteries would run it for weeks.