I’d be more than happy to sacrifice a distro I don’t care about like Ubuntu to the mainstream if it means Microsoft’s market cap gets a sizeable chunk taken out of it.
I guess, in a very liberal definition of the term, “cloud gaming”. Specifically the old LodgeNet systems in hotels where you could rent Nintendo games by the hour, streamed to your room from a physical console somewhere behind the front desk. Every room had a special controller hardwired to the television, covered in oodles of extra buttons, that also doubled as the TV remote.
The service was objectively awful, of course, when factoring in how much the hotel charged compared to what little you got for it. But I’ve always found it fascinating.
My true hell would be instances only federating explicitly through whitelists. If what the other reply I received about Mastodon says is correct, and if Lemmy behaves similarly, then they operate on implicit auto-federation with every other instance. An actual transfer of data needs to be triggered by some user on that instance reaching out to the other instance, but there’s no need for the instances involved to whitelist one another first. They just do it. To stop the transfer, they have to explicitly defed, which effectively makes it an opt-out system.
The root comment I initially replied to made it sound, to me, like Mastodon instances choose not to federate with one another. Obviously they aren’t preemptively banning one another, so I interpreted that to mean Mastodon instances must whitelist one another to connect. But apparently what they actually meant was, “users of Mastodon instances rarely explore outward”? The instances would auto-federate, but in practice, the “crawlers” (the users) aren’t leaving their bubbles often enough to create a critical mass of interconnectedness across the Fediverse?
The fact we have to have this discussion at all is more proof to my original point regardless. Federation is pure faffery to people who just want a platform that has everything in one place.
That sounds worse than I thought it was. I just assumed Mastodon was like Lemmy, where every instance federates with every other instance basically by default and there’s only some high-profile defed exceptions.
A Fediverse where federation is opt-in instead of opt-out sounds like actual hell. Yeah, more control to instances, hooray, but far less seamless usability for people. The only people you will attract with that model are the ones who think having upwards of seven alts for being in seven different communities isn’t remotely strange or cumbersome. That, and/or the ones self-hosting their own individual instances. Neither of these describes the behavior of the vast majority of Internet users, who want to sign up on a platform that just works, with one account that can see and interact with everything.
Season’s the reason!
Art supplies were historically not cheap. If you wanted to do this for a living, you probably needed to aim at selling your art to the rich upper class. That implicitly meant catering to their fickle tastes and working on commission. You didn’t make art for yourself and find your audience later; you made art for the customers you had, or you starved.
And to put it bluntly, realism wasn’t the fashionable hotness for most of human history. The more “crude” styles you may think of as objectively inferior to, and less technically impressive than, realism were in fact the styles in demand in their respective eras. Fashion existed in ancient and medieval times just like it does today, and those styles were the fashion.
The idea of the independent, eccentric artist who lives secluded in their ideas cave, producing masterpieces for no one in particular and leaving the world in awe at their genius every time they come out with something to show, is a very modern concept. If any artist wanted to make a realist painting in an era when it was not popular, they’d be doing it purely for themselves, at their own expense. So virtually no one did. Or if they did, their works largely didn’t survive.
This is basically asking why anyone would live in or near a city like Los Angeles or New York City when Minot exists and has everything you could possibly need.
If you had to look up where Minot even is, you’ve proven my point.
Say what you will about whether living near the proverbial big city is worth it or not. But it cannot be denied, there is a world of experiences on offer at larger platforms that a smaller platform simply cannot provide. Network effect can be a cruel mistress.
I’m pretty sure they’re referring to the concept of defederation and how that can splinter the platform.
Bluesky is ““federated”” in largely the same ways as Mastodon, but there’s basically one and only one instance anyone cares about. The federation capability is just lip service to the minority of dorks like us who care.
To the vast majority of Twitter refugees, federation as a concept is not a feature, it’s an irritation.
I don’t think the existence of large instances is in itself strictly antithetical to decentralization. The network effect makes them inevitable.
The power of the Fediverse is that everyone has a standard toolset to interact with the entire Fediverse. Most people won’t, and that’s okay. The important thing is that, should larger communities become too oppressive as they gentrify, replacing them is a cheap decision: you and everyone like-minded with you can squad up and leave at any time and lose nothing, because the standard tooling of the platform facilitates that migration. You have mobility in the Fediverse, and that permits choice to those who seek it.
This will stop being true once the larger instances start augmenting their experiences with proprietary nonsense. Features that only work there, that you can invest in and become dependent on, that you’d have to give up if you leave.
The day that happens will be the day that chunk of the Fediverse dies. Or, well, it won’t die, it will probably flourish and do very well. But it won’t be the Fediverse anymore. It will just be another knee-high-fence-gated community, that happens to run on Fediverse tech.
Happy Debian daily driver here. I would never ever recommend raw Debian to a garden variety would-be Linux convert.
If you think something like Debian is something a Linux illiterate can just pick up and start using proficiently, you’re severely out of touch with how most computer users actually think about their machines. If you even so much as know the name of your file explorer program, you’re in a completely different league.
Debian prides itself on being a lean, no-bloat, stable environment made only of truly free software (with the ability to opt in to nonfree software). To people like us, that’s a clean, blank canvas on a rock-solid, reliable foundation that won’t enshittify. But to most people, it’s an austere, outdated, and unfashionable wasteland full of flaky, ugly tooling.
Debian can be polished to any standard one likes, but you’re expected to do it yourself. Most people just aren’t in the game to play it like that. Debian saddles you with questions of choice that almost no one is asking, or frankly, even knew were ask*-able* in the first place. Mandatory customizability is a flaw, not a feature.
I am absolutely team “just steer them to Mint”. All the goodness of Debian snuck into their OS like medicine in a kid’s dessert, wrapped up in something they might actually find palatable. Debian itself can be saved for when, or shall I say if, the user eventually goes poking under the hood to discover how the machine actually ticks.
Everything works the same, times of website incompatibility are long gone.
Not completely true. It’s mostly true. I’ve daily-driven Firefox for years, and the number of websites I’ve come across that wouldn’t function correctly in it but would work just fine in Chrome is very slim… but not zero. Definitely not comparable to the complete shitshow of the ’90s and ’00s, that’s true. But it’s not a completely solved problem.
And with Mozilla’s leadership practically looking for footguns to play with, combined with the threat of Google’s sugar-daddy checks drying up soon due to the antitrust suit (how utterly ironic that busting up the monopoly would actually harm the only competition…), that gap could get much worse in very little time if the resources to keep full-time devs paid disappear.
I recognize three kinds of comments that have different purposes.
The first kind are doc block comments. These are the ones that appear above functions, classes, class properties, and methods. They usually have a distinct syntax with tags, like:
/**
* A one-line description of this function's job.
*
* Extra details that get more specific about how to use this function correctly, if needed.
*
* @param {Type} param1
* @param {Type} param2
* @returns {Type}
*/
function aFunctionThatDoesAThing(param1, param2) {
// ...
}
The primary thing these are used for is automatic documentation generators. You run a program that scans your codebase, looks for these special comments, and automatically builds a set of documentation that you could, say, publish directly to a website. IDEs can also use them for tooltip popups. Generally, you want to write these as if the reader won’t have the actual code in front of them. Because they might not!
The second kind is standalone comments. They take up one or more lines all to themselves. I look at these like warning signs, for when there’s something about the upcoming chunk of code that doesn’t obviously tell the whole story by itself. Perhaps something like:
/* The following code is written in a weird way on purpose.
I tried doing <obvious way>, but it causes a weird bug.
Please do not refactor it, it will break. */
Sometimes it’s tempting to use a standalone comment to explain what dense, hard-to-read code is doing. But ideally, you’d want to shunt that code off into a function named after what it does instead, with a descriptive doc comment if you can’t cram it all into a short name. Alternatively, rewrite the code to be less confusing. If you genuinely need the chunk of code to stay in its confusing form, because a less confusing way doesn’t exist or doesn’t work, then this kind of comment explaining why is warranted.
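To illustrate the “shunt it off into a function” idea, here’s a made-up sketch (the users array and its field names are purely hypothetical):

// Hypothetical data, just for the example
const users = [
  { name: "alice", deletedAt: null, lastSeen: Date.now() },
  { name: "bob", deletedAt: null, lastSeen: Date.now() - 90 * 24 * 60 * 60 * 1000 },
];

// Before: dense enough that it begs for an explainer comment
const active = users.filter(u => !u.deletedAt && (Date.now() - u.lastSeen) < 30 * 24 * 60 * 60 * 1000);

// After: the function name carries the explanation instead
const THIRTY_DAYS_MS = 30 * 24 * 60 * 60 * 1000;

function isActiveWithinThirtyDays(user) {
  return !user.deletedAt && (Date.now() - user.lastSeen) < THIRTY_DAYS_MS;
}

const activeUsers = users.filter(isActiveWithinThirtyDays);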
The last kind are inline comments. More or less the same use case as above, the only difference being they appear on the same line as code, usually at the very end of the line:
dozen = 12 + 1; // one extra for the baker!
In my opinion, these comments have the least reason to exist. Needing one tends to be a signal of a code smell, where the real answer is just rewriting the code to be clearer. They’re also a bit harder to spot, being shoved at the ends of lines. That’s especially true if you don’t enforce maximum line length rules in your codebase. But that’s mostly personal preference.
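To be concrete, the baker’s dozen line above could just as easily explain itself without the comment (a trivial sketch):

// The name carries the information the comment used to
const BAKERS_DOZEN = 12 + 1;
const orderSize = BAKERS_DOZEN;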
There’s technically a fourth kind of comment: commented-out code, where you select a chunk of code and convert it to a comment to “soft-delete” it, just in case you might want it later. I highly recommend against this. This is what version control software like Git is for. If you need it again, just roll back to it. Don’t leave it to rot in your codebase, taking up space in your editor and being an eyesore.
That’s the thing, though. I computed that from the claimed figure above of $13 billion net income. The costs are already accounted for.
If you make $50k/yr after taxes, the equivalent fine would be on the order of about $120.
Where I’m from, that’s a speeding ticket.
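If anyone wants to check my napkin math, it’s just a proportion (rough numbers; the ≈$31M at the end is only what my comparison implies, back-calculated from the figures above, not an exact quote of the fine):

// Scale the fine by income: what share of $13B does it represent,
// and what does that same share look like at $50k/yr?
const companyNetIncome = 13e9;   // the claimed $13 billion net income
const personalIncome = 50e3;     // hypothetical $50k/yr after taxes
const equivalentFine = 120;      // the figure I quoted

const shareOfIncome = equivalentFine / personalIncome;  // 0.0024, i.e. 0.24% of income
console.log(shareOfIncome * companyNetIncome);          // ≈ 31,200,000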
Emojis to me are like a strongly flavored seasoning: only appropriate in specific contexts, and even in those contexts, just a pinch goes a long way. Too much and it detracts from the experience.
Emojipasta is grossly overseasoned food. But that’s the point, obviously. It’s the emoji version of those white women on TikTok who throw three pounds of ground beef wrapped around an entire block of cheese into a baking sheet full of milk and bake it in the oven for rage clicks.
Me, personally, I usually don’t need emoji seasoning. I’m fine with it plain. Besides, most emojis to me have all the class of drowning your entire meal in ranch dressing. There are a very small handful of exceptions. But that’s just my lame opinion.
And of the ones I do find theoretically useful, I’m always hesitant to use them, because emoji rendering is platform-specific. They’re not quite like text, where the glyphs are entirely utilitarian and the typeface they’re written in conveys little to no information. With emojis, the subtleties pile up. A thinking emoji rendered on a Windows PC isn’t quite the same as a thinking emoji on an iPhone, or on various kinds of Android phones. Unless I’m on a platform like Twitter or Discord that forces all clients to use a single emoji set, I can never confidently send a precise emotion with an emoji.
Platforms like Discord that let you create your own emojis, instead of using the comparatively sterile, corporate-approved, general-purpose set provided in standard Unicode, are another story. I like those and use them extensively. If Lemmy natively supported a Discord-esque system where instances or communities could define custom emojis that didn’t rely on custom clients, plugins, or instance-specific rendering hacks, I’d use them all the time. Though this would, I presume, be to the extreme chagrin of many.
Choice is an irritating speed bump to people who don’t care to choose, which unfortunately is most of them.