I am so tired of people, especially people who pretend to be computer experts online, completely failing to understand what Moore’s Law is.
Moore’s Law != “Technology improves over time”
It’s an observation that semiconductor transistor density roughly doubles every ~2 years. That’s it. It doesn’t apply to anything else.
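If a concrete form helps, the literal claim fits in a couple of lines. Here's a toy sketch (mine, not Moore's; normalized so the starting density is 1, since the exact baseline doesn't matter):

```python
# Moore's Law, stated literally: transistor density doubles roughly every ~2 years.

def relative_density(years: float, doubling_period: float = 2.0) -> float:
    """Density relative to the starting point, with one doubling per period."""
    return 2 ** (years / doubling_period)

for years in (0, 2, 10, 20):
    print(f"after {years:2d} years: {relative_density(years):,.0f}x the starting density")
```

Nothing in there about hard drives, CPUs in general, or "technology".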
And also for the record, Moore’s Law has been dead for a long time now. Getting large transistor density improvements is hard.
I’m gonna go on “no stupid questions” and ask why my old hard drives aren’t doubling in size.
You need to properly feed, water and fertilize them. If you don’t do this, your old hard drives will just waste away until they’re just a few megabytes, not flourish into giant petabyte trees.
Did you try evolving them?
You have to walk around in the right environment otherwise they’re all going to turn into generic eevees, and you don’t want that
Sure, but also no.
Moore’s Law is, at the most fundamental level, an observation about the exponential curve of technological progress.
It was originally about semiconductor transistors, and that is what Moore was specifically looking at, but the observed pattern does 100% apply to other things.
In modern usage, the way language is used and perceived determines its meaning, not its origins.
Moore’s Law is, at the most fundamental level, an observation about the exponential curve of technological progress.
No. Let me reiterate:
Moore’s Law was an observation that semiconductor transistor density roughly doubles every ~2 years.
It is not about technological progress in general. That’s just how the term gets incorrectly applied by a small subset of people online who want to sound like they’re being technical.
Moore’s Law is what I described above. It is not “technology gets better”.
I meant that sentence quite literally; a semiconductor is technology. My perspective is that the original “Moore’s Law” is only a single example of what many people will understand when they hear the term in a modern context.
At some point we’re debating semantics, and those are subjective, local, and sometimes cultural. Preferably I avoid spending energy fighting over such things.
Instead I’ll provide my own line of thinking toward what is, for me, a valid use of the term outside semiconductors. I am open to suggestions if there is better language.
From my own understanding I observe a pattern where technology (mostly digital technology, but this could be exposure bias) improves at an increasingly fast rate. The mathematical term is exponential.
To me, seeing such a pattern is vital to understanding what’s going on. Humans are not designed to extrapolate exponential curves. A good example is AI, which largely still sucks today, but the historical numbers don’t lie about the potential.
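To illustrate what I mean, here’s a toy comparison (numbers entirely made up for illustration):

```python
# Linear vs. exponential growth over 30 steps.
linear, exponential = 1, 1
for _ in range(30):
    linear += 1        # grows by a fixed amount each step
    exponential *= 2   # doubles each step

print(f"after 30 steps: linear = {linear}, exponential = {exponential:,}")
# -> after 30 steps: linear = 31, exponential = 1,073,741,824
```

Our intuition handles the first number fine; the second is the kind we consistently underestimate.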
I have a rather convoluted way of speaking; it’s very impractical.
Language, at best, should just get the message across in an effective manner.
I invoke (reference) Moore’s Law to refer to the observation of exponential progress. Usually this gets my point across very effectively (not that it comes up often in my everyday life).
To me, Moore’s Law in semiconductors is the first and original example of the pattern. The fact that this interpretation is subjective has never been relevant to getting my point across.
but the observed pattern does 100% apply to other things.
Sure, if you retroactively go back and look for patterns where it matches something, but that isn’t a very useful exercise.
In modern usage, the way language is used and perceived determines its meaning, not its origins.
This is technically correct but misleading in this context, given that it falsely implies that the original meaning (doubling transistor density every 2y) became obsolete. It did not. Please take context into account. Please.
Furthermore, you’re missing the point. The other comment is not just picking on words, but highlighting that people invoke “it’s Moore’s Law” to babble inane predictions about the future. That’s doubly true when people assume (i.e. make shit up) that “doubling every 2y” applies to other things, and/or that it’s predictive in nature instead of just observational. Cue the OP.
Please take context into account. Please.
(this is a lil’ lemmy thread and I think everyone understands what OP had in mind)
In modern usage, the way language is used and perceived determines its meaning, not its origins.
So we should start calling monitors computers, desktop towers modems (or CPUs (or hard drives)), wifi the internet, browsers search engines, and search engines browsers. None of this is incorrect, according to the average person.
Do you have to archive all the porn on the Internet?
“We do these things not because they are easy. But because we are hard!” -JFK
Did I just pave the way to the greatest joke today?
With how often videos get removed or set to private? Yes.
Moore’s law is about circuit density, not about storage, so the premise is invalidated in the first place.
There is research being done into 5D storage crystals, where a disc can theoretically hold up to 360TB of data, but don’t hold your breath about them being available soon.
This is true, but…
Moore’s Law can be thought of as an observation about the exponential growth of technology power per $ over time. So yeah, not Moore’s Law, but something like it that ordinary people can see evolving right in front of their eyes.
So a $40 Raspberry Pi today runs benchmarks 4.76 times faster than a multimillion-dollar Cray supercomputer from 1978. Is that Moore’s Law? No, but the bang/$ curve probably looks similar to it over those four decades.
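If anyone wants to sanity-check that, here’s the back-of-the-envelope version. The ~$8M Cray-1 price is my assumption (it’s the commonly cited ballpark); the speedup and Pi price come from the numbers above:

```python
import math

cray_price, pi_price = 8_000_000, 40   # assumed Cray-1 price; actual Pi price
speedup, years = 4.76, 40              # benchmark ratio from above; ~1978 to the Pi era

gain = speedup * (cray_price / pi_price)   # bang-per-dollar improvement
doublings = math.log2(gain)
print(f"~{gain:,.0f}x gain -> one doubling every ~{years / doublings:.1f} years")
```

That works out to roughly one doubling every two years, which is why the curve looks so familiar.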
You can see a similar curve when you look at data transmission speed and volume per $ over the same time span.
And then there’s storage: going from 5 1/4" floppy disks, or effing cassette drives, on the earliest home computers, or the round tapes we used to cart around when I started working in the ’80s, which had a capacity of around 64KB, to microSD cards with multi-terabyte capacity today.
Same curve.
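Same arithmetic, with round numbers of my own choosing (~64KB then, ~1TB now, ~40 years apart):

```python
import math

old_bytes = 64 * 1024   # ~64KB round tape
new_bytes = 1024**4     # ~1TB microSD card
years = 40

doublings = math.log2(new_bytes / old_bytes)   # 24 doublings
print(f"{doublings:.0f} doublings in {years} years -> one every ~{years / doublings:.1f} years")
```

About one doubling every two-ish years again, without a transistor count in sight.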
Does anybody care whether the storage is a tape, or a platter, or 8 platters, or circuitry? Not for this purpose.
The implication of “That’s not Moore’s Law” is that the observation isn’t valid. Which is BS. Everyone understands that the true wonderment is how your bang/$ goes up exponentially over time.
Even if you’re technical, you have to understand that this factor drives the applications.
Why aren’t we all still walking around with Sony Walkmans? Because small, cheap hard drives enabled the iPod. Why aren’t we all still walking around with iPods? Because cheap data volume and speed enabled streaming services.
While none of this involves counting transistors per inch on a chip, it’s actually more important/interesting than Moore’s Law, because it speaks to how the power of the technology available for everyday uses is exploding over time.