CarlCopplepot
your childhood PCs would not have been possible without his contributions
thank you Gordon, for all the fun and games we continue to have with our PCs due to your inventions
he predicted that the performance of computers would roughly double each year
mycatwontletmeclosemybedroomdoor
So will OP's mom..
FeelingButter
It wasn't so much a prediction as it was an observation.
OneUniverse
I like how he wanted to be a teacher, but couldn't find a job.
Voric13
Crazy facts…
AllTheGoodOnesWereGone
And now we have AI which is doubling in capability approximately every eight weeks. Hang on tight!
PenguinNamedWobbles
I'll always remember him from that Billy Idol song. "In the midnight hour, she cried Moore Moore Moore."
Geo80
He looks like the bad guy from Robocop.
AllGloryToTheDarkLord
Correction: He predicted computers would become 2x as powerful, half the size, and half the cost every 18 months, and he's been on track every 18 months
AllGloryToTheDarkLord
We're now at the cusp of contact-lens AR technology. https://bigthink.com/the-future/augmented-reality-ar-milestone-wearable-contacts/
darkripjaww
So, Moore's Law is dead then?
shastaRed
Lol, Intel is basically dead, so just about
shock32638
Moore's law can actually be extended in the other direction: if you extend it back to the Atanasoff-Berry Computer in 1942, it still holds up.
phototraveler
What in the hell is that graph?? Are both axes years? How does that work?
somnif
It's a graph of years per year, seems fairly straight forward really.
Wombat2000
Per the chart: 'Intel ignites the trend of personal computing'. I seem to remember that credit going to the Apple Macintosh, and for good reason.
TheMooreYouKnow
There can be only one. No Moore.
tankjr85
Yes, but it was Federico Faggin who designed the first microprocessor, the Intel 4004; his initials, F.F., can be found on the chip's die.
FromTheBeyond
He and Intel co-founder Robert Noyce, the actual inventor of the integrated circuit, made Silicon Valley a thing
Snotspill321
Read Chip War. Pretty good rundown
meganical
What’s with the mannequins in the top article? Was he into some weird stuff?
pleaseconsiderthatImightbejoking
I think they're sets of different computer labs
NuclearMonk
He was an avid Real Doll collector
belongsinamuseum
What good is a computer if you can’t fuck it?!?
DarkSock
He was Mannequin Thigh-Stalker, aka Dark Veiner
DarkSock
I'm somewhat disappointed anybody upvoted this… It was terrible…
TK421isAFK
Intel's co-founder inspired Noyce's Law, which predicted the dramatic increases of his last name following a South Park episode.
DeWillpower
RIP
AmbroseWolfinger
But he didn't invent Cole's Law, which is shredded cabbage salad.
LAMovieDesign
Have your fucking +1. You’ve earned it.
DragonBjorne
I mean…. when you’re right, you’re right.
EccentricNimoy
Mayo have a good day, sir!
VincitQuiSeVincit
Moore's the pity
forResearchPurposesOnly
Clever dude. That's the law where you post the wrong thing to get the right answer.
definitelynotafae
Moore's law was BS that Intel hid behind to launch their processors in a specific way. Worked until they hit 10nm.
djangojazz
I am a developer but only marginally adept at hardware. I thought that due to limitations of silicon wafer tech, we were finally reaching a plateau till we could make quantum computing and other smaller alternatives more cost effective. Don't know, but would like to know. On the last three machines I built it was more about faster RAM, fast M.2 NVMe SSDs, and GPUs with super fast RAM and clock speeds.
Syrpynt
Quantum computing is achievable only near absolute zero (0 K). No one is getting this hardware small and affordable enough for retail consumers, especially when it goes directly against efficiency, environmentally-friendly needs, and a growing first-world consumer population. Quantum computing may be possible for businesses and government entities, but the cost and downsides are too great for making it consumer-grade.
michaeloberg
NO - he predicted the *surface density of logic gates* would double. Not performance
suckstoyourauntie
I've always been fascinated by graphene, and wondered how/when it might be used for things like this. But, I'm just a pleb, I have no idea.
DarkSock
The surface density of your left nostril will double every two years
Atomic2
And every 2 years, not every year. So roughly 1.4x each year.
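[Editor's note: the "roughly 1.4x each year" arithmetic above can be sanity-checked in a couple of lines of Python. The function names below are just for illustration, and the 4004's ~2,300-transistor count is used as a sample starting point.]

```python
def annual_factor(growth: float, period_years: float) -> float:
    """Per-year growth factor for a `growth`x increase every `period_years` years."""
    return growth ** (1.0 / period_years)

def transistors_after(start: float, years: float,
                      growth: float = 2.0, period_years: float = 2.0) -> float:
    """Project a transistor count `years` out, doubling every `period_years`."""
    return start * growth ** (years / period_years)

print(round(annual_factor(2.0, 2.0), 3))  # 1.414 -- i.e. "roughly 1.4x" per year
print(transistors_after(2_300, 10))       # 73600.0 -- 5 doublings of ~2,300 transistors
```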
somnif
His initial 1965 paper said every year; he revised it later on.
Spidey209
But common culture extended this to other computer metrics, such as storage space, the number of triangles a gfx card could render, etc.
viila
And to this day we have roughly continued doubling the number of gates, but have long since fallen off the performance curve.
michaeloberg
Exactly - it's why we see 12+ cores (what else are you going to do with all that space?) but a peak of around 4-5 GHz. We need a different substrate.
bhobby1212
Now comes quantum entanglement
YouMayFindThisMildlyInteresting
I believe it was an observation not a prediction.
AllTheGoodOnesWereGone
Because no sane person would have believed that what was about to happen could actually keep happening for the next 50 years.
michaeloberg
Not true: physicists knew the limits of silicon even then, so they could project out to those limits. The question was whether we could ever get there. We did.
BaloneyBob
He’ll be twice as dead next year.
WendyTheWendigo
🏆
geotard
I thought he died exponentially?
Euchre
Dammit, beaten to it!
Smayds
Or, he'll be the same amount of dead, at half the price.
Bonals
Winner.
TallynNyntyg
No, it was updated - he'll be twice as dead in *two* years. Read the flippin' graph!
TallynNyntyg
Exactly!
aShogunNamedMarcus
This comment will be just as savage
Sonicschilidogs
Just think of how dead he'll be in 30 years
DarkSock
so ded
PocketCleric
2^30 dead
DarkSock
🎶🎵...in the midnight hour, she cried Moore Moore Moore...🎵🎶
TK421isAFK
And in the midnight hour 2 years from now, she'll cry Moore Moore Moore Moore Moore Moore.
fnxweb
It apparently paused for a few years ~2015 while they changed fabrication techniques but is back (now doubling every two years).
Edowyth
It's kind of being cheated nowadays with multiple cores instead of pushing it all on one core... And we're getting really close to the limit
Edowyth
with our current designs: electrons kind of just hop around if you place the "wires" much closer than they are currently. So, even if you
Edowyth
allow for multi-core and 3d placement of transistors, we're still approaching the end of it. Brilliant projection, but it is ending.