Moore’s Law Is Ending… So, What’s Next?

Remember when cellphones looked like this? You could call, text, maybe play Snake on it … and it had about 6 megabytes of memory, which was a small miracle at the time. Then phones got faster, and about every two years you probably upgraded your phone from 8 gigs to 16 to 32, and so on and so forth. This incremental technological progress we've all been participating in for years hinges on one key trend, called Moore's Law. Intel co-founder Gordon Moore predicted in 1965 that integrated circuits, or chips, were the path to cheaper electronics. Moore's Law states that the number of transistors (the tiny switches that control the flow of electrical current) that can fit on an integrated circuit will double every two years, while the cost will halve. Chip power goes up as cost goes down. That exponential growth has brought massive advances in computing power, hence the tiny computers in our pockets! A single chip today can contain billions of transistors, and each transistor is about 14 nanometers across. That's smaller than most human viruses!

Now, Moore's Law isn't a law of physics; it's just a good hunch that has driven companies to make better chips. But experts claim this trend is slowing down. Granddaddy chip maker Intel recently disclosed that it's becoming more difficult to roll out smaller transistors on a two-year timeframe while also keeping them affordable. So, to power the next wave of electronics, there are a few promising options in the works. One is quantum computing. Another, currently in the lab stage, is neuromorphic computing: computer chips modeled after our own brains! They're basically capable of learning and remembering at the same time, at an incredibly fast clip.

Let's break that down, starting with the human brain. Your brain has billions of neurons, each of which forms synapses, or connections, with other neurons. Synaptic activity relies on ion channels, which control the flow of ions such as sodium and calcium, and that flow lets your brain function and process properly. A neuromorphic chip copies that model by relying on a densely connected web of transistors that mimic the activity of ion channels. Each chip has a network of cores, with inputs and outputs wired to additional cores, all operating in conjunction with one another. Because of this connectivity, neuromorphic chips are able to integrate memory, computation, and communication all together.

These chips are an entirely new computational design. Standard chips today are built on the von Neumann architecture, where the processor and memory are separate and data moves between them. A central processing unit runs commands fetched from memory to execute tasks. This is what's made computers very good at computing, but not as efficient as they could be. Neuromorphic chips, however, completely change that model by having both storage and processing connected within these "neurons" that are all communicating and learning together.

The hope is that these neuromorphic chips could transform computers from general-purpose calculators into machines that can learn from experience and make decisions. We'd leap to a future where computers wouldn't just be able to crunch data at breakneck speeds but could do that AND process sensory data in real time. Some future applications of neuromorphic chips might include combat robots that could decide how to act in the field, drones that could detect changes in the environment, and your car taking you to a drive-through for ice cream after you've been dumped … basically, these chips could power our future robot overlords. We don't have machines with sophisticated, brain-like chips yet, but they're on the horizon. So get ready for a whole new meaning of the term "brain power."

But we have something less frightening than AI to share with you … did you know we have a sister channel called Seeker VR? It's everything you love about Seeker, but in 360 degrees! Seeker VR will take you on some incredible journeys that you probably wouldn't get to experience otherwise. In a recent episode, we took a ride on one of the most deadly trains in the world. Check it out here. Want to learn more about how the fastest computers in the world work? We've got a video about them here. And am I the only one who misses my Motorola Razr? Let us know in the comments, and check back here for more videos.
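The doubling arithmetic behind Moore's Law is easy to sanity-check yourself. Here's a minimal back-of-the-envelope sketch in Python; the 1971 baseline of 2,300 transistors (Intel's 4004) is our own illustrative assumption, not a figure from the video, and real chips only roughly tracked this curve:

```python
# Back-of-the-envelope Moore's Law projection: transistor count
# doubles every two years from an assumed 1971 baseline.

def transistors(year, base_year=1971, base_count=2300):
    """Projected transistor count if doubling every 2 years."""
    doublings = (year - base_year) // 2
    return base_count * 2 ** doublings

for year in (1971, 1991, 2011):
    print(year, f"{transistors(year):,}")
```

By 2011 the naive projection already passes two billion transistors per chip, which lines up with the "billions of transistors" figure in the transcript.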

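The neuromorphic idea of keeping memory and computation in the same place can be sketched in software with a classic leaky integrate-and-fire neuron. This toy model is our own illustration, not any real chip's design; the threshold and leak parameters are arbitrary:

```python
# Toy leaky integrate-and-fire neuron. The membrane potential is
# the neuron's "memory," stored right next to the computation that
# updates it, loosely mirroring how neuromorphic cores co-locate
# memory and processing. All parameters are illustrative.

class LIFNeuron:
    def __init__(self, threshold=1.0, leak=0.9):
        self.potential = 0.0      # accumulated state ("memory")
        self.threshold = threshold
        self.leak = leak          # fraction of potential retained each step

    def step(self, input_current):
        """Leak a little, integrate input, fire if over threshold."""
        self.potential = self.potential * self.leak + input_current
        if self.potential >= self.threshold:
            self.potential = 0.0  # reset after spiking
            return True           # spike!
        return False

neuron = LIFNeuron()
spikes = [neuron.step(0.3) for _ in range(10)]
print(spikes)
```

Feeding a constant input, the neuron charges up over a few steps, fires, resets, and repeats: computation and state never have to shuttle across a memory bus, which is the contrast with von Neumann designs the video draws.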
100 thoughts on “Moore’s Law Is Ending… So, What’s Next?”

  • Lenny M Post author

360 means you are going in the same direction. Example: you are moving north, you stop and turn 360 degrees, you are still heading north; now if you turn 180 degrees, you are facing south, the complete opposite…

  • dennis Post author

Optical transistors??? … an optical transputer with massive parallelism

  • Naixin Zong Post author

    Hey that's GPU!

  • Humble Armidillo Post author

    3:17 I got lost I’m going to sleep

  • Shädbase Post author

In the future the internet will be obsolete and people will use something called the neuro-net, where all brains will be linked together

  • cracking pirates Post author

Like vacuum tubes, transistors, and ICs have gone, I now think there will be a technology that could replace all the stuff we use today.

  • L. C. Post author

Totally miss my Moto Razr.


  • SiggyPony Post author

I miss my Nokia 6120c 🙁 Only phone I ever bought new and paid over $300 for lol

  • Svent Istvan Post author

And yes, I loved my BRICK…… it was a mobile phone!…… you actually talked to a fellow person…..

  • mel saint Post author

Our technology is already at a mature stage, which is why exponential growth is hard. Just like companies: once they're large cap, there's limited room for growth compared to small cap. VR is just trash. My older sister says the 3D glasses make her nauseous. AR doesn't make sense either. Tech companies can't think of anything new because it's all saturated.


  • Fillip Hulles Post author

    I see where this is going. The video went from Moore's Law to computer chips designed after the human brain. As long as you're not putting them inside of humans and changing their minds. But self learning and correcting computation is a great idea for the future.

  • Nick Martin Post author

We already have them; each phone can be turned into a neuron

  • ozzynomicon Post author

    I want ur shirt bro haha. That is awesome

  • ALGORITHM KID Post author

    1:58 it's sodium Na+ and potassium K+ ions not calcium 😑😫😁

  • Alvin Lam Post author

    You pronounced Von Neumann wrong.

  • Dark Defender Post author

    Really cool!

  • Marshall Eastwood Post author

    Is this guy from the tv show called new girl?

  • Cod4 Wii Post author

    Memristors and Quantum Computers


  • valor0dragoon Post author

    So kinda like Data's brain from Star Trek?

  • Peter Egan Post author

    I just watched Terminator 2 last night. Hope this chip ends up the same rn.


  • Ivan Guerra Post author

Why do they insist on improving an artificial brain instead of improving our natural brain?

  • Ivan Guerra Post author

We don't need more war machines, we need to stop war. We don't need more drones scaring our birds, we need to plant more trees…

  • Felipe Forlin Post author

    First thing that arrives as a subject of the new technology: combat robots. Human kind: creates all sorts of amazing stuff, just to fight with each other over things that really don't matter.

  • 420hapster Post author

    Law of exponential growth…. Another law.

  • Ahmet Öztürk Post author

Moore's law will still be there till 2030 because tech companies will try to screw us

  • MintArcade Post author

Moore's law is only slowing down at Intel; other manufacturers are already working on 7 nm, while Intel is still struggling with 10 nm…

  • Higgs Boson Post author

When a machine learns how to disobey its own algorithms, it will be the end of the world.

  • Paul Frederick Post author

    I never owned a cell phone in my life. But let me know when they're done and I might pick one up then.

  • Raven Post author

    I'm waiting for my car to be my best friend.

  • Maynard Hermes Post author

It's already happening, bud. The best example of an AI processor TODAY is the HiSilicon Kirin chipset line-up from Huawei.

  • Moe K Post author

What is going to happen to products like Microsoft Windows if computers stop getting faster? In other words, Microsoft can't just keep making Windows bigger and bigger, otherwise it would run too slow. For that matter, what is going to happen to the entire tech industry? If computers don't get faster, how are those companies going to advertise their products?

  • Nikita Wahl Post author

    the puns are REEEEEEAAAAAAL!!!!!

  • thegamingbulldog 17 Post author

    How fitting my video started at 144p

  • daichai Post author

    i miss the moto razer flip fone cause it had a game called "bejeweled twist" and for whatever reason they never ported it over to smart fones. so mad

  • Jerry J. Post author

    Programming today's advancement of AI technology to these neuromorphic chips will dramatically improve the robotics industry, and I find that kinda scary

  • Joel Robert Justiawan Post author

The true machine-learning chip.
Perfect for building androids like Detroit does.

  • Ara Stu Post author

    Why do i feel like people only work on Quantum Computers to build better weapons

  • plutoburn Post author

    skynet pls.

  • Otto Ernst Post author

I don't know if it's ending just yet; AMD isn't having that problem yet

  • fuk0ff PACY Post author

    1:20 intel says so, well if you take a look at amd they're already at 7nm while intel stuck on 14nm or 10nm idk

  • Josiah Woodson Post author

    …so, basically Skynet…

  • good guy Post author

Just because Intel wants to continue milking their 10-year-old Core i CPUs doesn't mean Moore's law is ending -_-

  • a novice Post author

    that was educational.. thanks a lot seeker

  • Joshua Zincke Post author

    While it would not work for smartphones, has anyone ever considered simply making a processor Bigger. Bigger means more space for transistors.

  • ThatGuyRed Post author

    Finally, those "this super computers are gonna be the size of a phone" comments are gonna die

  • A Mohamed Nazel Post author

    cyberdyne or Skynet is coming…mmmmmmmmmm

  • lXBlackWolfXl Post author

Computational power has increased solely for the gaming industry. You don't need a superpowerful computer for most practical applications. Office programs don't need super advanced computers. Microcontrollers, which run automated systems, also don't need to be super advanced (unless you want them to respond to a real-world environment anyway).

Also, all the funding going into computer development has come from the gaming industry, because they're the only ones who actually benefit from it. Now they're no longer doing that, because the advances in computers are driving their industry obsolete. Why buy a specialized console when you can get a computer that can play just as advanced games and, on top of that, do so much more? As Nintendo said some time around when the Switch was announced, people today are walking around with insanely powerful devices in their pockets, and that's posing a huge problem for them.

Besides, Moore's law can't continue on forever. There IS a limit to how small things can get, and transistors are as small as they can reasonably get (barring a major technological breakthrough). Right now what the IT industry is focusing on more is optimizing their code. We're still relying on code in our operating systems that was made decades ago. Essentially they've just been adding on to the previous system over and over again, so what we have isn't exactly optimized. It's believed that we aren't even using the machines we have now to their current capacity. What this means is that further advancement isn't really necessary; what our computers can do will continue to change even if the hardware stays the same, and nobody knows when software will stop advancing.

So why invest in better hardware that's driving industries obsolete and that we're not fully utilizing anyway? In fact, the console gaming industry isn't planning on releasing any new consoles in the foreseeable future, because they honestly see no reason to believe that their current hardware will be replaced any time soon. We may see new operating systems on them, but they plan to just keep producing the same machines until they see a reason to upgrade the hardware, which they don't currently see.

  • RockManDo KeeperOfTheStones Post author

Antimony can have a dual crystalline state (silicon is mono). Using an antimony wafer would allow transistors to be simultaneously 1, -1, or +1 and 0, -0, +0, including half or combined partial states of the crystal axis, allowing at least 5x the performance of current silicon using current architecture. One that exploits partial or multiple-state transistors could easily ride past 12 GHz without much effort; however, even reaching 10 GHz will require a different material for CPU and GPU dies. Antimony is already under some investigation for use as ultra-high-speed RAM. Nasty stuff though: mining by-products include antimony turning into a cyanide-like substance in water, and cyanide and many other toxic processes are required just to get raw crystalline antimony. Processing and refining it to sub-micron sizes suitable for wafer use would be immensely polluting, but also revolutionary. Goodbye silicon age, hello multistate antimony processors… virtually no heat produced by switching current, no signal loss or follow-through. The benefits are endless, and reliance on silicon's own properties may be the limiting factor of current processors… MAP tm Me 😁

  • wellwellwellall Post author

    that really destroyed my interest

  • Vital Mendoza Post author

So why don't they make bigger CPUs? More space, more transistors.

  • Aonoymous Andy Post author

No such thing as Moore's law, it is only a trend, but AMD just made a 7 nm transistor on their 3rd gen Ryzen CPUs

  • Joseph .S Post author

    Apple broke Moore’s Law

  • Kennedy 123 Post author

    I turned the volume to the highest so my family would think I’m smart

  • Jake Martins Post author

    Will neuromorphic chips get mental illnesses?

  • Rodger Myles Post author

    Ninety five percent of this we already know. The Hope is!? Experts say? So who is giving the talk?

  • dustin buck Post author

    That's not new. We have had that shit since the 60s. It's just recently we can make them cost effective.

  • BlackMamba Mwangi Post author

    The second one after Quantum computing is how Cortana from Halo was modeled. SO IM ALL FOR THAT YERRRR🗣🗣🗣🗣

    Edit: wait I just realized she went a lil loco……nvm

  • Michael Hill Post author

    Mike's Law! Called it. It's mine now.

  • _true_ _APEX_ Post author

    This gave me a binary boner

  • 0274. door Post author

    Open the door

  • Daring Endrek Post author

    I’m more interested in Photonic chips…

  • John Dripper Post author

    Just lemma know when its ready and I'll stamp my name on it lmfao

  • Christopher Evans Post author

    Stacked cpu cores by 2023

  • James Newcombe Post author

    insert minecraft theme

  • Zhao Yun Post author

    "Charged atoms" please stop youtube, NO MORE CHARGED ATOMS PLEASE! use ions.

  • red nasorwerd Post author

Where did you get that life-alteringly awesome shirt???!?!

  • Mohit Tiwari Post author

    Your t shirt is awesome 😍

  • mightwilder Post author

Complete rubbish about replacing current chips…

  • Colin Kirk Post author

    Combat robots with intelligence. I wish you guys wouldn’t use this in a sentence with otherwise decent, helpful applications. If someday these things exist, it will because we were told it was necessary and commonplace.

  • Mike Mondano Post author

    After Moore comes Dalton, and then Brosnan's Law.

  • Grand Priest Goku 神道悟空 Post author

    these new computers are cool,


  • Movie - holic Post author

Snap system, umm, where have I heard it 🙄🙄
Yeah, Shuri to Banner in Black Panther

  • Above 1 Post author

    Put it in a robot !!! I volunteer !!!!!!!! Dude this is one of my plans, but without a chip and conscious transfer #Immortals that's where our investment should be delivered to it only makes sense. We need to hurry, at this pace I feel like the company would probably take it slow instead of making the leap, like most companies rather make money first and then pass the good stuff but of course they have to to succeed. I just hope that company when it shows up takes the leap instead of making humanity hold on.

  • Joshua Silva Post author

    I liked it just for his shirt

  • Phoenix Smith Post author

Video cards don't have a problem improving, and many have 384 to 512-bit buses with HBM or GDDR5 RAM and MANY more cores! Not to mention Intel TeraHertz, demonstrated in 2001 running at 1000 GHz with a demo chip at 100 GHz. So if we could do 100 and 1000 GHz in 2001, why are we stuck at 4.5 GHz? DARPA and Motorola have also demonstrated terahertz semiconductor chips running at 1000 GHz and FASTER. Not to mention 5G coming out running at 60 GHz and 94 GHz. The same for the network speed of 1 Gb that is still not the standard when 10 Gb has been out since 2002. And why have RAM prices almost tripled since 2012? It isn't Moore's Law, since we already have working devices, and the Intel tech ran at LOWER power, lower voltage, with 10,000 times less leakage! Until AMD started making the newer chips we were stuck with only 4 cores too, and CPU core speed has barely improved since 2007, 12 years ago! A new chip is maybe 38% faster than a 2007 4-core chip. Yet video cards have greatly increased in power even without a higher clock speed, and Nvidia shows testing of 10 GHz electronics in its videos.

  • Steven Madara Post author

    Whyd they kill the water molecule into wine molecule guy again????

  • Alex Bordon Post author

Please don't ruin this technology with "combat robots" and more stupid war toys. This technology is based on communication and the culture of information; we should learn from technology and evolution, and not use it to ruin our communication. Let's use that extension of human evolution to be more united and have more collective consciousness, not the collective unconsciousness that governments offer us with wars and that shit

  • Cyan Redstoneer Post author

    We can run matrix if it happens

  • Bobbito Chicon Post author

🔥 Next is the Quantum Age, which will lead to AGI

  • Ninja Gizzmo Post author

    I wonder if they are working on connecting that chip to someone’s brain to improve brain power. It could use the static electricity or your body heat to power it. Cyborgs on the horizon lol

  • Mars Bars Post author

    My Dicks Law.

  • Chen Lee Post author

    This is not a real law. Ever since this video, it has flatlined.

  • Tom Meyers Post author

    After Moore's law comes Slappy's law. Third world buffoons breeding on Welfare double every eight months.

  • Alexander Thompson Post author

    can he stand still?

  • ventisca Post author

    Phone was getting faster and smaller until we realized we could watch porn on phone.

  • Cat Poke Post author

    Could my computer become a friend that can interact with and learn with me?

  • Professor Toast Post author

    Moore's law is ending…..Milo Murphy's law is coming

  • Milo Bostrom Post author

    when you are artificially creating a human brain and you accidentally create an artificial human conscious that thinks exactly like yourself, essentially creating a clone of yourself

  • David Rodriguez Post author

    I just drank a 40oz of Old English
    And now I'm here!!!

  • BattousaiHBr Post author

    von "newmon"?

  • Krusty The Clown Post author

Don't see the half price every two years…. RTX cards are still expensive.

  • Steven Madara Post author

    We went from snake to slither


  • Michael Wilson Post author

    Where is my implant

  • Miguel Pereira Post author

    No mention of Photonic computing at all?
