Moore’s Law and The Secret World Of Ones And Zeroes


Behold! The transistor, a tiny switch about
the size of a virus that can control the flow of a small electrical current. It’s one of
the most important inventions ever because when it’s on, it’s on and when it’s off, it’s
off. Sounds simple. Probably too simple. But this “either/or” situation is incredibly useful
because it is a binary system. On or off, yes or no, one or zero. But with enough transistors
working together we can create limitless combinations of “ons” and “offs”, “ones” and “zeros” to
make a code that can store and process just about any kind of information you can imagine. That’s how your computer computes, and it’s
how you’re watching me right now. It’s all because those tiny transistors can be organized,
or integrated, into integrated circuits, also known as microchips or microprocessors, which
can orchestrate the operation of millions of transistors at once. And until pretty recently,
the only limitation to how fast and smart our computers could get was how many transistors
we could pack onto a microchip. Back in 1965, Gordon Moore, co-founder of
the Intel Corporation, predicted that the number of transistors that could fit on a
microchip would double every two years. So essentially every two years computers would
become twice as powerful. This is known in the tech industry as Moore’s Law, and for
forty years it was pretty accurate; we went from chips with about 2,300 transistors in
1972, to chips with about 300 million transistors by 2006. But over the last ten years we’ve fallen behind
the exponential growth that Moore predicted. The processors coming off assembly lines now
have about a billion transistors, which is a really big number, but if we were keeping
up with Moore’s Law, we’d be up to four or five billion by now. So why is the trend slowing down? How can
we get more transistors onto a chip? Are there entirely different technologies we could be
using instead, ones that pose no limitations? And how do billions of little on/off switches
turn into movies and music and YouTube videos about science that display on a glowing, magical
box? Spoilers: it’s not magic; it’s science. [SciShow intro music] To understand the device that you’re using
right now as well as the challenges computer science is facing, and what the future of
computing might look like, you have to start small with that transistor. A transistor is
essentially a little gate that can be opened or shut with electricity to control the flow
of electrons between two channels made of silicon, which are separated by a little gap.
They’re made of silicon because silicon is a natural semiconductor. It can be modified
to conduct electricity really well in some conditions or not at all in other conditions.
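That controllable on/off behavior is worth pinning down before going further. Here's a toy sketch in Python (purely illustrative, not real device physics) treating a transistor as a switch that a gate charge opens or closes:

```python
# Toy model of a transistor: a gate charge either lets current flow or doesn't.
class Transistor:
    def __init__(self):
        self.gate_charged = False   # no positive charge on the gate yet

    def apply_charge(self):
        self.gate_charged = True    # electrons bridge the gap: "on"

    def remove_charge(self):
        self.gate_charged = False   # electrons retreat: "off"

    @property
    def bit(self):
        return 1 if self.gate_charged else 0

t = Transistor()
print(t.bit)        # 0 -- off, no current
t.apply_charge()
print(t.bit)        # 1 -- on, current flows
```

The class names and methods here are made up for the sketch; the point is just that the whole device boils down to one boolean.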
In its pure state, silicon forms really nice, regular crystals. Each atom has four electrons
in its outer shell that are bonded with the silicon atoms around it. This arrangement
makes it an excellent insulator. It doesn’t conduct electricity very well because all
of its electrons are spoken for. But you can make that crystalline silicon conduct electricity
really well if you dope it. You know, doping, when you inject one substance into another
substance to give it powerful properties, like what Lance Armstrong did to win the
Tour de France seven times, only instead of super-powered tiger blood or whatever, the
silicon is doped with another element like phosphorus, which has five electrons in its
outer shell; or boron, which has three. If you inject these into pure crystal silicon,
suddenly you have extra unbonded electrons that can move around, and jump across the
gap between the two strips of silicon. But they’re not gonna do that without a little
kick. When you apply a positive electrical charge to a transistor, that positive charge
will attract those electrons, which are negative, out of both silicon strips, drawing them
into the gap between them. When enough electrons are gathered, they turn into a current. Remove
the positive charge, and the electrons zip back into their places, leaving the gap empty.
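Eight of those open-or-shut states are already enough to play with. In this sketch each switch is reduced to a plain 0 or 1, previewing the byte arithmetic coming up:

```python
switches = [0, 1, 0, 0, 0, 0, 1, 1]   # eight transistors: off, on, off, off, ...

# With 8 two-state switches there are 2^8 distinct configurations.
print(2 ** len(switches))             # 256

bits = "".join(str(s) for s in switches)
print(bits)                           # 01000011
print(int(bits, 2))                   # 67 -- the example byte used below
```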
Thus the transistor has two modes: on and off, one and zero. All the information your computer is using
right now is represented by sequences of open and shut transistors. So, how does a bunch
of ones and zeroes turn into me talking to you on your screen right now? Let’s just imagine
eight transistors hooked up together. I say 8 because one byte of information is made
out of 8 bits, that’s 8 on or off switches, that’s the basic unit of a single piece of
information inside your computer. Now the total number of possible on/off configurations
for those 8 transistors is 256. That means 256 combinations of ones and zeroes in that
8-bit sequence. So let’s say our 8-transistor microchip is given this byte of data; that’s
the number 67 in binary by the way. Okay, so what now? The cool thing about binary data is that the
same string of ones and zeroes can mean totally different things depending on where it’s sent. Different parts of your computer use different
decoding keys to read the binary code. So if our teeny tiny little 8-transistor microchip
kicks that byte over to our graphics card, our graphics card will interpret it as one
of 256 colors. Whichever one is coded as number 67. But if that same byte is sent over to
our sound card, it might interpret it as one of 256 different spots mapped on to a sound
wave. Each spot has its own sound and our byte will code for a spot number 67, so your
speaker will put out that sound. If it’s sent over to the part of your computer
that converts data into written language using the UTF-8 encoding, it turns it into the
letter C. Uppercase C, actually, not lowercase c, which is a different byte. So our
eight-transistor processor already has a lot of options; the problem is that it can only manage
one byte of data at a time, and even if it’s flying through bytes at a rate of a few million
per second, which your computer is doing right now, that’s still a serious data checkpoint,
so we need more transistors, and then more, and more, and more, and more! And for the past 50 years, the biggest obstacle
to cramming more and more transistors onto a single chip, and therefore increasing our
processing power, has come down to one thing – how small we can make that gap between the
two silicon channels. In the early days of computing, those gaps
were so big that you could see them with the naked eye. Today, a state-of-the-art microchip
has gaps that are only 32 nanometers across. To give you a sense of perspective, a single
red blood cell is 125 times larger than that. 32 nanometers is the width of only a few hundred
atoms. So, there’s a limit to how low we can go.
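Incidentally, the trend numbers from earlier are easy to check. Assuming a strict two-year doubling from the 2,300 transistors of 1972, a one-line model lands right on the figures quoted above:

```python
def moores_law(year, base_year=1972, base_count=2300):
    """Transistor count predicted by a strict two-year doubling."""
    return base_count * 2 ** ((year - base_year) / 2)

print(f"{moores_law(2006):,.0f}")   # ~301 million: the "about 300 million" chips of 2006
print(f"{moores_law(2014):,.0f}")   # ~4.8 billion: the "four or five billion" we'd expect by now
```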
Maybe we can shave that gap down to 22 or 16 or even 10 nanometers using current available
technology, but then you start running into a lot of problems. The first big problem is that when you’re
dealing with components that are so small that just a few stray atoms can ruin a chip,
it’s no longer possible to make chips that are reliable or affordable. The second big problem is heat. That many
transistors churning through millions of bytes of data per second in such a small space generates
a lot of heat. I mean, we’re starting to test chips that get so hot that they melt through
the motherboard, and then sometimes through the floor. And the third big problem is quantum mechanics.
Oh, quantum mechanics, you enchanting, treacherous minx. When you start dealing with distances
that are that small, you start to face the very real dilemma of electrons just jumping
across the gap for no reason, in a phenomenon known as quantum tunneling. If that starts
happening, your data is gonna start getting corrupted while it moves around inside your
computer. So, how can we keep making our computers even
faster when atoms aren’t getting any smaller? Well, it might be time to abandon silicon. Graphene, for example, is a more highly conductive
material that would let electrons travel across it faster. We just can’t figure out how to
manufacture it yet. Another option is to abandon electrons because,
and get ready to have your mind blown, electrons are incredibly slow. Like, the electrons moving
through the wire that connects your lamp to the wall outlet, they’re moving at about 8
and a half centimeters per hour. And that’s fast enough when electrons only have to travel
32 nanometers, but other stuff can go a lot faster. Like light. Optical computers would move around photons
instead of electrons to represent the flow of data. And photons are literally as fast
as anything can possibly be, so you can’t ask for better than that. But, of course,
there are some major problems with optical computing, like the fact that photons ARE
so fast that it makes them hard to pin down for long enough to be used for data. And the
fact that lasers, which are probably what optical computing would involve, are huge
power hogs and would be incredibly expensive to keep running. Probably the simplest solution to faster computing
isn’t to switch to fancy new materials or harness the power of light, but to just start
using more chips. If you’ve got four chips processing a program in parallel, the computer
would be four times faster, right? Welllll, yeah, I mean yes, but microchips are
super expensive, and it’s also hard to design software that makes use of multiple processors.
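That shortfall has a name, Amdahl's law: if only a fraction p of a program can run in parallel, n chips give a speedup of 1 / ((1 - p) + p / n). A quick sketch (the 90% figure is just an illustrative assumption):

```python
def speedup(p, n):
    """Amdahl's law: overall speedup with n processors
    when a fraction p of the work can be parallelized."""
    return 1 / ((1 - p) + p / n)

# Even a program that is 90% parallelizable gets nowhere near 4x on 4 chips...
print(round(speedup(0.90, 4), 2))        # 3.08
# ...and can never exceed 10x, no matter how many chips you add.
print(round(speedup(0.90, 10_000), 2))
```

The sequential 10% dominates as n grows, which is why "just add more chips" runs out of steam.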
We like our flows of data to be linear because that’s how we tend to process information
and it’s kind of a hard habit to break. And then there are some really exotic options,
like thermal computing which uses variations in heat to represent bits of data, or quantum
computing which deals in particles that are in more than one state at the same time, thereby
totally doing away with the whole on-off, either-or system. So, wherever computers go next, there are
gonna need to be some big changes if we want our technology to keep getting smaller, and
smarter, and faster. Personally, I’m holding out hope for the lasers,
laser computer- I want one of those. Thanks for watching the SciShow Infusion,
especially to our Subbable subscribers. To learn how you can support us in exploring
the world, whether it’s inside your computer or outside in the universe, just go to And speaking of that whole universe, check
out our new channel, SciShow Space where we talk about that, including the latest in space
news, and as always don’t forget to go to and subscribe, so that
you can always keep getting more of this, because I know you like it. [SciShow outro music]
