Moore’s Law Is Ending… So, What’s Next?


Remember when cellphones looked like this? You could call, text, maybe play Snake on it… and it had about 6 megabytes of memory, which was a small miracle at the time. Then phones got faster, and around every two years you probably upgraded your phone from 8 gigs to 16 to 32, and so on. This incremental technological progress we’ve all been participating in for years hinges on one key trend, called Moore’s Law. Intel co-founder Gordon Moore predicted in 1965 that integrated circuits, or chips, were the path to cheaper electronics. Moore’s law states that the number of transistors (the tiny switches that control the flow of electrical current) that can fit on an integrated circuit will double every two years, while the cost will halve. Chip power goes up as cost goes down. That exponential growth has brought massive advances in computing power… hence the tiny computers in our pockets! A single chip today can contain billions of transistors, and each transistor is about 14 nanometres across. That’s smaller than most human viruses!
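To get a feel for what “doubling every two years” adds up to, here’s a minimal back-of-the-envelope sketch in Python. The 1971 baseline (Intel’s 4004, roughly 2,300 transistors) is real, but using it to project forward is just our illustration of the trend, not a figure from the video.

```python
# Back-of-the-envelope Moore's Law projection:
# transistor counts double roughly every two years.
START_YEAR, START_COUNT = 1971, 2_300  # Intel 4004, ~2,300 transistors

def projected_transistors(year: int) -> int:
    """Project a chip's transistor count under Moore's Law."""
    doublings = (year - START_YEAR) / 2  # one doubling per two years
    return round(START_COUNT * 2 ** doublings)

for year in (1971, 1985, 2000, 2017):
    print(year, f"{projected_transistors(year):,}")
# 1971 -> 2,300 ... 2017 -> ~19 billion, which lines up with the
# "billions of transistors" on a single chip mentioned above.
```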
Now, Moore’s law isn’t a law of physics; it’s just a good hunch that has driven companies to make better chips. But experts are claiming that this trend is slowing down. Granddaddy chipmaker Intel recently disclosed that it’s becoming more difficult to roll out smaller transistors on a two-year timeframe while also keeping them affordable. So, to power the next wave of electronics, there are a few promising options in the works. One is quantum computing. Another, currently in the lab stage, is neuromorphic computing: computer chips modeled after our own brains! They’re basically capable of learning and remembering at the same time, at an incredibly fast clip. Let’s break that down, starting with the human brain.
So, your brain has billions of neurons, each of which forms synapses, or connections, with other neurons. Synaptic activity relies on ion channels, which control the flow of charged atoms like sodium and calcium and let your brain function and process properly. A neuromorphic chip copies that model by relying on a densely connected web of transistors that mimic the activity of ion channels. Each chip has a network of cores, with inputs and outputs wired to additional cores, all operating in conjunction with one another. Because of this connectivity, neuromorphic chips are able to integrate memory, computation, and communication all together.
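The “neurons” in designs like this are typically spiking: each one accumulates incoming charge and fires only when a threshold is crossed, loosely the way ion channels gate a real neuron. Here’s a toy leaky integrate-and-fire neuron in Python as a sketch of the idea; the constants are illustrative, not taken from any real chip.

```python
# Toy leaky integrate-and-fire neuron: the style of spiking model
# that neuromorphic cores implement in silicon. All parameters here
# are illustrative only.
def simulate_lif(inputs, threshold=1.0, leak=0.9, weight=0.3):
    """Integrate weighted inputs; emit a spike (1) on threshold, then reset."""
    potential, spikes = 0.0, []
    for x in inputs:                               # one input per time step
        potential = potential * leak + weight * x  # leak, then integrate
        if potential >= threshold:                 # threshold crossed: fire
            spikes.append(1)
            potential = 0.0                        # reset after the spike
        else:
            spikes.append(0)
    return spikes

print(simulate_lif([1, 1, 1, 0, 0, 1, 1, 1, 1]))
# -> [0, 0, 0, 0, 0, 0, 1, 0, 0]: steady input charges the neuron
# until it fires once, then it starts integrating again.
```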
These chips are an entirely new computational design. Standard chips today are built on the von Neumann architecture, where the processor and memory are separate and data moves between them: a central processing unit runs commands that are fetched from memory to execute tasks. This design has made computers very good at computing, but not as efficient as they could be, because data is constantly shuttling back and forth between the processor and the memory.
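To see that separation concretely, here’s a toy fetch-execute loop in Python. The three-instruction mini-machine is invented for this sketch, but the shape is the von Neumann pattern: every step fetches from a separate memory, computes, and writes back.

```python
# Toy von Neumann machine: program and data share one memory,
# and the CPU loop constantly fetches across the memory/CPU boundary.
# (The mini instruction set is invented for illustration.)
memory = {
    0: ("LOAD", 100),   # acc = memory[100]
    1: ("ADD", 101),    # acc += memory[101]
    2: ("STORE", 102),  # memory[102] = acc
    3: ("HALT", None),
    100: 2, 101: 3, 102: 0,
}

pc, acc = 0, 0
while True:
    op, addr = memory[pc]      # fetch: data moves memory -> CPU
    pc += 1
    if op == "LOAD":
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc     # write back: data moves CPU -> memory
    elif op == "HALT":
        break

print(memory[102])  # 5: every step crossed the processor/memory boundary
```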
Neuromorphic chips, however, completely change that model by keeping both storage and processing inside these “neurons,” which are all communicating and learning together.
The hope is that these neuromorphic chips could transform computers from general-purpose calculators into machines that can learn from experience and make decisions. We’d leap to a future where computers wouldn’t just crunch data at breakneck speeds but could do that AND process sensory data in real time. Some future applications of neuromorphic chips might include combat robots that could decide how to act in the field, drones that could detect changes in the environment, and your car taking you to a drive-through for ice cream after you’ve been dumped… basically, these chips could power our future robot overlords.
We don’t have machines with sophisticated, brain-like chips yet, but they’re on the horizon. So get ready for a whole new meaning of the term “brain power.” But we have something less frightening than AI to share with you… did you know we have a sister channel called Seeker VR? It’s everything you love about Seeker, but in 360 degrees! Seeker VR will take you on some incredible journeys that you probably wouldn’t get to experience otherwise. In a recent episode, we took a ride on one of the deadliest trains in the world. Check it out here. Want to learn more about how the fastest computers in the world work? We’ve got a video about them here. And am I the only one who misses my Motorola Razr? Let us know in the comments, and check back here for more videos.

100 Replies to “Moore’s Law Is Ending… So, What’s Next?”

  1. We don’t need more war machines, we need to stop war. We don’t need more drones scaring our birds, we need to plant more trees…

  2. First thing that arrives as a subject of the new technology: combat robots. Humankind: creates all sorts of amazing stuff, just to fight with each other over things that really don't matter.

  3. Moore's law only slows down at Intel; other manufacturers are already working on 7 nm, while Intel is still struggling with 10 nm…

  4. What is going to happen to products like Microsoft Windows if computers stop getting faster? In other words, Microsoft can't just keep making Windows bigger and bigger, otherwise it would run too slow. For that matter, what is going to happen to the entire tech industry? If computers don't get faster, how are those companies going to advertise their products?

  5. I miss the Moto Razr flip phone because it had a game called "Bejeweled Twist," and for whatever reason they never ported it over to smartphones. So mad

  6. Porting today's advances in AI technology onto these neuromorphic chips will dramatically improve the robotics industry, and I find that kind of scary

  7. While it would not work for smartphones, has anyone ever considered simply making a processor bigger? Bigger means more space for transistors.

  8. Computational power has increased solely for the gaming industry. You don't need a super-powerful computer for most practical applications. Office programs don't need super-advanced computers. Microcontrollers, which run automated systems, also don't need to be super advanced (unless you want them to respond to a real-world environment, anyway).

    Also, all the funding going into computer development has come from the gaming industry, because they're the only ones who actually benefit from it. Now they're no longer doing that, because the advances in computers are driving their industry obsolete. Why buy a specialized console when you can get a computer that can play just as advanced games and, on top of that, do so much more? As Nintendo said around the time the Switch was announced, people today are walking around with insanely powerful devices in their pockets, and that's posing a huge problem for them.

    Besides, Moore's law can't continue on forever. There IS a limit to how small things can get, and transistors are about as small as they can reasonably get (barring a major technological breakthrough). Right now the IT industry is focusing more on optimizing its code. We're still relying on code in our operating systems that was written decades ago; essentially, they've just been adding on to the previous system over and over again, so what we have isn't exactly optimized. It's believed that we aren't even using the machines we have now to their full capacity. What this means is that further advancement isn't really necessary: what our computers can do will continue to change even if the hardware stays the same, and nobody knows when software will stop advancing.

    So why invest in better hardware that's driving industries obsolete and that we're not fully utilizing anyway? In fact, the console gaming industry isn't planning on releasing any new consoles in the foreseeable future, because they honestly see no reason to believe their current hardware will be replaced any time soon. We may see new operating systems on them, but they plan to just keep producing the same machines until they see a reason to upgrade the hardware, which they currently don't.

  9. Antimony can take on a dual crystalline state (silicon is mono-crystalline). An antimony wafer would allow transistors to be simultaneously 1, −1, or +1 and 0, −0, or +0, including half or combined partial states along the crystal axes, allowing at least 5x the performance of current silicon using current architecture. An architecture that exploits partial or multi-state transistors could potentially get an easy ride past 12 GHz without much effort; however, even reaching 10 GHz will require a different material for CPU and GPU dies. Antimony is already under some investigation for use as ultra-high-speed RAM. Nasty stuff, though: its mining byproducts turn into a cyanide-like substance in water, and cyanide and many other toxic processes are required just to get raw crystalline antimony. Processing and refining it to the sub-micron sizes suitable for wafer use would be immensely polluting, but also revolutionary. Goodbye silicon age, hello multi-state antimony processors: virtually no heat produced by switching states, no current or signal loss or follow-through. The benefits are endless, and reliance on silicon's own properties may be the limiting factor for current processors. MAP™ Me 😁

  10. The second one, after quantum computing, is how Cortana from Halo was modeled. SO I'M ALL FOR THAT YERRRR🗣🗣🗣🗣

    Edit: wait I just realized she went a lil loco……nvm

  11. Combat robots with intelligence. I wish you guys wouldn’t use this in a sentence with otherwise decent, helpful applications. If someday these things exist, it will be because we were told they were necessary and commonplace.

  12. Put it in a robot!!! I volunteer!!!!!!!! Dude, this is one of my plans, but without a chip, and with consciousness transfer instead #Immortals. That's where our investment should be directed; it only makes sense. We need to hurry. At this pace, I feel like the company would probably take it slow instead of making the leap; most companies would rather make money first and then pass along the good stuff, though of course they have to in order to succeed. I just hope that when that company shows up, it takes the leap instead of making humanity hold on.

  13. Video cards don't have a problem improving, and many have 384- to 512-bit buses with HBM or GDDR5 RAM and MANY more cores! Not to mention Intel Terahertz, demonstrated in 2001 running at 1,000 GHz, with a demo chip at 100 GHz. So if we could do 100 and 1,000 GHz in 2001, why are we stuck at 4.5 GHz? DARPA and Motorola have also demonstrated TERAHERTZ semiconductor chips running at 1,000 GHz and FASTER. Not to mention 5G coming out running at 60 GHz and 94 GHz. The same goes for network speeds: 1 Gb is still not the standard when 10 Gb has been out since 2002. And why have RAM prices almost tripled since 2012? It isn't Moore's Law, since we already have working devices, and the Intel tech ran at LOWER power and lower voltage with 10,000 times less leakage! Until AMD started making the newer chips we were stuck with only 4 cores, too, and CPU core speed has barely improved since 2007, 12 years ago! A new chip is maybe 38% faster than a 2007 4-core chip. Yet video cards have greatly increased in power even without higher clock speeds, and Nvidia shows testing of 10 GHz electronics in its videos.

  14. Please don’t ruin this technology with “combat robots” and more stupid war toys. Even this technology is based on communication and the culture of information. We should learn from technology and evolution, not use it to ruin our communication. Let’s use that extension of human evolution to be more united and have more collective consciousness, not the collective unconsciousness that governments offer us with wars and that shit

  15. I wonder if they are working on connecting that chip to someone’s brain to improve brain power. It could use the static electricity or your body heat to power it. Cyborgs on the horizon lol

  16. After Moore's law comes Slappy's law. Third world buffoons breeding on Welfare double every eight months.

  17. When you are artificially creating a human brain and you accidentally create an artificial human consciousness that thinks exactly like you, you've essentially created a clone of yourself

  18. You mean they aren't going to get exponentially better every year? Yay! I can actually keep my phone until it breaks mechanically, without having to worry about it not being able to run the next iteration of the OS! I can create less garbage and have more spending money in my pocket! I can be less depressed about missing out on the NEWER BETTER FASTER TECH.

    People don't seem to get that we're building a world that will soon be filled with insurmountable confusion that our mammalian brains won't be able to handle. But Johnny tech boy here in the video, with his smartwatch and Apple T-shirt that he bought on the internet like the fanboy he is, just keeps on shouting MORE MORE MORE!!

  19. This video is predicated on "Intel's claim" that its chips aren't meeting Moore's law anymore. I've got news for you: Intel is only a fraction of the semiconductor business. Look at AMD, or Apple's in-house chips. AMD's 7 nm process shows Moore's law is still in full effect. Sounds to me like Intel just needs to step up its game.

  20. So, this end of Moore's law… does it say that there won't be more transistors, or that they will not get cheaper? And why exactly should these neuromorphic chips be a replacement for what we have now? They seem to do different things.

  21. I guess Moore never met N. Tesla.
    Still, some zombies wait around the block for 100-year-old technology.
    I used to be an electrician who put in wires that are also obsolete.
    Our society is outdated, so this is just a silly conversation to have to begin with.
