Intel: The Godfather of Modern Computers


At the heart of your phone, tablet and computer
lies the microprocessor, a tiny chip home to billions of transistors capable of processing
an immense amount of information. Without the microprocessor, modern technology
could not exist, which is why this week we’ll be looking at the company that started
it all: Intel. On December 23, 1947, after two years of restless labor at Bell Laboratories, three men stood in awe of the transistor, their greatest invention. One of them was William Shockley,
an entrepreneurial fellow who realized what a fortune he could make from this new technology. In 1956 he moved to the west coast, establishing
the first silicon device company in what came to be known as Silicon Valley. He couldn’t convince any of his former colleagues
at Bell Labs to leave with him and so he resorted to hiring fresh university graduates. In an ironic twist of fate, just one year
later eight of his brightest employees got together and left the company in the same
way that he had left Bell Laboratories. Under the patronage of industrialist Sherman
Fairchild, the “Traitorous Eight”, as they were called, founded Fairchild Semiconductor. Much to Shockley’s dismay, Fairchild became
one of the leaders of the industry while his own venture failed. In 1959 one of the original “Traitorous
Eight”, Robert Noyce, invented the first monolithic silicon integrated circuit. Like the transistor before it, the integrated
circuit was a technology with huge potential, and he knew that. In 1968 he left Fairchild to start his own
company and he was joined by his colleague and fellow ‘traitor’ Gordon Moore, who
had famously postulated Moore’s law. To fund their venture they went to Arthur
Rock, the acclaimed investor who had arranged their original deal with Sherman Fairchild
a decade earlier. With $3 million of initial capital and the
creative portmanteau of integrated electronics, Noyce and Moore founded Intel on July 18,
1968. Behind their venture was the ambitious plan
to build large-scale integrated semiconductor memories. Back then, semiconductor memories were roughly ten times more expensive than the standard magnetic-core memories they aimed to replace, even though core memory was much slower and less efficient. Nine months after its creation, Intel had developed its first product: the 3101 Schottky bipolar memory. It was among the first commercial solid-state memory devices, capable of storing a whopping 64 bits. One year later, Intel became a pioneer in dynamic
random access memory, or DRAM, by creating the first commercially available DRAM chip,
the 1103. Its success marked the beginning of the end
for magnetic memory and established DRAM as the primary storage medium of modern computers. Intel’s reputation grew quickly, and not
just in the United States. A Japanese calculator company called Busicom
had reached out to Intel in 1969 with a request to build integrated circuits for their calculators. While working on this project, Intel engineer
Ted Hoff figured out a way to build an entire central processing unit onto a single chip. By cramming 2,300 transistors onto a chip just one-eighth by one-sixth of an inch, Hoff's invention matched the computing power of ENIAC, a machine that had filled an entire room. In the process, Intel had created the foundation of modern computing: the microprocessor. They called it the 4004 and started selling
it in 1971. A year later, Intel unveiled the 8008, an
8-bit microprocessor. Intel’s first general-purpose microprocessor,
the 8080, came in 1974 and it essentially became the industry standard, finding its
way into almost every cash register, calculator and traffic light of its day. Interestingly enough, the 8080 was designed
for almost everything except computers. At the time, computers were manufactured entirely
in-house, with a single company building its own terminals, compilers, and operating systems. The 8080, however, became so popular that
the manufacturers eventually began designing their systems around
it. In 1978 Intel released the 8086, a 16-bit
processor that would eventually become Intel’s saving grace. Up until that point Intel’s revenues were
coming almost entirely from their DRAM division, but Japan’s rising semiconductor industry
was quickly eating away at their profits. Intel’s only way forward was microprocessors,
and they went all in by partnering up with IBM. We’ve already covered IBM in a previous
video, but just to recap, in the early 1980s IBM were struggling to catch up with the rise
of the personal computer. At first, IBM didn't believe personal computers would appeal to the average consumer, but once the market took off anyway, IBM's bureaucracy made developing their own PC a nightmare. They ended up partnering with Intel for their
processor and with Microsoft for their operating system, which allowed them to develop their
IBM PC in just under a year. It was released in 1981 and it became the
dominant personal computer of its time, establishing Intel as the chief supplier of processors. The IBM PC used the 8088, a low-cost variant of the 8086,
and although IBM eventually lost the personal computer market to cheap compatible copycats,
Intel remained at the heart of every personal computer made over the next decade. The legacy of the 8086 remains to this day,
as the vast majority of modern computers are based on its derivative x86 architecture. During the 1980s Intel emerged as the most
profitable hardware supplier to the rising PC industry. They reached $1 billion in revenue in 1983, and just nine years later their net income alone reached that same figure. In 1993 Intel released the Pentium line, their
fifth generation of processors. For this generation Intel started building
dedicated motherboards alongside its processors, a move that kept them ahead of their competition
and doubled their net income that year to $2.3 billion. Throughout the 90s Intel continued to develop
more powerful processors, more or less in accordance with Moore’s law. In 1998 Intel branched out into the value-PC
market by releasing the cheap, low-performance Celeron line. The new millennium, however, would be a much
more difficult time for Intel. The dot-com crash and fierce competition from
AMD saw Intel fall below 80% market share for the first time in decades. The situation became so bad that in 2001 Intel’s
profits slumped by a stunning 87%. By that point it became clear that racing
to build faster and faster processors wasn’t the way to go, especially when most people
were using their computers just to read their email or browse the web. Intel shifted their focus accordingly, building
a more efficient, less power-hungry line called Centrino. Released in 2003, the Centrino wasn’t actually
a processor but a complete platform: a processor paired with a supporting chipset and built-in wireless networking. It worked extremely well on portable computers
just around the time when laptops were finally starting to take off, lifting Intel back to
the top of the industry. In line with their new philosophy, Intel began
developing multi-core processors, releasing their first dual-core in 2005. In general, the past few generations have been split into three main tiers based on processing power: Core i3, i5, and i7. Up until last year, Intel were operating on a “Tick-Tock” model, alternating between shrinking the current microarchitecture onto a smaller manufacturing process to make it more efficient (the “tick”) and releasing an entirely new microarchitecture (the “tock”), with each step arriving roughly every 12 to 18 months. The performance of the last two generations
hasn’t improved that much though, and Intel have also attracted a lot of antitrust litigation. In 2009 the European Union fined Intel roughly one and a half billion dollars for paying computer manufacturers to favor its processors. Similar accusations have sprung up in the
US, Japan and South Korea. Despite the lawsuits, Intel’s business has
been going great, and they’ve been able to branch out into various other tech markets,
usually through acquisitions. Among other things they’re working on solid-state
drives, machine learning and autonomous vehicles. Some of these projects are more successful
than others, but it’s unlikely that they’ll be replacing Intel’s main microprocessor
business any time soon. Thanks for watching and a big thank you to
all of our patrons for supporting this video! Be sure to subscribe if you haven’t already
and to check out the full Behind the Business playlist for the interesting stories of other
companies. Once again, thanks a lot for watching, and
as always: stay smart.

100 comments on “Intel: The Godfather of Modern Computers”

  1. The Gaming Of Rean says:

    Did anyone noticed the Intel Jingle at the start?

  2. Dominick Santora says:

    I'll bet Shockley wanted to put a contract out on those eight. LOL

  3. qflux says:

    No mention of their biggest blunder? The Itanic?

  4. Jose Moya says:

    The average wait for the video to be in Spanish is 2 years

  5. Al Hashim Caezar Nonok says:

    Do a video on their number one competitor AMD?

  6. The Digital Nerd says:

    for some reason this makes me smile

  7. Jeff Caligari says:

    Who's watching this on a 286? 14kpbs! WOO

  8. BlokeyO's says:

    8:36 am i the only one who notices a Cat girl burning in the fire?

  9. Studio 73 says:

    Do Duracell next! And this time the history of the company, not one part of it’s history.

  10. ThatSpookyMan says:

    4:40 Anime chick behind the japan flag 😀

  11. Exalaxy X says:

    HOW BOUT RYZEN?!?!?!

  12. Lil homo says:

    YouTube didnt exist on 2001

  13. k.v.Narasimha Murthy says:

    INTEL DESERVES TO RULE MANKIND . R

  14. Shahaan Saleem says:

    Gj on choosing Asus tablet. I have one for a couple years now and it's amazing

  15. Anand Kothari says:

    7:05 its funny how YouTube was not even founded till 2006!

  16. OOBAIDD DID says:

    Moore powerful at 6:18 bottom

  17. TWO Otaku A says:

    Khe khew that
    Wkho was khvery intelligent

  18. Róbert Nagy says:

    Intel is better. But without the internet, AI isn't worth a damn :)) internet-dependent zombie :))))) just like the media churning out all those media zombies :))))))))))))

  19. Tactical Aioli says:

    4 bit is 16 instructions. 8 bit is 256 instructions. 16 bit is 64k instructions. 32 bit is 4 billion instructions. do we need a 64 bit? nope. but how else will you sell those 6 core CPUs to the plebs?

  20. Sam Kirchner says:

    Your channel is awesome guy. Very informational.

  21. Sam Kirchner says:

    Very valuable step by step overview of the life of companies.

  22. Sinclair Tabone says:

    the X86 made all the difference for Intel. Before the X86 software had to be re-written for new different platform/architectures. The X86 architecture was the first backwards compatible processor family

  23. Harish Ganesan says:

    No mention of Steve jobs in this video

  24. Zeitaluq says:

    Rise and fall of the the Sinclair ZX Spectrum would be interesting until its demise in the face of emergence of Microsoft and Apple

  25. vincent pribish says:

    "unwittingly stumbled on the foundation of modern computing" – by very intentionally setting out to develop the CPU. who writes this crap?

  26. maxwell koros says:

    You deserve a million+ subscribers🙌🙌

  27. Pompomatic says:

    Soooo.. What happened to the actual inventor of the transistor? He who got betrayed.

  28. mena seven says:

    Intel revolutionized the computer industry with the invention of the CPU.

  29. Microphunktv says:

    Riiight… except Cyrix outbenchmarked intel every single fucking time.. and Intel bullied them to bankrupt with court cases with their fatter wallet….

    fuck off normies

  30. Honk Honk says:

    first intel logo was better lol

  31. Honk Honk says:

    For some reason id like to use the intel 8080.

  32. Honk Honk says:

    I like intel. Its nice and all. But AMD is important aswell, mostly because it gives competition

  33. rjrex8 says:

    My Asus Zenbook Flip 15 is Intel.

  34. darthvader5300 says:

    NOT ANYMORE! RUSSIA HAS INVESTED A LOT IN IMPROVING THE CMOS TECHNOLOGY AND IN MASKLESS LITHOGRAPHY. NOW THEY ARE QUITELY MAKING MILITARY GRADE AND MILITARY HARDENED SILICON CARBIDE-NITRIDE INTEGRATED CIRCUIT CHIPS PACKAGED IN CERAMIC AND PACKAGED AGAIN INSIDE A SINGLE CONTACT EDGE CERAMIC PACKAGE. ALL RUSSIAN MILITARY COMPUTERS HAS COMBINED THE BEST OF THE OLD 80s COMPUTERS WITH THE BEST OF 2018 COMPUTERS. THEIR MOTTO: NOTHING EVER GETS DELETED!

  35. MsJavaWolf says:

    And now compare this to a company like Apple, whose value comes only from the fact that their smartphones are alittle bit more shiny than other phones.

  36. Art Dehls says:

    There's no way to do a video on Intel in under 10 minutes without completely leaving out the back and forth with AMD. Any proper video including them would have to run an hour.

  37. HKashaf says:

    How about Cisco, VMware, Dell, EMC?

  38. Kuntal Ghosh says:

    we need amd!! jhinga la la huh huh!!!

  39. 13375PLAYS says:

    Intel is the godfather of none

  40. lithgrapher says:

    Not anymore

  42. InfiniteAura says:

    Our godfather is dying

  43. Cappuchino/Cup o' Chino says:

    I saw a little Felix in 4:40

  44. Steve says:

    American innovation. This is why we need H1-B. If not us, another country will

  45. Joshua Kemp says:

    What was that background music playing through the video?

  46. D J says:

    Intel: The Godfather of over inflated pricing 🤣

  47. Matthew Lea says:

    The PC Gamer comments seem to be anti Intel for some reason.

  48. Infinity Infinity says:

    One year later, see Felix, learning mood ruined.

  49. Nithyanth Productions says:

    who saw free money machine at 1:40

  50. MrHatoi says:

    Sure can't wait for x86 to be replaced by something like RISC.

  51. Yasser Arguelles says:

    Didn't the Intel 8086 come first in 1978 and then the compatibility Intel 8080 in 1979

  52. wopmf4345FxFDxdGaa20 says:

    There is still so much more to this subject, but it is a good scratch of the surface. 🙂 For example, in the early days, perhaps the biggest driver of this development was the military and a lot of the money to that field came from the different type of government projects and research. I doubt we would have much of the computing technology today without WW2.

    Then when personal computers started to really take off, the hardest part was already done. After that it has been a lot about iterating the same idea, making the transistors smaller and fitting more on the same chip. The biggest revolutions here have probably been in the manufacturing technology.

    Why they stopped increasing the clock frequency is largely because it increases the power consumption so heavily. The increase of calculation power itself did not end, the means to increase it are just different than by increasing the chip operating frequency. Power consumption is a problem in two ways. It directly defines how much heat the chip is going to generate, and therefore defines the size of the cooling system ( you cannot have a massive cooler for example in a laptop ). Secondly for mobile devices one big problem even today is the battery life. Energy consumption in relation to the computation power is a very important parameter for servers and data centers as well. In other words, how much you can calculate with some unit of energy consumed.

    General purpose processors nowadays adjust their clock frequency up and down constantly depending of the load, to generate less heat, and to consume less energy. They also turn parts of the chip completely off when they are not being used, for the very same reasons.

  53. Defunct Lizard says:

    The beggining was a little misleading. Intel doesn't make the ARM processors in tablets and phones.

  54. Star Mayhem says:

    godfather: a person directing a criminal organization

  55. Abhijit Jacob says:

    AMD came out with 386 and 486 just like Intel. All keyboard makers and HDD makers made generic parts to conform to the PC standards. Intel sued and court told them 486 is just a number. 586 was labelled pentium so that AMD was forced to find its own number. While AMD was company of geeks, Intel was full of Jewish #### with business acumen. They made sure AMD never had support from motherboard manufacturer, RAM and PC manufacturer. AMD couldn't be fully killed as that would make Intel a bully and raise eyebrows. Today Ryzen rocks but its motherboard and memory is also easy to get with a simple search on Amazon.

  56. Roger Barraud says:

    "…Revenues were coming entirely from their DRAM division…" <shows 1702A EPROM/>
    Derrrrpp

  57. Roger Barraud says:

    Don't give up your day j0rb…

  58. Lapis Wolf says:

    i'm watching this video right now with a Hewlett Packard laptop that has Intel parts

  59. Lapis Wolf says:

    what happened to i1, i2, i4 and i6?

  60. S.S RADON says:

    can i have a video on AMD now please

  61. dareen moughrabi says:

    Tommorow i am going to visit intel company and learn somethings in it so cool

  62. Srajan Dikshit says:

    I guess the inventor of pentium chip deserved a mention

  63. Boleslaw Petroski says:

    It's inside.

  64. fatq says:

    4:22 40 years later the limited edition 8086K (released this year). I almost got it but decided to spend less on the almost as powerful 8700K. (I dont want to seem spoiled… I was gonna have an i5 and a 1060 3gb but i got a gtx 1080 for free, meaning i didnt have to include a gpu price in my budget)

  65. A DE says:

    @Business Casual what is the song in the beginning

  66. Chief Steph says:

    Haha, AMD jokes for those who want a video about AMD… Please fanboys stop being asshats.

  67. VIVEK CHOUDHURY says:

    Intel is still the king of the microprocessor market, hands down!

  68. LowbreDZ says:

    watching this with an amd fx8350 🙂

  69. Droid 16 Beta says:

    Why is "bribing computer manufacturers to use their processors" illegal?

  70. The Gaming Of Rean says:

    Intel did not make or produce or manufacture phone/mobile processors… only Tablets and Laptops and Desktops

  71. TreeMobile. says:

    That transition to the Intel logo was amazing

  72. Utube says:

    And Indian computer scientist 'Vinod Dham' working at Intel, is famously known as 'Father of Pentium chips'… He developed the first pentium chip..

  73. Matt Harden says:

    Is that 4004 processor at 3:32… wood grain??

  74. The N says:

    As a sound engineer bruv, this vid had your voice without any de-esser on it and it's hugely irritating to me considering that your sibilants are sharp af and are borderline whistles.

  75. IsItAbout TheHedge says:

    Mr Meltdown and Mr Spectre greet you all. ;~)

  76. Prasoon Tiwari says:

    More interesting story would've been about the "Traitorous Eight" and how they founded Fairchild Semiconductors that established Silicon Valley in what it is today seeding the area with talent people who left and made their own companies. Robert Noyce in particular. He was known as The Mayor of Silicon Valley for a reason. I think Fairchild Semiconductor was one of the the most influential company of 20th century.

  77. All Kindz! ቀስተደበና says:

    The last badass Intel CPU of all time was Intel Pentium 4

  78. Fr06t says:

    Enter i9…..

  79. Alejandro Batman's Apprentice says:

    AMD???

  80. lol i can't think of a better name says:

    what is this ?? 4:41

  81. Dont Ask now says:

    You should have mentioned about Vinod Dham, this video is incomplete without mentioning his name same like you mentioned about other Engineers

  82. The Gaming Of Rean says:

    The CPU on the Phones are different, they're not x86, nor x64, nor x32, but A.R.M. and Intel tried to make A.R.M. CPU's but failed… And the only A.R.M. CPU Brands are, Qualcomm, Samsung Exynos, and Apple A series (maybe Not arm…)

  83. The Gaming Of Rean says:

    Creative! The Intel Jingle! On 0:20

  84. Chialuen Lis says:

    4:41 hey, that's a trap!

  85. Luis Escamado Nhamue says:

    Having a CPU those days was the equivalent of driving a great car. I remember the first intel up I had, was a Prescott, fast and strong, had a great grip and could break any code, then I had an athlon amd CPU. Was perfect, playing games on it was smooth like hot knife cutting butter. It was just beautiful to see a motherboard with no north bridge… eventually and stopped doing good coups and I had a sandy bridge. The first of the first core 2. This was a new era. Amd took years to catch up…

  86. Luis Escamado Nhamue says:

    Do ATI

  87. Supercyber Cow says:

    At the heart of your phone, tablet and …. "lies" the microprocessor … yes "lies", because it's ARM not Intel's 🙂

  88. Vincent Koech says:

    YouTube in 2001? 😒

  90. Rahul Mathew says:

    Xeros made the biggest mistake in the history of computers, not understanding wat they actually invented, GUI. Apple stole from it and microsoft stole from apple

  91. Vittorio Folino says:

    FAGGIN MADE POSSIBLE TO CREATE THE FIRST MICROPOCESSOR.

  92. Darkseed says:

    Now, the Intel i9 is here.

  93. Bill Olsen says:

    Better be nice to Intel or you will find a horse's head in your bed.

  94. Aditya Bhambere says:

    bro also make video on LG group

  95. ntandoyezwe mntungwa says:

    tips

  96. Chakkara Kp says:

    I stoped watching the video when i noticed the the intel doesnt have a moral background and nothing to be inspired

  97. Daniel Black says:

    6:19 & 6:20 Chuckle chuckle

  98. P Christmas says:

    Amazing how the transistor and electronics advanced after Roswell.

  99. lifeinsepia says:

    i just finished watching silicon cowboys on netflix and wanted to know more about the history of intel. the early years sound like a story for a movie/documentary too!

  100. Swiss Dude says:

    This is BS! Where are the COMMODORE, APPLE II and SINCLAIR computers I was playing with as a kid in this historical progression?
