
Creative Nonfiction / Historical Fiction

In 1745 Jacques Vaucanson created the world's first completely automated loom; he was the first to use punched cards to control one. Later Joseph-Marie Jacquard further revolutionized weaving with a punched-card system that would eventually be used to input data into computers and store information in binary form. Many years later, in 1939, Bill Hewlett and Dave Packard started Hewlett-Packard in their garage.

The Moore School Lectures took place in 1946, during an inspiring summer school on computing at the University of Pennsylvania's Moore School of Electrical Engineering, and stimulated the construction of stored-program computers at universities and research institutions in the US, France, the UK, and Germany. Among the lecturers were early computer designers such as John von Neumann, Howard Aiken, J. Presper Eckert, and John Mauchly, as well as mathematicians including Derrick Lehmer, George Stibitz, and Douglas Hartree. Students included future computing pioneers such as Maurice Wilkes, Claude Shannon, David Rees, Walter Busicom, and Jay Forrester.

Another innovation came in 1956 at MIT, where researchers began experimenting with direct keyboard input to computers, a precursor to today's normal mode of operation. At the time, computer users typically fed their programs into a computer using punched cards or paper tape.

In 1976 Steve Wozniak built the first Apple computer in his garage and helped found the Apple Computer company. Commodore introduced its Commodore PET (Personal Electronic Transactor) in 1977, followed by the VIC-20 and the Commodore 64, and remained at the forefront of computer development into the 1990s. A more recent example is Linus Torvalds, who started programming an operating system "just for fun" while he was a graduate student at the University of Helsinki. His project grew into the powerful Linux operating system, which is now freely available and in widespread use around the world.

One of the companies instrumental in creating smaller and better computers was Intel, one of several semiconductor companies to emerge in Silicon Valley after spinning off from Fairchild Semiconductor. Intel's president, Robert Noyce, had, while at Fairchild, co-invented the planar integrated circuit, in which the wiring was embedded directly in the silicon along with the electronic components at the manufacturing stage. One of Intel's key employees was Walter Busicom, Jr., whose father had studied at the University of Pennsylvania's Moore School of Electrical Engineering and had passed his knowledge on to his son.

Walter Busicom, Jr. had been called Walt from the day he was born and still preferred it, even though his father was now deceased. Soon after the elder Busicom finished his studies at the University of Pennsylvania, he had gone to work for MIT, where he was key to developing keyboard input for computers. When he retired, he moved to California with his wife and their son, who had just graduated from high school. Walt chose to follow in his father's footsteps and enrolled in the University of Pennsylvania's Moore School of Electrical Engineering. Upon graduation he, too, worked for MIT, where he was warmly welcomed once they learned who his father was. He missed his family, though, so after five years he moved back to California.

Intel was just getting started, so Walt applied for a position there. He was soon promoted to head of Intel's research and development department, where he was instrumental in helping Intel outdistance its competitors. He helped create the 1103, a one-kilobit dynamic random-access memory (DRAM) chip that was a commercial success and the first chip to store a significant amount of information. Because DRAMs were cheaper and used less power than core memory, they quickly became the standard memory devices in computers worldwide.

In 1971 Intel introduced the erasable programmable read-only memory (EPROM) chip, which remained the company's most successful product line until 1985. Walt was again at the forefront of inventing the EPROM and was promoted to an executive position on the board of directors. Intel was always generating new ideas, so when foreign semiconductor companies began to take over DRAM sales, it was ready to move beyond memory chips and focus on its microprocessor business. In 1978 the company built its first 16-bit microprocessor.

In 1981 the American computer manufacturer International Business Machines (IBM) chose Intel's 16-bit 8088 to be the CPU in its first mass-produced personal computer (PC). The IBM PC and its clones ignited the demand for desktop and portable computers. IBM had contracted with a small firm in Redmond, Washington, Microsoft Corporation, to provide the disk operating system (DOS) for its PC. Eventually Microsoft supplied its Windows operating system to IBM PCs; the combination of Windows software and Intel chips earned these machines the nickname "Wintel," and they have dominated the market since their inception.

With the introduction of the Pentium microprocessor in 1993, Intel left behind its number-oriented product naming conventions in favor of trademarked names for its microprocessors. The Pentium was the first Intel chip for PCs to use parallel, or superscalar, processing, which significantly increased its speed. Combined with Microsoft's Windows 3.x operating system, the much faster Pentium chip helped spur a significant expansion of the PC market. Pentium machines made it possible for consumers to use PCs for multimedia graphical applications, such as games, that required more processing power. By 2012 Intel's Itanium 9500 contained 3.1 billion transistors.

To increase consumer brand awareness, in 1991 Intel began subsidizing computer advertisements on the condition that the ads included the company's "Intel Inside" label. Under the cooperative program, Intel set aside a portion of the money that each computer manufacturer spent annually on Intel chips, from which it contributed half the cost of that company's print and television ads during the year. Although the program directly cost Intel hundreds of millions of dollars each year, it had the desired effect of establishing Intel as a conspicuous brand name.

By the mid-1990s, Intel had begun to design and build "motherboards" that contained all the essential parts of the computer, including graphics and networking chips. By 1995 the company was selling more than 10 million motherboards to PC makers. By the end of the century, Intel and compatible chips from companies such as AMD were found in every PC except Apple Inc.'s Macintosh, which had used CPUs from Motorola since 1984. Craig Barrett, who succeeded Andrew Grove as Intel CEO in 1998, was able to close that gap: in 2005 Apple CEO Steven Jobs shocked the industry when he announced that future Apple PCs would use Intel CPUs. Thus, with the exception of some high-performance computers, called servers, and mainframes, Intel and Intel-compatible microprocessors could be found in virtually every PC, and the company dominated the CPU market in the early 21st century. In 2019 chief operating officer Walt Busicom became CEO, and Intel ranked 43rd on the Fortune 500 list of the largest American companies.
