Municipal Budgetary Educational Institution "Secondary School No. 17"
Reading material on the topic:
"My Hobby Is the Computer"
Atayan Eseniya Grigoryevna,
English teacher,
MBOU "Secondary School No. 17"
The first computers
The first hackers
IBM Names New Boss for Watson "Cognitive Computing" Service
The first computers
There is no easy answer to the question of what the first computer was, because of the many different classifications of computers. The first mechanical computer, created by Charles Babbage in 1822, does not really resemble what most would consider a computer today. This section therefore lists the major computer firsts, starting with the Difference Engine and leading up to the computers we use today.
The word "computer" was first recorded as being used in 1613 and originally was used to describe a human who performed calculations or computations. The definition of a computer remained the same until the end of the 19th century, when the industrial revolution gave rise to machines whose primary purpose was calculating.
In 1822, Charles Babbage conceptualized and began developing the Difference Engine, considered to be the first automatic computing machine. The Difference Engine was capable of computing several sets of numbers and making hard copies of the results. Babbage received some help with its development from Ada Lovelace, considered by many to be the first computer programmer for her work and notes on Babbage's engines. Unfortunately, because of funding problems, Babbage was never able to complete a full-scale functional version of the machine. In June 1991, the London Science Museum completed Difference Engine No. 2 for the bicentennial of Babbage's birth, and it completed the printing mechanism in 2000.
In 1837, Charles Babbage proposed the first general mechanical computer, the Analytical Engine. The Analytical Engine contained an Arithmetic Logic Unit (ALU), basic flow control, and integrated memory and is the first general-purpose computer concept. Unfortunately, because of funding issues, this computer was also never built while Charles Babbage was alive. In 1910, Henry Babbage, Charles Babbage's youngest son, was able to complete a portion of this machine and was able to perform basic calculations.
The Z1 was created by the German engineer Konrad Zuse in his parents' living room between 1936 and 1938. It is considered to be the first electro-mechanical binary programmable computer, and the first truly functional modern computer.
The Turing machine was first proposed by Alan Turing in 1936 and became the foundation for theories about computing and computers. The machine was a device that printed symbols on paper tape in a manner that emulated a person following a series of logical instructions. Without these fundamentals, we wouldn't have the computers we use today.
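The idea described above — a device that reads and writes symbols on a tape while following a table of logical instructions — is simple enough to sketch in a few lines of code. The following is a minimal illustrative simulator (not from the source; the machine shown, a toy "bit flipper", is an invented example):

```python
# A minimal Turing machine simulator: a dictionary maps (state, symbol)
# to (symbol to write, head movement, next state). This toy machine
# scans right, inverting each bit, and halts at the first blank cell.
def run_turing_machine(tape, rules, state="start", blank="_"):
    tape = list(tape)
    head = 0
    while state != "halt":
        symbol = tape[head] if head < len(tape) else blank
        new_symbol, move, state = rules[(state, symbol)]
        if head < len(tape):
            tape[head] = new_symbol
        else:
            tape.append(new_symbol)
        head += 1 if move == "R" else -1
    return "".join(tape).rstrip(blank)

# Transition table for the bit-flipping machine.
flip_rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine("1011", flip_rules))  # prints 0100
```

Despite its simplicity, this read-write-move loop is the whole model: everything a modern computer does can, in principle, be reduced to a (much larger) table of such rules.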
The Colossus was the first electric programmable computer, developed by Tommy Flowers, and first demonstrated in December 1943. The Colossus was created to help the British code breakers read encrypted German messages.
Short for Atanasoff-Berry Computer, the ABC began development by Professor John Vincent Atanasoff and graduate student Cliff Berry in 1937. Its development continued until 1942 at the Iowa State College (now Iowa State University).
The ABC was an electrical computer that used vacuum tubes for digital computation, including binary math and Boolean logic and had no CPU. On October 19, 1973, the US Federal Judge Earl R. Larson signed his decision that the ENIAC patent by J. Presper Eckert and John Mauchly was invalid and named Atanasoff the inventor of the electronic digital computer.
The ENIAC was invented by J. Presper Eckert and John Mauchly at the University of Pennsylvania; construction began in 1943 and was not completed until 1946. It occupied about 1,800 square feet, used about 18,000 vacuum tubes, and weighed almost 50 tons. Although the judge ruled that the ABC was the first digital computer, many still consider the ENIAC to be the first because it was fully functional.
The early British computer known as the EDSAC is considered to be the first practical stored-program electronic computer. It performed its first calculation on May 6, 1949 and later ran one of the earliest graphical computer games, a noughts-and-crosses program called OXO.
Around the same time, the Manchester Mark 1 was another computer that could run stored programs. Built at the Victoria University of Manchester, the first version of the Mark 1 computer became operational in April 1949. Mark 1 was used to run a program to search for Mersenne primes for nine hours without error on June 16 and 17 that same year.
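The Mersenne-prime search mentioned above is still a standard demonstration of raw computing power today. As a rough illustration of what such a program computes (a sketch, not the Mark 1's actual code), the Lucas-Lehmer test decides whether a Mersenne number 2**p - 1 is prime:

```python
# Lucas-Lehmer test: for an odd prime p, the Mersenne number M = 2**p - 1
# is prime iff s == 0 after p - 2 iterations of s -> s*s - 2 (mod M).
def is_mersenne_prime(p):
    if p == 2:
        return True  # 2**2 - 1 = 3 is prime
    m = 2**p - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0

# Exponents p giving the first few Mersenne primes (note 11 fails:
# 2**11 - 1 = 2047 = 23 * 89).
print([p for p in [2, 3, 5, 7, 11, 13, 17, 19] if is_mersenne_prime(p)])
# prints [2, 3, 5, 7, 13, 17, 19]
```

What took the Mark 1 nine hours of error-free running in 1949 is, for small exponents, a fraction of a second on any modern machine.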
The first computer company was the Electronic Controls Company and was founded in 1949 by J. Presper Eckert and John Mauchly, the same individuals who helped create the ENIAC computer. The company was later renamed to EMCC or Eckert-Mauchly Computer Corporation and released a series of mainframe computers under the UNIVAC name.
First delivered to the United States government in 1950, the UNIVAC 1101 or ERA 1101 is considered to be the first computer that was capable of storing and running a program from memory.
In 1942, Konrad Zuse began working on the Z4, which later became the first commercial computer. The computer was sold to Eduard Stiefel, a mathematician at the Swiss Federal Institute of Technology Zurich, on July 12, 1950.
In 1964, the first desktop computer, the Programma 101, was unveiled to the public at the New York World's Fair. It was invented by Pier Giorgio Perotto and manufactured by Olivetti. About 44,000 Programma 101 computers were sold, each with a price tag of $3,200.
Although it was never sold commercially, the first workstation is considered to be the Xerox Alto, introduced in 1973. The computer was revolutionary for its time and included a fully functional computer, display, and mouse. It operated much like many computers today, using windows, menus, and icons as an interface to its operating system. Many of its capabilities had been demonstrated earlier, in The Mother of All Demos by Douglas Engelbart on December 9, 1968.
The Vietnamese-French engineer André Truong Trong Thi, along with François Gernelle, developed the Micral computer in 1973. Considered the first "micro-computer," it used the Intel 8008 processor and was the first commercial computer sold fully assembled rather than as a kit. It originally sold for $1,750.
In 1975, Ed Roberts coined the term "personal computer" when he introduced the Altair 8800. However, many consider the first personal computer to be the KENBAK-1, first introduced for $750 in 1971. That computer relied on a series of switches for inputting data and output data by turning a series of lights on and off.
The IBM 5100 was the first portable computer, released in September 1975. It weighed 55 pounds and had a five-inch CRT display, a tape drive, a 1.9 MHz PALM processor, and 64 KB of RAM. An advertisement for the IBM 5100 appeared in the November 1975 issue of Scientific American.
The first truly portable computer, or laptop, is considered to be the Osborne I, which was released in April 1981 and developed by Adam Osborne. The Osborne I weighed 24.5 pounds, had a 5-inch display, 64 KB of memory, and two 5 1/4" floppy drives, ran the CP/M 2.2 operating system, included a modem, and cost US$1,795.
The IBM PC Division (PCD) later released the IBM Portable in 1984, its first portable computer, which weighed in at 30 pounds. Later, in 1986, IBM PCD announced its first laptop computer, the PC Convertible, weighing 12 pounds. Finally, in 1994, IBM introduced the IBM ThinkPad 775CD, the first notebook with an integrated CD-ROM drive.
The Apple I (Apple 1) was the first Apple computer, and it originally sold for $666.66. The computer kit was developed by Steve Wozniak in 1976 and contained a 6502 8-bit processor and 4 KB of memory, expandable to 8 KB or 48 KB using expansion cards. Although the Apple I had a fully assembled circuit board, the kit still required a power supply, display, keyboard, and case to be operational. Apple's own advertisements of the period pictured the machine.
IBM introduced its first personal computer, called the IBM PC, in 1981. The computer was code-named (and is still sometimes referred to as) the Acorn; it had an 8088 processor and 16 KB of memory, expandable to 256 KB, and ran MS-DOS.
The Compaq Portable is considered to be the first PC clone and was released in March 1983 by Compaq. The Compaq Portable was 100% compatible with IBM computers and was capable of running any software developed for them.
Below is a listing of some major computer companies' first computers.
Commodore - In 1977, Commodore introduced its first computer, the "Commodore PET".
Compaq - In March 1983, Compaq released its first computer and the first 100% IBM compatible computer, the "Compaq Portable."
Dell - In 1985, Dell introduced its first computer, the "Turbo PC."
Hewlett Packard - In 1966, Hewlett Packard released its first general computer, the "HP-2115."
NEC - In 1958, NEC built its first computer, the "NEAC 1101."
Toshiba - In 1954, Toshiba introduced its first computer, the "TAC" digital computer.
The first hackers
The first "hackers" were students at the Massachusetts Institute of Technology (MIT) who belonged to the TMRC (Tech Model Railroad Club). Some of the members really built model trains. But many were more interested in the wires and circuits underneath the track platform. Spending hours at TMRC creating better circuitry was called "a mere hack." Those members who were interested in creating innovative, stylistic, and technically clever circuits called themselves (with pride) hackers.
During the spring of 1959, a new course was offered at MIT: a freshman programming class. Soon the hackers of the railroad club were spending days and nights hacking away at their computer, an IBM 704. Instead of creating a better circuit, their hack became creating faster, more efficient programs with the fewest possible lines of code. Eventually they formed a group and created the first set of hacker's rules, called the Hacker's Ethic.
Steven Levy, in his book Hackers, presented the rules:
Rule 1: Access to computers - and anything which might teach you something about the way the world works - should be unlimited and total.
Rule 2: All information should be free.
Rule 3: Mistrust authority - promote decentralization.
Rule 4: Hackers should be judged by their hacking, not bogus criteria such as degrees, race, or position.
Rule 5: You can create art and beauty on a computer.
Rule 6: Computers can change your life for the better.
These rules made programming at MIT's Artificial Intelligence Laboratory a challenging, all-encompassing endeavor. Just for the exhilaration of programming, students in the AI Lab would write a new program to perform even the smallest tasks. The program would be made available to others, who would try to perform the same task with fewer instructions. The act of making the computer work more elegantly was, to a bona fide hacker, awe-inspiring.
Hackers were given free rein on the computer by two AI Lab professors, "Uncle" John McCarthy and Marvin Minsky, who realized that hacking created new insights. Over the years, the AI Lab created many innovations: LIFE, a game about survival; LISP, a new kind of programming language; the first computer chess game; The CAVE, the first computer adventure; and SPACEWAR, the first video game.
IBM Names New Boss for Watson “Cognitive Computing” Service
IBM has announced that former Weather Co. CEO David Kenny will take over its "cognitive computing" service, Watson. The decision follows the financially floundering tech giant's move to acquire the websites, apps, and data platform of The Weather Co. for an undisclosed sum.
Watson, which analyzes and sources data trends through machine learning and natural-language skills, has become key to IBM CEO Virginia Rometty's drive to find new revenue streams. The New York-based corporation, commonly nicknamed "Big Blue," has pushed Watson especially keenly as a useful tool for industries including health care and retail.
According to analysts mentioned by Associated Press, IBM hopes to utilize The Weather Co.’s significant amounts of climate data and its Internet platform, which provides weather forecasts through free consumer apps and serves various businesses including airlines and insurance firms.
Though IBM reported its 15th consecutive quarter of declining revenue earlier this month, the board remains behind Rometty’s drive to turn around the company, having disclosed late on Thursday that she will be paid a $4.5m performance bonus on top of her $1.6m salary for 2015.
The history of video games goes as far back as the early 1950s, when academics began designing simple games and simulations as part of their computer science research. Video gaming would not reach mainstream popularity until the 1970s and 1980s, when arcade video games, gaming consoles and home computer games were introduced to the general public. Since then, video gaming has become a popular form of entertainment and a part of modern culture in most parts of the world.
As of 2016, there are eight generations of video game consoles, with the latest generation including Nintendo's Wii U and Nintendo 3DS, Microsoft's Xbox One, and Sony's PlayStation 4 and PlayStation Vita. PC gaming has held a large market share in Asia and Europe for decades and continues to grow thanks to digital distribution. Since the release of smartphones, mobile gaming has been a driving factor in bringing games to people who were not previously interested in gaming, as well as to those unable to afford dedicated hardware.
The term "video game" has evolved over the decades from a purely technical definition to a general concept defining a new class of interactive entertainment. Technically, for a product to be a video game, there must be a video signal transmitted to a cathode ray tube (CRT) that creates a rasterized image on a screen. This definition would preclude early computer games that output results to a printer or teletype rather than a display, any game rendered on a vector-scan monitor, any game played on a modern high-definition display, and most handheld game systems. From a technical standpoint, these would more properly be called "electronic games" or "computer games."
Today, however, the term "video game" has completely shed its purely technical definition and encompasses a wider range of technology. While still rather ill-defined, the term "video game" now generally encompasses any game played on hardware built with electronic logic circuits that incorporates an element of interactivity and outputs the results of the player's actions to a display. Going by this broader definition, the first video games appeared in the early 1950s and were tied largely to research projects at universities and large corporations.
The first electronic digital computers, Colossus and ENIAC, were constructed during World War II to aid the Allied war effort against the Axis powers. Shortly after the war, the promulgation of the first stored program architectures at the University of Pennsylvania (EDVAC), Cambridge University (EDSAC), the University of Manchester (Manchester Mark 1), and Princeton University (IAS machine) allowed computers to be easily reprogrammed to undertake a variety of tasks, which facilitated the commercialization of the computer in the early 1950s by companies like Remington Rand, Ferranti, and IBM. This in turn promoted the adoption of computers by universities, government organizations, and large corporations as the decade progressed. It was in this environment that the first video games were born.
The computer games of the 1950s can generally be divided into three categories: training and instructional programs, research programs in fields such as artificial intelligence, and demonstration programs intended to impress or entertain the public. Because these games were largely developed on unique hardware in a time when porting between systems was difficult and were often dismantled or discarded after serving their limited purposes, they did not generally influence further developments in the industry. For the same reason, it is impossible to be certain who developed the first computer game or who originally modeled many of the games or play mechanics introduced during the decade, as there are likely several games from this period that were never publicized and are therefore unknown today.
The earliest known written computer game was a chess simulation developed by Alan Turing and David Champernowne called Turochamp, which was completed in 1948 but never actually implemented on a computer. The earliest known computer games actually implemented were two custom-built machines called Bertie the Brain and Nimrod, which played tic-tac-toe and the game of Nim, respectively. Bertie the Brain, designed and built by Josef Kates at Rogers Majestic, was displayed at the Canadian National Exhibition in 1950, while Nimrod, conceived by John Bennett at Ferranti and built by Raymond Stuart-Williams, was displayed at the Festival of Britain and the Berlin Industrial Show in 1951. Neither game incorporated a CRT display. The first games known to incorporate a monitor were two research projects completed in 1952: a checkers program by Christopher Strachey on the Ferranti Mark 1 and a tic-tac-toe program called OXO by Alexander Douglas on the EDSAC. Both of these programs used a relatively static display to track the current state of the game board. The first known game incorporating graphics that updated in real time was a pool game programmed by William Brown and Ted Lewis specifically for a demonstration of the MIDSAC computer at the University of Michigan in 1954.
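One reason Nim was a natural choice for an early game machine like Nimrod is that its optimal strategy is completely mechanical. The winning rule is well known: XOR the pile sizes together (the "nim-sum"); the player to move loses with perfect play exactly when the nim-sum is zero. A short sketch of that strategy (an illustration, not Nimrod's actual circuitry or code):

```python
# Optimal play in Nim: compute the XOR ("nim-sum") of all pile sizes.
# If it is nonzero, some pile can be reduced so the new nim-sum is zero,
# leaving the opponent in a losing position.
from functools import reduce
from operator import xor

def winning_move(piles):
    """Return (pile_index, new_size) for an optimal move, or None if the
    position is already lost against perfect play."""
    nim_sum = reduce(xor, piles)
    if nim_sum == 0:
        return None  # any move hands the opponent a winning position
    for i, size in enumerate(piles):
        target = size ^ nim_sum
        if target < size:  # legal only if it shrinks the pile
            return i, target
    return None

print(winning_move([3, 4, 5]))  # nim-sum 3^4^5 = 2; reduce pile 0 to 1
```

Because the whole game reduces to this small amount of logic, Nim could be implemented directly in the relay and valve hardware of 1951.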
Tennis for Two – Modern recreation
Perhaps the first game created solely for entertainment rather than to demonstrate the power of a particular technology, train personnel, or aid in research was Tennis for Two, designed by William Higinbotham and built by Robert Dvorak at the Brookhaven National Laboratory in 1958. Designed to entertain the general public at Brookhaven's annual series of open houses, the game was deployed on an analog computer with graphics displayed on an oscilloscope and was dismantled in 1959. Higinbotham never considered adapting the successful game into a commercial product, which would have been impractical with the technology of the period. Ultimately, the widespread adoption of computers to play games would have to wait for the machines to spread from serious academics to their students on U.S. college campuses.
The mainframe computers of the 1950s were generally batch processing machines of limited speed and memory. This made them generally unsuited for games. Furthermore, these computers were expensive and relatively scarce commodities, so computer time was a precious resource that could not be wasted on frivolous pursuits like entertainment. At the Lincoln Laboratory at the Massachusetts Institute of Technology (MIT), however, a team led by Jay Forrester developed a computer called Whirlwind in the early 1950s that processed commands in real time and incorporated a faster and more reliable form of random access memory (RAM) based around magnetic cores. Based on this work, two employees at the lab named Ken Olsen and Wes Clark developed a prototype real time computer called the TX-0 that incorporated the recently invented transistor, which ultimately allowed the size and cost of computers to be significantly reduced. Olsen subsequently established the Digital Equipment Corporation (DEC) with Harlan Anderson in 1957 and developed a commercial update of the TX-0 called the PDP-1.
Lincoln Laboratory donated the TX-0 to MIT in 1958. As the computer operated in real time and therefore allowed for interactive programming, MIT allowed students to program the computer to conduct their own research, perhaps the first time that university students were allowed to directly access a computer for their own work. Furthermore, the university decided to allow students to set the computer to tasks outside the bounds of classwork or faculty research during periods of time no one was signed up to do official work. This resulted in a community of undergraduate students led by Bob Saunders, Peter Samson, and Alan Kotok, many of them affiliated with the Tech Model Railroad Club, conducting their own experiments on the computer. In 1961, MIT received one of the first PDP-1 computers, which incorporated a relatively sophisticated point-plotting monitor. MIT provided a similar level of access to the computer for students as it did for the TX-0, resulting in the creation of the first (relatively) widespread, and therefore influential, computer game, Spacewar!
Spacewar! is credited as the first widely available and influential computer game.
Conceived by Steve Russell, Martin Graetz, and Wayne Wiitanen in 1961 and programmed primarily by Russell, Saunders, Graetz, Samson, and Dan Edwards in the first half of 1962, Spacewar! was inspired by the science fiction stories of E. E. Smith and depicted a duel between two spaceships, each controlled by a player using a custom built control box. Immensely popular among students at MIT, Spacewar! spread to the West Coast later in the year when Russell took a job at the Stanford Artificial Intelligence Laboratory (SAIL), where it enjoyed similar success. The program subsequently migrated to other locations around the country through the efforts of both former MIT students and DEC itself, particularly after CRT terminals started becoming more common at the end of the 1960s. As computing resources continued to expand over the remainder of the decade through the adoption of time sharing and the development of simpler high-level programming languages like BASIC, an increasing number of college students began programming and sharing simple sports, puzzle, card, logic, and board games as the decade progressed. These creations remained trapped in computer labs for the remainder of the decade, however, because even though some adherents of Spacewar! had begun to sense the commercial possibilities of computer games, they could only run on hardware costing hundreds of thousands of dollars. As computers and their components continued to fall in price, however, the dream of a commercial video game finally became attainable at the beginning of the 1970s.
The History of Google
Everyone knows the name Google. Whether young or old, computer-savvy or not, this name will pop up in any conversation about computers. Google has achieved some very impressive milestones in its time and continues to grow rapidly every day. It all started when Larry Page and Sergey Brin met at Stanford. Larry, then 22 and a graduate of the University of Michigan, was there considering attending the school, and lo and behold, Sergey, who was 21, was there to show him around. Talk about a match made in heaven!
However, according to some accounts, they disagreed on just about everything during that first meeting. In 1996, now firm friends and both computer science grad students, the two began developing a search engine called BackRub. The search engine had operated on Stanford servers for a little over a year when it started taking up too much bandwidth to suit Stanford. So they decided to switch servers and renamed the search engine in 1997, calling it Google. The name is a play on "googol," the mathematical term for the number 1 followed by 100 zeros. The use of the term reflects their mission to organize a seemingly infinite amount of information on the web.
In August 1998, Sun co-founder Andy Bechtolsheim wrote them a $100,000 check made out to a company that didn't even exist yet. Their technology was impressive, but not impressive enough to win over the money men or the major internet portals (oh, how they must wish they had invested now!), so the pair had been struggling for financial support. Andy was one of the few to see the true potential of what they had created: during their presentation to him, he said he had to duck out for another meeting and offered to write them a check on the spot. It was at that moment that they realized what they had, and they went and incorporated under the name Google Inc.
In September, the pair moved into their workspace in Susan Wojcicki's garage at 232 Santa Margarita, Menlo Park, CA. They then filed for incorporation in California on September 4, 1998. Shortly after completing this important task, they opened a bank account in the name of Google Inc., their newly established company, and deposited the $100,000 check Andy Bechtolsheim had given them. Shortly after establishing the new business, they began hiring employees. The first was Craig Silverstein, a fellow Stanford grad student.
In December 1998, PC Magazine wrote: "The 25 million pages currently catalogued seem to be good choices. The site has an uncanny knack for returning extremely relevant results. There's much more to come from Google, but even in its prototype form it's a great search engine." The magazine went on to name Google one of its Top 100 websites for 1998. Even at the very beginning, the company received only the best reviews.
Google then went on to become one of the most successful internet companies ever. Early in 1999, it struck a deal with Sequoia Capital and Kleiner Perkins for $25 million. In November 1999, Charlie Ayers joined Google as the company's first chef. In April 2000, Google announced MentalPlex, an April Fools' joke that imagined software able to read your mind as you visualized the search results you wanted. In June 2000, Google partnered with Yahoo! to become its default search provider. Also in June, Google announced its first billion-URL index, making it the world's largest search engine. In September 2000, it began offering searches in Chinese, Japanese, and Korean, bringing its total number of supported languages to 15. In December 2000, the Google Toolbar was released.
Google has been going strong ever since, making it the largest and best-known search engine today, with multiple enhancements along the way. It will likely remain at the top of its game for years to come.