Transistors were invented in 1947 as an alternative to vacuum tubes for use as electronic switches. At first they were more expensive (transistor = $8 vs. vacuum tube = 75¢). However, by the mid 1950s they had decreased significantly in price and size, and began to see widespread use in radios, televisions - and computers.
The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Although the transistor still generated a great deal of heat and still required massive amounts of wiring, it was a vast improvement over the vacuum tube.
Storage
In 1956, IBM unveiled the world's first hard disk - the IBM 350 RAMAC (Random Access Method of Accounting and Control). The disk consisted of 50 platters, each 2 feet in diameter, and the entire unit could store a whopping 5MB. Only two read/write heads were used to access all the platters, making the average access time very slow (just under 1 second). In 1961, IBM introduced a hard disk with read/write heads for each platter (the IBM 1301), which significantly improved access time. In addition, it introduced a 2MB "removable" hard disk (the IBM 1311), which allowed data to be removed from the computer and stored off-line. While magnetic tape was still used for large volumes of data, the removable hard disk could be used for smaller "data sets". This marked the beginning of the end of the punched-card storage era (although punched cards would remain in limited use for another twenty-five years!).
Input / Output
Throughout the second generation, teletypewriters and punched cards remained the primary methods for input. In 1960, Digital Equipment Corp. (DEC) introduced the PDP-1, the first computer to include a video display and keyboard. The video display was known as a VDT (video display terminal), and functioned primarily as a video typewriter, useful only in situations where it was not necessary for input operations to be "logged".
Line printers remained the primary output device, and IBM continued to dominate the market. The IBM 1403 line printer (part of the 1401 computer system) could produce 20 pages per minute. It stood almost 5 feet tall and was louder than a power saw! IBM line printers would not be surpassed for print quality until the advent of laser printing technology in the 1970s.
Software
"Assembly language" became the major tool for software development. Assembly language used symbols and words to represent the numeric codes and addresses that were previously used in machine language. This greatly simplified the task of designing and testing programs. (For example, the assembly language statement LOAD R4, NUM may be somewhat cryptic, but it was a vast improvement over its machine language equivalent 3504006B !)

Second generation computers were still single-user machines that were "shared" among the users of a company. Each user had sole use of the machine during their session, and would arrive with program and data in hand - usually on punched cards. The cards would be fed into the card reader, and the computer would execute until the program completed (or crashed), while a lineup of impatient users waited for their turn. Computer time had to be booked in advance, and the computer was not always utilized efficiently.

Over time, a company's frequently used program routines were stored in magnetic tape "libraries", available for use by other programmers. Executing a program now became a multistage process. First, the user had to run a special program to convert their assembly language program into machine code (a process called "assembling"). Then, any additional routines that were required would need to be copied from the magnetic tape libraries (a process called "linking"), which might involve manually searching for, then mounting, the required tape. Finally, the merged (or "linked") code could be loaded and executed.

Programmers soon recognized the need to automate this process, and began to write special "monitor" programs that could perform the tasks of assembling, linking and loading without user intervention between steps. One of the earliest and most successful monitor programs was created in 1956 by employees at General Motors and North American Aviation, who were leasing an IBM 704 computer. However, because each installation programmed its machine differently, it was very difficult for computer manufacturers like IBM and Sperry-Rand to provide adequate support for their customers. IBM set out to solve this problem by creating a standardized set of monitor programs that could be used by any customer who bought an IBM computer. These programs would collectively be known as an "operating system".

High-level programming languages were also being developed at this time.
These languages used more English-like commands and phrases, and relied on a "compiler" to translate the commands into the machine language necessary for processing. FORTRAN (FORmula TRANslator) had been developed by IBM and would soon be essential for scientific applications. COBOL (COmmon Business Oriented Language) was developed by a committee of industry and government experts (with major contributions from Grace Hopper) as a suitable language for business applications.
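The "assembling" step described earlier - translating a symbolic statement like LOAD R4, NUM into a numeric machine word - can be sketched as a toy program. The opcode numbers, register codes, and symbol addresses below are invented purely so that the article's own example round-trips; they do not correspond to any real second-generation instruction set.

```python
# Toy assembler sketch: turn a symbolic statement into a hex machine word.
# All numeric values here are hypothetical, chosen only to reproduce the
# article's example (LOAD R4, NUM -> 3504006B).

OPCODES = {"LOAD": 0x35, "STORE": 0x36, "ADD": 0x10}   # mnemonic -> opcode
REGISTERS = {"R4": 0x04}                                # register name -> code
SYMBOLS = {"NUM": 0x006B}                               # symbol table: name -> address

def assemble(line):
    """Translate one assembly statement into an 8-digit hex machine word."""
    mnemonic, operands = line.split(None, 1)
    reg, sym = [part.strip() for part in operands.split(",")]
    # Pack opcode (high byte), register, and address into one 32-bit word.
    word = (OPCODES[mnemonic] << 24) | (REGISTERS[reg] << 16) | SYMBOLS[sym]
    return f"{word:08X}"

print(assemble("LOAD R4, NUM"))   # -> 3504006B
```

A real assembler of the era also had to build the symbol table itself by scanning the program for labels, which is exactly the kind of clerical work that made it so much easier than writing machine code by hand.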