The History of Computers in Business
In many ways, the history of computers in business goes hand in hand with the industrial revolution and the subsequent advances in science and technology. With almost every advance in computer technology, the business world changed fundamentally, transforming the dusty old ledgers of New York trading companies into modern networks that connect millions of businesses worldwide and move billions of dollars across fiber-optic cables.
The history of computers in business begins with the early tabulation machines of the 19th century. Pioneers such as Charles Babbage, often cited as the "father of computing," created the first known mechanical devices for computing numbers. Although Babbage designed two machines and built only one, capable of just the most basic calculations and crude memory storage, his invention set the precedent for thinking of mechanical devices as aids to the information-processing capacity of human beings. In that sense, he is the originator of the essence behind "information technology."
After Babbage came the first large-scale practical use of a tabulating machine, in the United States government's census of 1890. This machine could add and sort, which made it particularly useful for processing the information gathered by the Census Bureau: census data was recorded on punch cards and then tabulated to produce the final figures. Its inventor, Herman Hollerith, later founded the company that, through a series of mergers, became International Business Machines (IBM) in 1924.
It must be noted that during this early age of machine computation in business, the telegraph also increased the flow of information across borders in a way that made commerce flow more smoothly and efficiently. It is now argued that this period can be thought of as a kind of "first globalization," mirroring the spread of international commerce in our own time. The combination of the telegraph and the early mechanical tabulation machines of the late 19th and early 20th centuries can thus be thought of as analogous to the personal computer and the Internet in more contemporary times.
As the understanding of electricity deepened early in the 20th century, it became possible to replace many of the heavy mechanical parts of early computers with electromechanical and electronic components such as relays and vacuum tubes. Among the first machines of this era capable of more complex calculations were Konrad Zuse's binary computing machines: his Z1, constructed between 1936 and 1938, was the first computer to use the binary code system still in use today.
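As a brief illustration of the binary system Zuse's machine pioneered, every number is written using only the digits 0 and 1, with each position standing for a power of two. For example:

$$13_{10} = 1\cdot2^3 + 1\cdot2^2 + 0\cdot2^1 + 1\cdot2^0 = 1101_2$$

This positional scheme is what allows simple on/off components, whether relays, vacuum tubes, or transistors, to represent any number.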
The basic technology for the modern computer was further developed during World War Two. Examples include "Colossus," the world's first fully electronic programmable computer, developed by the British to crack German codes, and the Harvard Mark I, one of the first computers that could be fully programmed.
After the war, the creators of the ENIAC computer, which had been used to process calculations for the atom bomb, produced the UNIVAC. This was the first computer of the modern type to be used in business, and like its mechanical predecessors it was sold to the Census Bureau, which took delivery in 1951. Throughout the '50s, computers were increasingly used to process large amounts of data and began to come into use for high-technology design and manufacturing tasks requiring complex calculations.
The invention of the transistor in 1947 made it possible to build computers that processed information up to 1,000 times faster than their predecessors, without the enormous bulk once required. The development of the integrated circuit in 1958 increased processing capacity further and paved the way for the work of Gordon Moore, who postulated "Moore's Law." Because integrated circuits could be made ever more complex, Moore argued that the number of components on a chip would double roughly every year (a figure he later revised to every two years) while the cost stayed about the same. The fact that Moore has been proven right over the decades is perhaps the biggest reason why computer use in business has grown exponentially over the past forty years.
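To make the arithmetic of that claim concrete (taking the common doubling-every-two-years formulation as an illustrative assumption), the number of transistors on a chip grows as

$$N(t) = N_0 \cdot 2^{t/T},$$

where $N_0$ is the starting transistor count and $T$ is the doubling period. With $T = 2$ years, forty years of doubling gives a factor of $2^{40/2} = 2^{20} \approx 1{,}000{,}000$: a chip of a few thousand transistors in the early 1970s becomes one of several billion, which is precisely the exponential growth the text describes.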
With more complex transistor-based circuits, computers came down in price and size enough to be used by corporations all over the world, brought into service to manage inventories, payrolls, and files, and to produce a wide variety of reports. This allowed companies to increase efficiency and productivity, and it helped create some of the first jobs directly related to information technology.
When the microprocessor was invented in 1971, the power of computers increased while their size again diminished substantially. With processing now contained in a very small chip, personal computers could be developed and made available to the public; for the first time, computers themselves became a product that could be sold to the masses. Innovators such as Steve Jobs capitalized on this early with the creation of Apple Computer in 1976, IBM followed with its IBM PC in 1981, and Apple then introduced its "Macintosh" line in 1984.
By the late 1980s and early 1990s, innovation in computers was focused not only on advancements in hardware but also on software. This included the development of Microsoft Windows, which allowed for a much more user-friendly experience, making computers even more accessible to the general public and within the business environment. By this time, computers were routinely used for the creation of logos, product design, word processing, report compilation, and, of course, the ever more complex calculations of the high-tech industry.
With the advent and widespread use of the Internet in the late 1990s, business benefited from an explosion of efficiency gains, including the ability to coordinate design, manufacturing, distribution, and sales through computer systems and the networks connecting them. Real-time global trading also became possible in a way never before seen, changing how transactions were carried out and weakening the relevance of nation states in the process.
As higher bandwidths became available, teleconferencing and telecommuting became both affordable and effective, enabling outsourcing and other remote-work features of 21st-century business.
Currently, the business environment is benefiting from the miniaturization and increased portability of computers. From the first laptops of the 1990s to the netbooks, PDAs, and smartphones of today, it has become increasingly possible to work on the move, a development that appears to be blurring the line between people's work and home lives.