
Dawn of the digital information era

From a business perspective, the mainframe era enabled companies to cut costs and improve efficiency by automating difficult and time-consuming processes.

Typically, the mainframe, based on proprietary technology developed by IBM or one of a handful of competitors, was housed in an air-conditioned room which became known as the "glasshouse" and was tended by white-coated technicians.

Data were input from "green screen" or "dumb" terminals hooked into the mainframe over a rudimentary network.

The mainframe provided a highly secure and usually reliable platform for corporate computing, but it had some serious drawbacks. In particular, its proprietary technology made it costly, and the need to write custom-built programs for each application limited flexibility.

The next computing wave was led by the minicomputer makers, which built scaled-down mainframe machines dubbed departmental minis or mid-range systems. These still used proprietary technology, but provided much wider departmental access to their resources via desktop terminals.

Among the manufacturers leading this wave of computing were Digital Equipment, with its VAX range of machines, and Wang, which developed a widely used proprietary word-processing system.

Key factors driving down the cost of computing power over this period were significant advances in the underlying technology and, in particular, in semiconductors.

In 1947, scientists at Bell Telephone Laboratories in the US had invented the "transfer resistance" device, or "transistor", which would eventually provide computers with a reliability unachievable with vacuum tubes.

By the end of the 1950s, integrated circuits had arrived - a development that would enable millions of transistors to be etched onto a single silicon chip and collapse the price of computing power dramatically.

In 1971, Intel produced the 4004, launching a family of "processors on a chip", leading to the development of the 8080 8-bit microprocessor three years later and opening the door for the emergence of the first mass-produced personal computer, the Altair 8800.

The development of the personal computer and personal productivity software - the third wave of computing - was led by Apple Computer and IBM in conjunction with Microsoft, which provided IBM with the operating system for the first IBM PC in 1981.

This year, an estimated 108m PCs will be sold worldwide, including a growing number of sub-$500 machines which are expanding the penetration of PCs into households that previously could not afford them.

Sometimes, however, software development has not kept pace. As Robert Cringely, the Silicon Valley technology guru, notes: "If the automobile had followed the same development as the computer, a Rolls-Royce would today cost $100, get a million miles per gallon and explode once a year, killing everyone inside."

Nevertheless, for businesses the arrival of desktop PCs built around relatively low-cost standard components put real computing power into the hands of end-users for the first time. This meant individual users could create, manipulate and control their own data and were freed from the constraints of dealing with a big IT department.

However, the limitations of desktop PCs as "islands of computing power" also quickly became apparent. In particular, people discovered they needed to hook their machines together with local area networks to share data and peripherals as well as exchange messages.

By the start of the 1990s, a new corporate computer architecture called client/server computing had emerged, built around desktop PCs and more powerful servers linked together by a local area network.

Over the past few years, however, there has been growing dissatisfaction, particularly among big corporate PC users, with the client/server model, mainly because of its complexity and the high cost of lifetime ownership.

As a result, there has been a pronounced swing back towards a centralised computing model in the past few years, accelerated by the growth of the internet.

The internet has its origins in the 1970s and in work undertaken by Vinton Cerf and others to design systems that would enable research and academic institutions working on military projects to co-operate.

This led to the development of the Ethernet standard and TCP/IP, the basic internet protocol. It also led Bob Metcalfe to promulgate "Metcalfe's Law", which states that the value of a network is proportional to the square of the number of nodes attached to it.
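The square comes from counting possible pairwise connections: each of n nodes can link to n - 1 others, giving roughly n²/2 distinct links. A minimal Python sketch of that counting argument, added here purely as an illustration (it is not part of the original article):

```python
def possible_links(n: int) -> int:
    """Distinct pairwise connections among n nodes: n * (n - 1) / 2."""
    return n * (n - 1) // 2

# Metcalfe's Law: the value of a network scales roughly with n squared.
for n in (10, 100, 1000):
    print(f"{n} nodes -> {possible_links(n)} possible links")
# 10 nodes -> 45, 100 nodes -> 4950, 1000 nodes -> 499500
```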

But arguably, it was not until the mid-1990s and the commercialisation of the internet that the true value of internetworking became apparent. The growth of the internet, and of the world wide web in particular, has since been astonishing.

With the help of tools like web browsers, the internet was transformed in just four years from an arcane system linking mostly academic institutions into a global transport system with 50m users. Today, that figure has swollen to about 160m, and estimates for the electronic commerce that it enables are pushed up almost weekly.

According to the latest Goldman Sachs internet report, the business-to-business e-commerce market alone will grow to $1,500bn in 2004, up from $114bn this year and virtually nothing two years ago.

Two inter-related technologies have been driving these changes: semiconductors and network communications.

For more than 25 years, semiconductor development has broadly followed the dictum of "Moore's Law", laid down by Gordon Moore, co-founder of Intel.

This states that the capacity of semiconductor chips will double every 18 months or, expressed another way, that the price of computing power will halve every 18 months.
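Taken at face value, that rate compounds dramatically: 25 years contains roughly 16 to 17 eighteen-month periods, so capacity grows by a factor of about 2^16.7, or around 100,000-fold. A short sketch of that arithmetic, assuming the fixed 18-month doubling period quoted above (an illustration, not part of the article):

```python
# A minimal sketch assuming a fixed 18-month doubling period, as quoted above.
years = 25
doubling_period_years = 1.5
doublings = years / doubling_period_years      # about 16.7 doublings in 25 years
capacity_growth = 2 ** doublings               # roughly a 100,000-fold increase
price_factor = 1 / capacity_growth             # price per unit of computing power
print(f"{doublings:.1f} doublings -> ~{capacity_growth:,.0f}x capacity; "
      f"unit price falls to {price_factor:.2e} of its starting level")
```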