Internet bots




Read and discuss the information.

Internet bots, also known as web robots, WWW robots or simply bots, are software applications that run automated tasks over the internet. Typically, bots perform tasks that are both simple and structurally repetitive, at a much higher rate than would be possible for a human editor alone. The largest use of bots is in web spidering, in which an automated script fetches, analyses and files information from web servers at many times the speed of a human. Each server can have a file called robots.txt, containing rules for the spidering of that server that the bot is supposed to obey.
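The robots.txt convention mentioned above is straightforward to honour from code. Below is a minimal sketch, assuming Python's standard library; the domain and the user-agent string are placeholders rather than anything from the text, and a real crawler would also handle network errors.

```python
# Minimal sketch of a crawler checking robots.txt before fetching a page.
# The site and user-agent are illustrative placeholders.
from urllib.robotparser import RobotFileParser

robots = RobotFileParser()
robots.set_url("https://example.com/robots.txt")
robots.read()  # download and parse the server's crawling rules

url = "https://example.com/some/page.html"
if robots.can_fetch("ExampleSpider/1.0", url):
    print("robots.txt allows fetching", url)
else:
    print("robots.txt forbids fetching", url)
```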

In addition to the uses outlined above, bots may also be implemented where a response speed faster than that of humans is required (e.g. gaming bots and auction-site robots) or, less commonly, in situations where the emulation of human activity is required, for example chat bots.

Bots are also being used as organization and content-access applications for media delivery. Webot.com is one recent example of using bots to deliver personal media across the web from multiple sources. In this case the bots track content updates on host computers and deliver live streaming access to a logged-in user through the browser.

IM and IRC

Some bots communicate with other users of Internet-based services via instant messaging (IM), Internet Relay Chat (IRC) or another web interface. These chatterbots allow people to ask questions in plain English and then formulate a proper response. Such bots can often handle many tasks, including reporting the weather, looking up zip-code information or sports scores, converting currency or other units, and so on. Others are used for entertainment, such as SmarterChild on AOL Instant Messenger and MSN Messenger, and Jabberwacky on Yahoo! Messenger. Another popular AIM bot is FriendBot.

An additional role of IRC bots may be to lurk in the background of a conversation channel, commenting on certain phrases uttered by the participants (based on pattern matching). This is sometimes used as a help service for new users, or for censorship of profanity.
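To illustrate the pattern-matching idea, here is a minimal sketch in Python; the trigger phrases and canned replies are invented for the example and are not taken from any real IRC bot.

```python
import re

# Hypothetical phrase/response pairs; a real help bot would load these
# from a configuration file maintained by the channel operators.
PATTERNS = [
    (re.compile(r"how do i register", re.IGNORECASE),
     "To register a nickname, see the channel topic for instructions."),
    (re.compile(r"\b(badword1|badword2)\b", re.IGNORECASE),
     "Please keep the language in this channel clean."),
]

def react(message):
    """Return a canned reply if the message matches a known pattern, else None."""
    for pattern, reply in PATTERNS:
        if pattern.search(message):
            return reply
    return None  # no match: keep lurking silently

print(react("How do I register my nickname?"))
print(react("just a normal message"))  # prints None
```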

AOL Instant Messenger has now introduced a feature that allows a screen name to be turned into a bot. This feature removes the rate limit on the screen name; however, the number of instant messages that can be sent and received is now capped.

Commercial purposes

There has been a great deal of controversy about the use of bots for automated trading. The auction website eBay went to court in an attempt to stop a third-party company from using bots to traverse its site looking for bargains; this approach backfired on eBay and attracted the attention of further bots. The United Kingdom-based bet exchange Betfair saw so much traffic coming from bots that it launched a web-service API aimed at bot programmers, through which it can actively manage bot interactions.

Malicious purposes

Another, more malicious use of bots is the coordination and operation of an automated attack on networked computers, such as a denial-of-service attack (see botnet). Internet bots can also be used to commit click fraud and, more recently, have appeared in MMORPGs as computer game bots. A spambot is an internet bot that attempts to post large amounts of spam content on the Internet, usually adding advertising links.

There are malicious bots (and botnets) of the following types:

Spambots that harvest email addresses from contact forms or guestbook pages

Downloader programs that suck bandwidth by downloading entire web sites

Web site scrapers that grab the content of web sites and re-use it without permission on automatically generated doorway pages

Viruses and worms

Hackers

DDoS attacks

Botnets / zombie computers; etc.

Bots are also used to buy up good seats for concerts, particularly by ticket brokers who resell the tickets. Bots are employed against entertainment event-ticketing sites such as TicketMaster.com. Ticket brokers use them to obtain the best seats for themselves, unfairly depriving the general public of a chance at those seats. The bot runs through the purchase process and obtains better seats by pulling back as many seats as it can.

Bots are often used in massively multiplayer online role-playing games to farm resources that would otherwise take significant time or effort to obtain; this is a concern for most online in-game economies.

 

12. Read and discuss the article about a new communications technology.

 

PLUGGING IN AT LAST

After years of delay, the provision of internet access over power lines is taking off—though not for the reasons you might expect. This is "a banner day...a historically significant day for communications," declared Michael Powell in mid-October. The chairman of America's Federal Communications Commission (FCC) was enthusing about the prospects of a novel way for customers to receive ultra-fast broadband internet service in their homes: via the stodgy old power grid. The idea of using the power grid as a communications network—known as "broadband over power lines" (BPL) in America and "power-line communications" (PLC) in Europe—has been around for ages, but is at last being implemented. Mr Powell made his comments at a meeting where the FCC gave its formal blessing to BPL. Crucially, the agency ruled that utilities that follow certain rules (chiefly concerning radio interference) would be given broad freedom to operate as "unlicensed" entities, unencumbered by America's baffling telecoms rules.

Why bother with BPL? The FCC is keen because it will bring into the broadband market a third group of competitors, after telephone firms, which use supercharged phone lines (digital subscriber lines, or DSL) to deliver broadband, and cable-TV operators, which use their cables for the same purpose. Another competing technology should lower prices. BPL may also further the FCC's goal of universal broadband service. The use of power lines means almost everyone in the rich world should be able to receive broadband service through this approach. That is not true for cable, which does not have universal coverage. Utilities are already eyeing new markets in rural areas.

William Berkman of Current Communications, a leading BPL operator, argues that the technology offers several advantages for customers, too. Connection speed does not depend on the distance from a telephone exchange (as with DSL) or on the number of customers online at once (as with cable). BPL, unlike its rivals, offers uploads at the same speed as downloads. And, says Mr Berkman, it will ultimately offer far more capacity than today's cable networks. "This is a bet on bandwidth," he says. "You want video and TiVo over the internet? You'll get it."

Yet old hands remain sceptical, for BPL is hardly a new technology. Indeed, engineers tried to make it work for many years, but failed due to snags in the "last mile": in particular, the final "step-down" power transformer, at the point where the power from the grid enters a home or office, interfered with the flow of data.

But two new ways have been devised to solve this problem. One is to route around the step-down transformer using wireless technology. The transformer is often on a utility pole outside the customer's premises, so it need only be a short hop to a wireless receiver indoors. The other approach routes the data signal around the transformer and then feeds it back into the domestic electricity supply. A special modem plugged into an electrical outlet then deciphers the signal. This approach also allows domestic electrical wiring to double as a home network.

Despite these steps forward, several problems may yet derail BPL. One is security. Some worry that it may be too easy to pilfer your neighbour's e-mails by intercepting the signal from the wireless transmitter sitting on the pole between your houses. But early field trials—and there have been several dozen such trials, in America, Canada and Europe—suggest that this worry is misplaced. The most advanced trial is taking place in Cincinnati, Ohio, where some 15,000 customers now receive BPL on commercial terms.

Jim Rogers, the boss of Cinergy, the trail-blazing local utility, points to the second potential snag facing BPL: the stodginess of utilities. Traditionally, the sector has not been very innovative, but in this case caution might be justified. Mr Rogers notes that risky broadband investments would need to be made today out of unregulated profits—but with no guarantee that regulators would not claw back BPL profits tomorrow as "regulated" earnings.

Another concern is radio interference. Ham-radio operators have complained vociferously that BPL signals are "bleeding" from power lines and spoiling their hobby. The emergency services use some frequencies that may be affected too. So the FCC's October ruling establishes "excluded frequency bands" for aeronautical communications and public safety. Firms that manufacture BPL gear must "notch out" emissions within those bands, and win pre-certification from the FCC for their equipment. The regulators also created a database for tracking reports of alleged BPL interference. Even so, the ham-radio operators were not satisfied. They also claim that the technology will fail to take off anyway. Their trade association claims that other broadband technologies, such as Wi-Fi and new fibre-optic networks, will "leave BPL in the footnotes of technology along with the eight-track tape player."

Perhaps. But there is reason to believe that BPL might finally start to live up to its potential, thanks to an unexpected ace in the hole that DSL and cable do not have: the big blackouts that hit North America, London and Italy last summer. Those fiascos were the result not of a shortage of generating capacity, but rather of the failings of an antiquated grid. Today's electrical system is a creaking relic lacking the sophisticated command-and-control tools necessary to ensure reliability. As last summer's outage in America showed, local operators often do not know when or where power has gone out on their system.

BPL could change all of that. A beautiful side benefit of the technology is that data-enabling their power networks will let utilities monitor what is happening on their power grids in real time, down to local substations. The technology could also allow them to read power and water meters without entering customers' premises. Mr Rogers of Cinergy notes that it might also allow utilities to manage peak loads by, for example, turning down your residential air conditioner remotely while you are at the office, in return for a lower tariff.

"The reliability aspects are unbeliev­able!" gushed Nora Brownell in October as she toured a utility in Virginia that has implemented bpl. Her view matters, for she is a commissioner on the Federal Energy Regulatory Commission (ferc), the country's top regulator of power utili­ties. In a telling and unorthodox show of support for the new technology, the ferc's commissioners turned up at the fcc meeting at which bpl was given the green light. Pat Wood, the ferc's boss, even offered this forecast: "It's my hope that a year from now boards of directors and shareholders and cus­tomers are all asking utilities, 'Why aren't you in bpl?'" It finally looks as though bpl's day has come. The happy collision of Mr Powell's desire for broadband competition and Mr Wood's dream of grid reliability is spurring on bpl technology. Revealingly, European officials, who have in the past been cautious about plc technology, applauded the fcc's decision in October: similar pan-European rules may be in the offing. The result could be better internet access for customers-and, just possibly, a step towards the intelligent, self-heal­ing power grid of tomorrow.

 

13. Read and discuss the article. Give your opinion on today's contest between HD-DVD and Blu-ray.

BATTLE OF THE BLUE LASERS

Another year, another standards war in the consumer-electronics business. From the people who brought you the contest between VHS and Betamax in the 1970s comes a new saga in which two rival—and, inevitably, incompatible—formats struggle to establish themselves as the higher-capacity successor to the wildly popular DVD standard. Both new formats rely on blue lasers, which can discern finer details than the red lasers used in DVD players, to squeeze more data on to each disc. This capacity can be used in two ways: to boost quality, by providing a more detailed "high definition" picture, or to increase quantity, enabling more footage (at DVD quality) to fit on a single disc.

In one corner is the HD-DVD format, backed by Toshiba, NEC and Sanyo. The details are still sketchy—the specification will not be finalised until February—but HD-DVD will offer at least three times the storage capacity of DVD, while improved video-compression software will further boost capacity. The new format has the backing of the DVD Forum, which means it is the "official" successor to the DVD format. Proponents of HD-DVD claim the discs can be made cheaply using existing DVD production lines with very little modification. The first HD-DVD devices will go on sale next year.

In the other corner is Blu-ray, backed by a consortium that includes Sony, Matsushita, Hitachi and Philips. Blu-ray discs have around five times the capacity of DVDs, allowing each disc to store around two hours of high-definition video, or 13 hours of standard video. Sony has been selling Blu-ray recorders in Japan since 2003, and Matsushita and Sharp have both launched Blu-ray devices this year.

The battle between the two standards has heated up in recent months as the two camps fight to sign up hardware vendors and content producers, notably Hollywood studios, which have determined the outcome of previous standards wars. So how will the battle play out? Previous standards wars have been resolved in one of four different ways.

The first possibility is a compromise between the two rival formats, as happened in 1995 with the DVD standard. Originally, Sony and Philips proposed a technology called MMCD, while Toshiba and its allies pushed a rival standard called SD. After much wrangling, a standards war was averted when Hollywood demanded a single format. Sony compromised, and the result was the DVD, which is very similar to SD but borrowed some elements from MMCD. This time, however, such a compromise seems unlikely, says Shyam Nagrani, an analyst at iSuppli, a market-research firm. "Nobody wants to bend," he says, since neither side wants to give up the lucrative royalties it stands to make if its standard prevails. Instead, both sides are digging in for a long fight.

A second possible outcome is that the two standards will coexist, and dual-format players capable of handling both kinds of disc will render the standards war irrelevant—as happened in the tussle over recordable DVDs. The DVD Forum backed a recordable format called DVD-R, but several firms, including Sony and Philips, chose to back a rival format called DVD+R. Players capable of reading and writing both kinds of disc have, however, now largely rendered the disagreement moot. Vamsi Sistla, an analyst at ABI Research, believes the same could happen with HD-DVD and Blu-ray. Building a player that works with both kinds of disc will be difficult, he concedes, but consumer-electronics firms support multiple competing formats in other areas. Yet Mr Nagrani argues that in this case the two formats are so different that dual-format players would be far too expensive.

A third possibility is that the market will be stillborn: the lack of a common standard will deter consumers from upgrading, as happened with the two rival successors to the CD audio format, DVD-A (backed by the DVD Forum) and SACD (backed, inevitably, by Sony and Philips). Hybrid players can now play both kinds of disc, but neither format has taken off. Music and video are, however, very different. The popularity of the MP3 format suggests that most people are not too bothered about audio quality. In the case of video, however, there is a reason to upgrade. The new high-capacity discs will be able to hold an entire series of a sitcom, and there is bound to be demand for high-definition versions of popular movies such as "Lord of the Rings" and "Star Wars" that feature spectacular special effects. And as large televisions become more popular, says Mr Nagrani, people will demand higher image quality.

There can be only one

All of this suggests that the most likely outcome is a fight to the death, as happened with VHS and Betamax. Historically, Sony has often been in the losing camp, starting with Betamax. This time around, however, Sony has several factors in its favour. For one thing, Blu-ray is the more mature technology, and is already shipping. Second, Sony now owns two of the top ten Hollywood studios; its recent acquisition of MGM was widely assumed to be motivated in large part by a desire to bolster support for Blu-ray.

But the potential ace up Sony's sleeve is its next-generation games console, the PlayStation 3, which will use Blu-ray discs to store games and will double as a Blu-ray player. If Sony can launch the console next year, before HD-DVD devices become widely available, it may be able to establish critical mass for Blu-ray. Yet if the Hollywood studios come down firmly in favour of HD-DVD, that could relegate Blu-ray to obscurity. The next few months will be critical, and the battle could still go either way. "By late 2005 it will be very clear who the winner is going to be," says Mr Nagrani. Technology may have moved on since the time of the VHS and Betamax contest—but unfortunately for consumers, standards wars persist.

 

14. Read and discuss the article. What do you think is grid computing's biggest problem? Do you agree or disagree with the author's point of view? Use specific reasons and examples to support your answer.

 

GRID COMPUTING

When is a grid not a grid? It depends whom you ask. According to many in the computer industry, grid computing—which roughly means the harnessing of the collective processing power of many computers in different places—is here today, and is already widespread. Yet according to others, grid computing, while promising, is still years away from becoming a reality. Who is right?

The problem is that "grid" has been co-opted as a buzzword and applied to a number of entirely different things. The term "grid computing" was originally coined by Ian Foster of America's Argonne National Laboratory in the late 1990s. He meant to draw an analogy between the supply of computing power and the supply of electricity, which is delivered along a wire, when you need it and with no need to worry about where it came from.

In 2002, Dr Foster drew up his own three-part definition of grid computing. A grid, he proposed, should co-ordinate computing resources that are not centrally controlled, rely on open standards, and provide more reliability than stand-alone machines. Alas for Dr Foster, his checklist immediately raised hackles within the computer industry, since much existing "grid computing" software fails to meet these criteria. Linking many small computers together to create a more powerful machine, for example, is not new, and is usually called clustering. For marketing purposes, however, some firms like to call it grid instead.

Similarly, grid is often confused—sometimes deliberately, for marketing reasons—with equally nebulous terms such as utility computing, on-demand computing, autonomic computing and data-centre virtualisation. Behind all this terminology is the idea of continuously and automatically adjusting the configuration of a corporate data-centre to meet the demands made on it. But Andrew Chien, a grid pioneer at the University of California at San Diego, notes that, though useful, such approaches generally eschew the harder part of the grid vision, which requires automated sharing of computing resources between different organisations, not just within one firm.

A well-known example of the sharing of computing resources across the internet is SETI@home, in which over half a million people help to sift radio-telescope readings for evidence of extraterrestrial life using a glorified screen-saver running on their PCs. Other similar projects, such as IBM's new World Community Grid, conduct medical research. But David Anderson, the director of SETI@home, rejects the grid label, preferring the term "public resource computing". Others call it "internet computing" or "cycle scavenging". While it is grid-like in some respects, this approach is very task-specific and is centrally controlled—so it is not truly grid.

Some firms, such as United Devices, sell proprietary software for cycle scavenging within a single company. Idle PCs can, for example, run drug-design software in a pharmaceuticals company or evaluate a derivatives portfolio for a financial-services firm. Early adopters of this technology claim impressive benefits. Yet since all the resources are controlled by a single organisation, purists argue that this is at best an "intragrid", just as an intranet is a private, internal version of the internet.
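To make the cycle-scavenging idea concrete, here is a minimal single-machine sketch in Python of the underlying pattern: a coordinator keeps a queue of independent work units, and each idle worker pulls one, computes it and reports the result. The names (fake_compute, the unit contents) are invented for illustration; real products add scheduling, security and networking on top of this basic idea.

```python
import queue

# The coordinator splits a large job into small, independent work units.
work_queue = queue.Queue()
for unit_id in range(5):
    work_queue.put(unit_id)

def fake_compute(unit_id):
    # Stand-in for the real task, e.g. scoring one drug candidate.
    return unit_id * unit_id

# Each idle PC would run a loop like this, fetching work only when idle
# and sending results back to the coordinator.
results = {}
while not work_queue.empty():
    unit_id = work_queue.get()
    results[unit_id] = fake_compute(unit_id)

print(results)  # {0: 0, 1: 1, 2: 4, 3: 9, 4: 16}
```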

What of those deliberately decentralised systems, peer-to-peer file-sharing networks? Some of them, at least, operate using open standards, and they are certainly robust: repeated attempts to close them down have failed. But they do not count as grid computing either, since they are mostly storage and distribution systems, and do not perform general-purpose data processing.

Grid computing is not entirely fictional, however: scientists have been building grids on a national or even global scale for several years. A good example is the LHC Computing Grid, which links large clusters and storage systems in 87 computer centres around the world, for the benefit of particle physicists. Another example is TeraGrid, an American effort to link nine large supercomputing centres for scientific use. Even within the academic arena, though, convergence towards common standards is slow, partly because each grid project tends to reinvent the wheel. To tackle this problem, the European Union launched a major initiative called EGEE this year to provide a common grid infrastructure for scientists; America has a similar initiative.

The hope is that such projects will provide the first glimpse of "the grid": a single global computing grid that will do for data processing what the world wide web did for online publishing. Wolfgang Gentzsch, a former grid guru at Sun Microsystems who is now director of MCNC, North Carolina's statewide grid initiative, says the term "grid" really refers to this ultimate goal, towards which today's systems are merely stepping stones. But it would, he admits, be more accurate to refer to them as "grid-like" or using "grid technology".

Constructing a single, global grid will mean solving difficult security, privacy and billing problems. Scientists have a tradition of sharing their results and resources, but others do not. Yet the hurdles are not so much technological as political, economic and terminological. The dream of a single grid, akin to the web in its simplicity and pervasiveness, still seems a long way off—as does agreement about what "grid" really means.

 

15. Read and discuss the article. Despite the legal wrangles over music piracy, peer-to-peer technology has many uses and is here to stay. Do you agree or disagree with this point of view? Use specific reasons and examples to support your answer.

 

 

IN PRAISE OF P2P

 

Imagine an ideal global information-storage system. It would have to be huge, capable of delivering any one of millions of files, some of them of enormous size, to anywhere in the world within moments. It would have to be self-configuring and self-healing, rather than centrally controlled, to ensure there was no single point of failure. And it would have to be secure, capable of supporting millions of users, while resisting constant assault both from physical attacks on its infrastructure and from malicious software circulated within the network.

Such a system sounds highly desirable, particularly when compared with the internet, which has become a piece of critical economic infrastructure but is beset by constant security scares and can become clogged up if too many users try to do the same thing at once. Yet this ideal system already exists, in the form of peer-to-peer (P2P) file-sharing networks such as eDonkey and KaZaA.

The technology, which is used by millions of music lovers to download songs—usually infringing copyrights—is reviled by the entertainment industry. In America and Europe, music and film companies are using the courts and lobbying for new laws to outlaw P2P technology. In October, trade groups representing the entertainment industry went so far as to petition America's Supreme Court to consider whether makers of P2P software should face "secondary liability" for copyright infringement by their users. Officials at America's Department of Justice have even suggested that using P2P supports terrorism. The technology is also condemned as a distribution system for illegal pornography.

Yet rather than being demonised, there are good reasons why the technology should be celebrated—and its benefits more widely studied and exploited. Arguing that the internet's robustness and security could be improved using technology generally associated with music piracy might seem strange, admits Yochai Benkler of Yale Law School, who raised the idea in a recent paper, but the suggestion is a tribute to "how robust these systems are". P2P networks have, after all, withstood years of legal, technical and physical assault but still work.

The widespread equation of P2P with piracy has obscured the fact that the same technology is also being constructively applied in all sorts of fields, from content distribution and internet-routed calls to distributed storage. Peer-to-peer technology is emerging as a powerful new approach to building large-scale computer systems, regardless of the entertainment industry's legal efforts.

Technically, "peer-to-peer" refers to a computer's ability to communicate di­rectly with other computers running the same software, without having to go through intermediaries. While this might appear to describe the internet itself, the reality is slightly different. Although the internet was originally designed to be de­centralised, it has evolved into more of a hub-and-spoke system. Personal comput­ers at the edge of the network connect to powerful servers in the centre to do things such as send e-mails or retrieve web pages. What was once a network of equals, made up of machines that were both producers and consumers of con­tent, became something that "looked like television with packets," says Clay Shirky, a technology consultant.

Strength in numbers

Peer-to-peer connects computers directly—and once joined together, personal computers can do things they are unable to do alone. Most P2P systems let users pool resources, be it processing power, storage capacity or bandwidth. In the case of music file-sharing, users are, in effect, creating an enormous shared filing system from which they can all retrieve songs. Over half of all internet traffic is now generated by peer-to-peer applications, according to CacheLogic, a P2P network-services company in Britain. Figures from BigChampagne, an internet-research firm in Beverly Hills, California, suggest that at least 10% of the content on P2P networks is legal, and does not violate the entertainment industry's copyrights.

The most active P2P system, accounting for an estimated 35% of all internet traffic according to CacheLogic, is called BitTorrent. It is an open-source software project that is free to use and enables very large files to be stored and retrieved efficiently at essentially no cost. Though it is used for pirated music, it comes into its own when distributing really large files such as movies, games and large pieces of software such as the Linux operating system—things that would otherwise be very costly for companies or individuals to make available for download.

Part of BitTorrent's success stems from the way it creates incentives for users to give as well as to take.
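One mechanism behind BitTorrent's ability to assemble large files from many untrusted peers is worth sketching: a file is cut into fixed-size pieces and each piece has a known SHA-1 hash, so a downloader can verify every piece independently, whichever peer supplied it. The piece size and function names below are illustrative, not BitTorrent's actual implementation.

```python
import hashlib

PIECE_SIZE = 256 * 1024  # 256 KiB; real torrents choose various piece sizes

def piece_hashes(path):
    """Split a file into fixed-size pieces and return each piece's SHA-1 digest."""
    hashes = []
    with open(path, "rb") as f:
        while True:
            piece = f.read(PIECE_SIZE)
            if not piece:
                break
            hashes.append(hashlib.sha1(piece).hexdigest())
    return hashes

def piece_is_valid(piece_bytes, expected_hash):
    """Check a piece received from an untrusted peer against the published hash."""
    return hashlib.sha1(piece_bytes).hexdigest() == expected_hash
```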

 

16. Read and summarize the article.

AUTHENTIC HERO

 

LINUX, the free computer operating system developed by thousands of volunteers collaborating over the Internet, is still not taken very seriously in corporate circles. It is used for niche tasks, such as running web servers, but it is generally deemed to be too immature for the most demanding environments, such as heavy-duty database systems. Recent events, however, suggest that Linux—whose mascot is a cheerful penguin—may have outgrown the commune of its birth.

On January 4th Linus Torvalds, the Finnish programmer who co-ordinates the development of Linux, quietly released the latest version of the Linux kernel—the software that, as its name suggests, is at the core of the operating system. Many of the enhancements in this new kernel (version 2.4) make Linux more suitable for corporate use. In particular, they make it more "scalable"—in other words, as capable of working on very large computer systems as on small ones. Linux 2.4 can support more processors, more memory, and faster networking and disk access—all prerequisites for industrial-strength corporate use.

Just as the software itself has become more solid, so support for Linux within the computer industry has also been growing. IBM, which has embraced Linux across its product range, from PCs to mainframes, announced in December that it would spend $1 billion on Linux-related activities in 2001. And this week the Open Source Development Laboratory, an independent, not-for-profit research centre financed by such industry giants as IBM, Intel and Dell, opened its doors. It is intended to accelerate the adoption of Linux in business computing, and to allow developers to test their software on the largest systems. In other words, with the notable exceptions of Microsoft and Sun Microsystems, the industry is pushing Linux for use in corporate computing.

Linux is also proving a popular choice for powering Internet appliances, such as handheld computers and smart telephones. And, at the other end of the scale, it is emerging as a powerful force in the specialist field of supercomputing. By connecting hundreds of PCs running Linux in a "cluster", it is possible to construct an enormously powerful machine for a fraction of the cost of a conventional supercomputer. IBM recently started installing a 1,024-processor Linux supercomputer at Shell's research centre in the Netherlands, where the oil company plans to use it to analyse geophysical data and to help it find oil. And on January 16th America's National Centre for Supercomputing Applications said that it had agreed to buy two Linux supercomputers from IBM, one of which will be the fourth-fastest supercomputer in the world when it is switched on this summer.

There are some fears that the embrace of Linux by big computing companies could prove a mixed blessing. George Weiss of Gartner, a research firm, suggests that IBM, in particular, "looms like a shadow" over the future of Linux: its obvious enthusiasm, he says, might deter new firms from entering the market for Linux support and services. Any attempt by big computing companies to hijack Linux, declares Eric Raymond, an open-source guru, would be counter-productive, since it would alienate the very people from whom Linux draws its strength. Yet it is inevitable, as Linux becomes increasingly popular, that it will shed the revolutionary cachet which, for some of its supporters, is its greatest appeal.

 

17. Read and summarize the article.

REBEL CODE: LINUX AND THE OPEN-SOURCE REVOLUTION. By Glyn Moody.

When, during its antitrust trial in 1999, Microsoft had to name some competitors to prove that Windows was not a monopoly, it could point to just two. One was its old enemy, Apple, which had been briefly resurgent under Steve Jobs but these days was utterly dependent on Microsoft's willingness to carry on producing a version of its Office software that would run on the Mac operating system. The other was Linux, a free operating system that was the product not of a rival company, but of the work of thousands of anonymous hackers choreographed by a young Finnish student called Linus Torvalds.

In truth, Linux as an operating system for desktop PCs provides as little real competition to Windows as does old Apple. That may come, although it is far from certain that something made for geeks by geeks will ever win widespread acceptance among consumers. But in the vital market for the operating systems that run the millions of small-to-medium-sized server computers that offer web pages, handle e-mail and do countless other routine administrative tasks, Linux is already more ubiquitous than equivalent versions of Windows. Even more worryingly for Bill Gates, Linux, actively supported by powerful companies such as IBM, Dell and even Intel, is getting ready to move up the computing food chain and into the corporate data centre—which is precisely where Microsoft is determined its strategically vital and expensively developed Windows 2000 should prevail.

That something created by "hobbyists", as Mr Gates calls them, may be doing more to threaten Microsoft's hegemony than wealthy and aggressive rivals, such as Sun Microsystems and Oracle, is nothing short of revolutionary. This alone makes the story that "Rebel Code" tells important. In its way, Linux and other open-source products are as disruptive to the "traditional" software business as is the Internet. The Internet and open source reinforce each other, the former making possible new models of collaborative working, the latter supporting the Internet's preference for open, non-proprietary standards. It is no accident that open source is better suited to developing reliable utility-type software than sophisticated applications.

The virtue of "Rebel Code" is that it largely eschews hype and is clearly written, if at times in rather technical prose. Its weakness is that it conveys little of the unfolding drama and not nearly enough of the personalities and motivations of the extraordinary people who have helped to shape the open-source movement: its spiritual father, the quasi-communist coder Richard Stallman; the libertarian polemicist Eric Raymond, whose love affair with free software is matched only by his passion for guns; or the enigmatic Linus Torvalds himself, who has guided Linux for ten years and who has become an authentic hero within a community that instinctively distrusts such things.

 

 



