Saturday, December 15, 2018
Disruptive Technology
Abstract

The objective of this project is to explain the emergence of disruptive technology in the IT industry and how it has helped organizations grow in a cost-effective manner. One of the hottest topics in today's IT corridors is the uses and benefits of virtualization technologies. IT companies all over the world are implementing virtualization for a variety of business requirements, driven by prospects of improving server flexibility and decreasing operational costs. InfoTech Solutions, being a dominant IT solution provider, can be broadly benefited by implementing virtualization. This paper is intended to provide complete details of virtualization, its advantages, and strategies for SMEs to migrate.

Introduction

The 2009 IT buzzword is 'Virtualization'. Small, medium and large business organizations have seriously started to reorganize their e-business strategy around the successful disruptive technology of virtualization. Virtualization of business applications permits IT operations in organizations of all sizes to decrease costs, improve IT services and reduce risk. The most remarkable cost savings come from reducing hardware and the use of space and energy, while the accompanying productivity gains lead to further savings. In the small business sector, virtualization can be defined as a technology that permits application workloads to be maintained independent of host hardware. Several applications can share a single physical server. Workloads can be rotated from one host to another without any downtime. IT infrastructure can be managed as a pool of resources, rather than a collection of physical devices.
Disruptive Technology

Disruptive technology, or disruptive innovation, is an innovation that makes a product or service better by reducing the price or changing the market dramatically in a way incumbents do not expect. Christensen (2000) stated that ''disruptive technologies are typically simpler, cheaper, and more reliable and convenient than established technologies'' (p. 192). Before doing any research on disruptive technology it is useful and necessary to summarize Christensen's notion of it. Christensen was projected as a "guru" by the business press (Scherreik, 2000). His work has been broadly referred to by scholars and researchers working in varied disciplines and topics, such as new product development and strategies like marketing and targeting. In his book "The Innovator's Dilemma" (Christensen 1997), Christensen made some significant observations about the circumstances under which established companies or organizations lose markets to an entrant wielding what he referred to as disruptive technology. This theory became exceedingly influential in the management decision-making process (Vaishnav, 2008). Christensen's arguments, drawn from his academic writings (Christensen 1992; Christensen and Rosenbloom 1995; Christensen, Suarez et al.
1996) rather than his famous paperbacks (Christensen 1997; Christensen and Raynor 2003), explain that the entrant might gain more advantage than the incumbent, and that understanding this requires the assessment of three important forces: technological capability (Henderson and Clark 1990), organizational dynamics (Anderson and Tushman 1990), and value (Christensen and Rosenbloom 1995).

He argued further that a company's competitive strategy, and mainly its earlier choices of markets to serve, determines its perceptions of the economic value of new technology and shapes the rewards it expects to obtain through innovation. Christensen (1995) classifies new technology into two types: sustaining and disruptive. Sustaining technology depends on incremental improvements to an already established technology, while disruptive technology is new and replaces an established technology unexpectedly. Disruptive technologies may lack refinement and often have performance problems because they are fresh and may not yet have a proven practical application. It takes a lot of time and energy to create something new and innovative that will significantly influence the way things are done. Most organizations are concerned with maintaining and sustaining their products and technologies instead of creating something new and different that might better the situation. They make minor modifications to improve the current product. These changes bring a bit of new life to those products, increasing sales temporarily and keeping the technology around a bit longer. Disruptive technologies generally emerge from outside the mainstream. For example, the light bulb was not invented by the candle industry seeking to improve its results.
Normally, owners of established technology organizations tend to focus on incremental improvements to their existing products and try to block potential threats to their business (Techcom, 2004). Compared to sustaining products, disruptive technologies step off in various directions, coming up with ideas that would compete with products in the current markets and could potentially replace the mainstream products being used. So it is not considered merely as disruption, but as innovation: not only replacing, but reaching beyond what we have today, making things enhanced, quicker, and generally cooler. Whether disruptive or innovative, technologies are turning "future shock" into reality and slowly occupying the world. On one hand, the warning of disruption makes incumbents suspicious about losing the market, while emerging entrants are confident of inventing the next disruptive technology. Perhaps such hopes and worries produce more competition in the marketplace. It seems that every year there is a laundry list of products and technologies that are going to "change the world as we know it." One that seems to have the potential to achieve the title of a disruptive technology is something that has been around for a while now: virtualization. Gartner (2008) describes disruptive technology as "causing major change in the accepted way of doing things, including business models, processes, revenue streams, industry dynamics and consumer behaviors". Virtualization is one of the top ten disruptive technologies listed by Gartner (Gartner.com). This virtualization technology is not new to the world.
As computers came into more common use, though, it became obvious that simply time-sharing a single computer was not always ideal, because the system could be misused intentionally or unintentionally and brought to a halt entirely. To avoid this, the multi-system concept emerged. It provided a number of advantages in the organizational environment, such as privacy, security of data, performance and isolation. For example, an organization may need to keep certain activities running on different systems. A testing application run on a system may sometimes slow it down or crash it entirely, so it makes sense to run that application on a separate system where it won't affect the rest of the network. On the other hand, placing different applications on the same system may reduce its performance, as they contend for the same system resources: memory, network input/output, hard disk input/output and priority scheduling (Barham et al., 2003). The performance of the system and its applications will be greatly improved if the applications are placed on different systems, each with its own resources.

It is very difficult for most organizations to invest in multiple systems; at times it is hard to keep all the systems busy to their full potential, they are difficult to maintain, and their asset value keeps depreciating. So investing in multiple systems can become wasteful, even though having multiple systems obviously has its own advantages. Considering this cost and waste, IBM introduced the first virtual machine in the 1960s, making one system behave as if it were multiple. In the beginning, this fresh technology allowed individuals to run multiple applications at the same time, improving the productivity of both person and computer through multitasking.
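One capability that machine-level virtualization brought with it is checkpoint-and-restart for long-running work. The sketch below is a toy Python model with invented class and method names, not any hypervisor's API (a real VMM snapshots the entire machine state, not a single counter): a long-running job periodically saves its progress, and after a simulated crash it resumes from the last checkpoint instead of from zero.

```python
import pickle

class CheckpointedJob:
    """Toy model of a long-running job that checkpoints its progress."""

    def __init__(self, total_steps):
        self.total_steps = total_steps
        self.step = 0
        self._snapshot = None

    def checkpoint(self):
        # Persist the job state; a hypervisor would snapshot the whole VM.
        self._snapshot = pickle.dumps(self.step)

    def restore(self):
        # Resume from the last checkpoint rather than from step 0.
        if self._snapshot is not None:
            self.step = pickle.loads(self._snapshot)

    def run(self, steps, checkpoint_every=10):
        for _ in range(steps):
            if self.step >= self.total_steps:
                break
            self.step += 1
            if self.step % checkpoint_every == 0:
                self.checkpoint()

job = CheckpointedJob(total_steps=100)
job.run(steps=37)   # make some progress (checkpoints at 10, 20, 30)
job.step = 0        # simulate a crash that loses in-memory state
job.restore()       # recover from the last checkpoint
print(job.step)     # 30, not 0: only work after the last checkpoint is redone
```

Only the work done after the last checkpoint has to be repeated, which is the whole economic argument for checkpointing multi-day jobs.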
Along with this multitasking facility provided by virtualization, it was also a great money saver. The multitasking ability that allowed computers to do more than one task at a time became ever more valuable to companies, letting them fully leverage their investments (VMWare.com). Virtualization is a hyped and much-discussed topic recently due to its potential. Firstly, it has the capacity to use computer resources in a better way, maximizing the company's hardware investment: it is estimated that only 25% of total resources are utilized in an average data center. Through virtualization, a large number of older systems can be replaced by highly modern, reliable and scalable enterprise servers, reducing hardware and infrastructure cost significantly. And it is not just server consolidation: virtualization offers much more, such as the ability to suspend, resume, checkpoint, and migrate running instances (Chesbrough 1999a, 1999b). It is exceptionally useful in handling long-running jobs: if a long-running job is assigned to a virtual machine with checkpoints enabled and it stops or hangs, it can be restarted from where it stopped instead of from the beginning. The main difference of today's virtualization compared with the older mainframe age is that capacity can be allocated wherever the service chooses; such Distributed Virtual Machines open up a whole range of possibilities, like network monitoring, validating security policy and the distribution of content (Peterson et al., 2002). The way virtual technology breaks the single-operating-system boundary is what makes it significant enough to place it in the disruptive technology group. It allows users to run multiple applications in multiple operating systems on a single computer simultaneously (VMWare.
com, 2009). Basically, this new approach takes a single physical server and packages its hardware so that the available hardware resources can be used to create virtual mirrors of it. The replicas created can be used as software-based computers to run multiple applications at the same time. These software-based computers have the complete attributes of physical computers, such as RAM, CPU and a NIC interface. The only variance is that there is just one physical system instead of multiple, running different operating systems (VMWare.com, 2009), called guest machines.

Virtual Machine Monitor

Guest virtual machines are hosted by a component called a Virtual Machine Monitor, or VMM, which goes hand-in-hand with virtual machines. In practice, the VMM is referred to as the host and the hosted virtual machines are referred to as guests. The physical resources required by the guests are offered by the software layer of the VMM, or host. The VMM supplies the required virtual versions of the processor, system devices such as I/O devices, storage and memory, and so forth. It also provides separation between the virtual machines and their host, so that issues in one cannot affect another. As per research conducted recently by Springboard Research, spending related to virtualization software and services will reach 1.5 billion US dollars by the end of 2010. The research also adds that 50% of CIOs are interested in deploying virtualization to overcome issues like poor performance and systems' low capacity utilization, and to face the challenges of a growing IT infrastructure. TheInfoPro, a research company, states that more than 50% of new servers installed were based on virtualization, a number expected to grow to 80% by the end of 2012. Virtualization will be the highest-impact trend modifying infrastructure and operations through 2012. According to Gartner, Inc.
(2008), virtualization will renovate how IT is bought, planned, deployed and managed by companies. As a result, it is generating a fresh wave of competition among infrastructure vendors that will result in market negotiation and consolidation over the coming years. The market for PC virtualization is also growing rapidly: the installed base is expected to reach 660 million, compared with 5 million up to 2007.

Virtualization strategy for mid-sized businesses

Virtualization has turned out to be a significant IT strategy for small and mid-sized business (SME) organizations. It not only offers cost savings, but answers business continuity issues and allows IT managers to:

• Manage and reduce the downtime caused by planned hardware maintenance, resulting in higher system availability.
• Test, verify and execute disaster recovery plans.
• Secure the data, with non-destructive backup and restore processes.
• Balance stability and real-time workloads.

In these competitive, demanding times, SME organizations need to simplify their IT infrastructure and cut costs. However, with various storage, server and network requirements, and sometimes without sufficient physical space to house and maintain systems, a company's options can be restricted by both limited physical space and budget concerns. Virtualization can offer smooth solutions for these kinds of issues, and SMEs can benefit significantly not only from server consolidation, but also from affordable business continuity.

What is virtualization for mid-sized businesses?

In the small business sector, virtualization can be defined as a technology that permits application workloads to be maintained independent of host hardware. Several applications can share a single physical server. Workloads can be rotated from one host to another without any downtime.
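The definition above (workloads kept independent of host hardware and rotated between hosts without downtime) can be made concrete with a small sketch. This is plain Python with invented class and function names, not any real hypervisor's API; real live migration copies memory pages iteratively while the VM keeps running, which is elided here.

```python
class Host:
    """Toy physical server managed by a virtualization layer."""

    def __init__(self, name, ram_gb):
        self.name = name
        self.free_ram = ram_gb
        self.vms = {}

    def start_vm(self, vm_name, ram_gb):
        # Refuse a workload that would oversubscribe this host.
        if ram_gb > self.free_ram:
            raise RuntimeError(f"{self.name}: not enough RAM for {vm_name}")
        self.free_ram -= ram_gb
        self.vms[vm_name] = {"ram_gb": ram_gb, "status": "running"}

def migrate(vm_name, src, dst):
    """Move a running VM between hosts; in real live migration memory is
    copied while the VM runs, so the workload sees no downtime."""
    vm = src.vms.pop(vm_name)
    if vm["ram_gb"] > dst.free_ram:
        src.vms[vm_name] = vm          # roll back: destination is full
        raise RuntimeError("destination host cannot fit the VM")
    src.free_ram += vm["ram_gb"]
    dst.free_ram -= vm["ram_gb"]
    dst.vms[vm_name] = vm              # status stays "running" throughout

a, b = Host("a", ram_gb=32), Host("b", ram_gb=32)
a.start_vm("erp", ram_gb=8)
migrate("erp", src=a, dst=b)           # e.g. drain host "a" for maintenance
print(b.vms["erp"]["status"], a.free_ram)   # running 32
```

Because the workload is just state managed by the virtualization layer rather than something welded to one machine, draining a host for maintenance becomes a scheduling operation instead of an outage.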
IT infrastructure can be managed as a pool of resources, rather than a collection of physical devices. It is often assumed that virtualization is just for large enterprises; in fact it is not. It is a widely established technology that decreases hardware requirements, increases utilization of hardware resources, modernizes management and diminishes energy consumption.

Economics of virtualization for the midmarket

Research by VMWare.com (2009) shows that SMEs that invested in a virtualization strategy received their return on investment (ROI) in less than a year. In certain cases, this can be less than seven months with the latest Intel Xeon 5500 series processors. http://www-03.ibm.com/systems/resources/6412_Virtualization_Strategy_-_US_White_Paper_-_Apr_24-09.pdf [accessed on 04/09/09] The image below shows how virtualization simplified a large utility company's infrastructure of 1000 systems, with their racks and cables, into a dramatically simpler form. Source: http://www-03.ibm.com/systems/resources/6412_Virtualization_Strategy_-_US_White_Paper_-_Apr_24-09.pdf [accessed on 04/09/09]

Virtualization SME advantages

1. Virtualization and management suites present a flexible, low-cost development plan and an environment with high capability.
2. Virtualization provides the facility to move live virtual machines between physical hosts. This ability brings numerous advantages: business continuity, disaster recovery, workload balancing, and even energy savings, by permitting running applications to be exchanged between physical servers without disturbing the service.
3. Virtualization can help you take full advantage of the value of IT spending:
• Business agility in changing markets
• A flexible IT infrastructure that can scale with business growth
• High-level performance that can handle the most demanding applications
• An industry-standard platform architecture with intelligent management tools
• Servers with enterprise attributes, regardless of their size or form factor
4. Virtualization can help you deliver IT services:
• The provision to maintain workloads rapidly by setting an automatic maintenance process that can be configured to weeks, days or even minutes.
• Improved IT responsiveness to business needs.
• Deployment times that can be greatly reduced.
• Greatly decreased, even eliminated, unplanned downtime.
• Reduced costs in technical support, training and maintenance.

Conclusion:

This is the right time for small and mid-sized businesses like InfoTech Solutions to implement a virtualization strategy. Virtualization acts as a significant element of IT strategy for businesses of all sizes, with a wide range of benefits and advantages. It can help InfoTech Solutions construct an IT infrastructure with enterprise-class facilities and a fast return on investment. It is expected that more than 80% of organizations will implement virtualization by the end of 2012. So SME organizations like InfoTech Solutions should seriously reconsider their e-business strategy with virtualization in mind, or they may be left behind their competitors.

References

1. Adner, Ron (2002). When Are Technologies Disruptive? A Demand-Based View of the Emergence of Competition. Strategic Management Journal 23(8): 667-688.
2. Anderson, P. and M. L. Tushman (1990). "Technological Discontinuities and Dominant Designs: A Cyclical Model of Technological Change." Administrative Science Quarterly 35(4): 604-633.
3. Barham, P., B.
Dragovic, K. Fraser, S. Hand, T. Harris, A. Ho, R. Neugebauer, I. Pratt, and A. Warfield. Xen and the Art of Virtualization. In Proc. 19th SOSP, October 2003.
4. Chesbrough, Henry (1999a). Arrested Development: The Experience of European Hard-Disk-Drive Firms in Comparison with U.S. and Japanese Firms. Journal of Evolutionary Economics 9(3): 287-329.
5. Vaishnav, Chintan (2008). Does Technology Disruption Always Mean Industry Disruption? Massachusetts Institute of Technology.
6. Christensen, Clayton M. (2000). The Innovator's Dilemma: When New Technologies Cause Great Firms to Fail. Boston, MA: Harvard Business School Press.
7. Christensen, C. M. (1992). "Exploring the Limits of the Technology S-Curve: Architectural Technologies." Production and Operations Management 1(4).
8. Christensen, C. M. and R. S. Rosenbloom (1995). "Explaining the Attacker's Advantage: Technological Paradigms, Organizational Dynamics, and the Value Network." Research Policy 24(2): 233-257.
9. Christensen, C. M., F. F. Suarez, et al. (1996). Strategies for Survival in Fast-Changing Industries. Cambridge, MA: International Center for Research on the Management of Technology.
10. Christensen, C. M. (1992). "Exploring the Limits of the Technology S-Curve: Component Technologies." Production and Operations Management 1(4).
11. Christensen, C. M. (1997). The Innovator's Dilemma: When New Technologies Cause Great Firms to Fail. Boston, MA: Harvard Business School Press.
12. Christensen, C. M. and M. E. Raynor (2003). The Innovator's Solution: Creating and Sustaining Successful Growth. Boston, MA: Harvard Business School Press.
13. Cohan, Peter S. (2000). The Dilemma of the 'Innovator's Dilemma': Clayton Christensen's Management Theories Are Suddenly All the Rage, but Are They Ripe for Disruption? Industry Standard, January 10, 2000.
14. Gartner Says: http://www.gartner.com/it/page.jsp?id=638207 [accessed on 04/09/09]
15.
Henderson, R. M. and K. B. Clark (1990). "Architectural Innovation: The Reconfiguration of Existing Product Technologies and the Failure of Established Firms." Administrative Science Quarterly 35(1): 9-30.
16. MacMillan, Ian C. and McGrath, Rita Gunther (2000). Technology Strategy in Lumpy Market Landscapes. In: Wharton on Managing Emerging Technologies. G. S. Day, P. J. H. Schoemaker, and R. E. Gunther (eds.). New York: Wiley, 150-171.
17. Scherreik, Susan (2000). When a Guru Manages Money. Business Week, July 31, 2000.
18. Peterson, L., T. Anderson, D. Culler, and T. Roscoe. "A Blueprint for Introducing Disruptive Technology into the Internet." In Proceedings of HotNets-I, Princeton, NJ, October 2002.
19. "Virtualization Basics." VMWare.com. http://www.vmware.com/virtualization/ [accessed on 04/09/09]

Disruptive Technology

One of the most consistent patterns in business is the failure of leading companies to stay at the top of their industries when technologies or markets change. Goodyear and Firestone entered the radial-tire market quite late. Xerox let Canon create the small-copier market. Bucyrus-Erie allowed Caterpillar and Deere to take over the mechanical excavator market. Sears gave way to Wal-Mart. The pattern of failure has been especially striking in the computer industry. IBM dominated the mainframe market but missed by years the emergence of minicomputers, which were technologically much simpler than mainframes. Digital Equipment dominated the minicomputer market with innovations like its VAX architecture but missed the personal-computer market almost completely. Apple Computer led the world of personal computing and established the standard for user-friendly computing but lagged five years behind the leaders in bringing its portable computer to market.
Why is it that companies like these invest aggressively, and successfully, in the technologies necessary to retain their current customers but then fail to make certain other technological investments that customers of the future will demand? Undoubtedly, bureaucracy, arrogance, tired executive blood, poor planning, and short-term investment horizons have all played a role. But a more fundamental reason lies at the heart of the paradox: leading companies succumb to one of the most popular, and valuable, management dogmas. They stay close to their customers. Although most managers like to think they are in control, customers wield extraordinary power in directing a company's investments. Before managers decide to launch a technology, develop a product, build a plant, or establish new channels of distribution, they must look to their customers first: Do their customers want it? How big will the market be? Will the investment be profitable? The more astutely managers ask and answer these questions, the more completely their investments will be aligned with the needs of their customers. This is the way a well-managed company should operate. Right? But what happens when customers reject a new technology, product concept, or way of doing business because it does not address their needs as effectively as a company's current approach? The large photocopying centers that represented the core of Xerox's customer base at first had no use for small, slow tabletop copiers. The excavation contractors that had relied on Bucyrus-Erie's big-bucket steam- and diesel-powered cable shovels didn't want hydraulic excavators because initially they were small and weak. IBM's large commercial, government, and industrial customers saw no immediate use for minicomputers.
In each instance, companies listened to their customers, gave them the product performance they were looking for, and, in the end, were hurt by the very technologies their customers led them to ignore. We have seen this pattern repeatedly in an ongoing study of leading companies in a variety of industries that have confronted technological change. The research shows that most well-managed, established companies are consistently ahead of their industries in developing and commercializing new technologies, from incremental improvements to radically new approaches, as long as those technologies address the next-generation performance needs of their customers. However, these same companies are rarely in the forefront of commercializing new technologies that don't initially meet the needs of mainstream customers and appeal only to small or emerging markets. Using the rational, analytical investment processes that most well-managed companies have developed, it is nearly impossible to build a cogent case for diverting resources from known customer needs in established markets to markets and customers that seem insignificant or do not yet exist. After all, meeting the needs of established customers and fending off competitors takes all the resources a company has, and then some. In well-managed companies, the processes used to identify customers' needs, forecast technological trends, assess profitability, allocate resources across competing proposals for investment, and take new products to market are focused, for all the right reasons, on current customers and markets. These processes are designed to weed out proposed products and technologies that do not address customers' needs. In fact, the processes and incentives that companies use to keep focused on their main customers work so well that they blind those companies to important new technologies in emerging markets.
Many companies have learned the hard way the perils of ignoring new technologies that do not initially meet the needs of mainstream customers. For example, although personal computers did not meet the requirements of mainstream minicomputer users in the early 1980s, the computing power of the desktop machines improved at a much faster rate than minicomputer users' demands for computing power did. As a result, personal computers caught up with the computing needs of many of the customers of Wang, Prime, Nixdorf, Data General, and Digital Equipment. Today they are performance-competitive with minicomputers in many applications. For the minicomputer makers, keeping close to mainstream customers and ignoring what were initially low-performance desktop technologies used by seemingly insignificant customers in emerging markets was a rational decision, but one that proved disastrous. The technological changes that damage established companies are usually not radically new or difficult from a technological point of view. They do, however, have two important characteristics. First, they typically present a different package of performance attributes, ones that, at least at the outset, are not valued by existing customers. Second, the performance attributes that existing customers do value improve at such a rapid rate that the new technology can later invade those established markets. Only at this point will mainstream customers want the technology. Unfortunately for the established suppliers, by then it is often too late: the pioneers of the new technology dominate the market. It follows, then, that senior executives must first be able to spot the technologies that seem to fall into this category. Next, to commercialize and develop the new technologies, managers must protect them from the processes and incentives that are geared to serving established customers.
And the only way to protect them is to create organizations that are completely independent from the mainstream business.

No industry demonstrates the danger of staying too close to customers more dramatically than the hard-disk-drive industry. Between 1976 and 1992, disk-drive performance improved at a stunning rate: the physical size of a 100-megabyte (MB) system shrank from 5,400 to 8 cubic inches, and the cost per MB fell from $560 to $5. Technological change, of course, drove these breathtaking achievements. About half of the improvement came from a host of radical advances that were critical to continued improvements in disk-drive performance; the other half came from incremental advances. The pattern in the disk-drive industry has been repeated in many other industries: the leading, established companies have consistently led the industry in developing and adopting new technologies that their customers demanded, even when those technologies required completely different technological competencies and manufacturing capabilities from the ones the companies had. In spite of this aggressive technological posture, no single disk-drive manufacturer has been able to dominate the industry for more than a few years. A series of companies have entered the business and risen to prominence, only to be toppled by newcomers who pursued technologies that at first did not meet the needs of mainstream customers. As a result, not one of the independent disk-drive companies that existed in 1976 survives today. To explain the differences in the impact of certain kinds of technological innovations on a given industry, the concept of performance trajectories, the rate at which the performance of a product has improved and is expected to improve over time, can be helpful.
Almost every industry has a critical performance trajectory. In mechanical excavators, the critical trajectory is the annual improvement in cubic yards of earth moved per minute. In photocopiers, an important performance trajectory is improvement in the number of copies per minute. In disk drives, one crucial measure of performance is storage capacity, which has advanced 50% each year on average for a given size of drive. Different types of technological innovations affect performance trajectories in different ways. On the one hand, sustaining technologies tend to maintain a rate of improvement; that is, they give customers something more or better in the attributes they already value. For example, thin-film components in disk drives, which replaced conventional ferrite heads and oxide disks between 1982 and 1990, enabled information to be recorded more densely on disks. Engineers had been pushing the limits of the performance they could wring from ferrite heads and oxide disks, but the drives employing these technologies seemed to have reached the natural limits of an S-curve. At that point, new thin-film technologies emerged that restored, or sustained, the historical trajectory of performance improvement. On the other hand, disruptive technologies introduce a very different package of attributes from the one mainstream customers historically value, and they often perform far worse on one or two dimensions that are particularly important to those customers. As a rule, mainstream customers are unwilling to use a disruptive product in applications they know and understand. At first, then, disruptive technologies tend to be used and valued only in new markets or new applications; in fact, they generally make possible the emergence of new markets. For example, Sony's early transistor radios sacrificed sound fidelity but created a market for portable radios by offering a new and different package of attributes: small size, light weight, and portability.
In the history of the hard-disk-drive industry, the leaders stumbled at each point of disruptive technological change: when the diameter of disk drives shrank from the original 14 inches to 8 inches, then to 5.25 inches, and finally to 3.5 inches. Each of these new architectures initially offered the market substantially less storage capacity than the typical user in the established market required. For example, the 8-inch drive offered 20 MB when it was introduced, while the primary market for disk drives at that time, mainframes, required 200 MB on average. Not surprisingly, the leading computer manufacturers rejected the 8-inch architecture at first. As a result, their suppliers, whose mainstream products consisted of 14-inch drives with more than 200 MB of capacity, did not pursue the disruptive products aggressively. The pattern was repeated when the 5.25-inch and 3.5-inch drives emerged: established computer makers rejected the drives as inadequate, and, in turn, their disk-drive suppliers ignored them as well. But while they offered less storage capacity, the disruptive architectures created other important attributes: internal power supplies and small size (8-inch drives); still smaller size and low-cost stepper motors (5.25-inch drives); and ruggedness, light weight, and low power consumption (3.5-inch drives). From the late 1970s to the mid-1980s, the availability of the three drives made possible the development of new markets for minicomputers, desktop PCs, and portable computers, respectively. Although the smaller drives represented disruptive technological change, each was technologically straightforward. In fact, there were engineers at many leading companies who championed the new technologies and built working prototypes with bootlegged resources before management gave a formal go-ahead. Still, the leading companies could not move the products through their organizations and into the market in a timely way.
Each time a disruptive technology emerged, between one-half and two-thirds of the established manufacturers failed to introduce models employing the new architecture, in stark contrast to their timely launches of critical sustaining technologies.

Those companies that finally did launch new models typically lagged behind entrant companies by two years, an eon in an industry whose products' life cycles are often two years. Three waves of entrant companies led these revolutions; they first captured the new markets and then dethroned the leading companies in the mainstream markets. How could technologies that were initially inferior and useful only to new markets eventually threaten leading companies in established markets?

Once the disruptive architectures became established in their new markets, sustaining innovations raised each architecture's performance along steep trajectories, so steep that the performance available from each architecture soon satisfied the needs of customers in the established markets. For example, the 5.25-inch drive, whose initial 5 MB of capacity in 1980 was only a fraction of the capacity that the minicomputer market needed, became fully performance-competitive in the minicomputer market by 1986 and in the mainframe market by 1991. (See the graph "How Disk-Drive Performance Met Market Needs.")

A company's revenue and cost structures play a critical role in the way it evaluates proposed technological innovations. Generally, disruptive technologies look financially unattractive to established companies. The potential revenues from the obvious markets are small, and it is often difficult to project how big the markets for the technology will be over the long term.
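The catch-up dynamic described above comes down to two curves: the disruptive technology's performance improves faster than the established market's requirement grows, so the lines eventually cross. A minimal sketch of that crossing, where the starting capacities, the growth rates, and the demand figure are illustrative assumptions chosen only to mirror the 1980-to-1986 narrative, not data from the article:

```python
def first_catchup_year(entrant_mb, demand_mb, entrant_growth, demand_growth, start_year):
    """First year in which the entrant's capacity meets the market's requirement.

    Both capacity and requirement compound annually; the entrant's faster
    growth rate is what guarantees an eventual crossing.
    """
    year = start_year
    while entrant_mb < demand_mb:
        entrant_mb *= 1 + entrant_growth
        demand_mb *= 1 + demand_growth
        year += 1
    return year

# Hypothetical numbers: a 5 MB entrant in 1980 improving 90%/yr versus a
# 60 MB minicomputer requirement growing 20%/yr crosses over in 1986.
print(first_catchup_year(5, 60, 0.9, 0.2, 1980))
```

The exact rates matter less than the structure: as long as the entrant's trajectory is steeper, the only question is when, not whether, it becomes good enough for the established market.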
As a result, managers typically conclude that the technology cannot make a meaningful contribution to corporate growth and, therefore, that it is not worth the management effort required to develop it.

In addition, established companies have often installed higher cost structures to serve sustaining technologies than those required by disruptive technologies. As a result, managers typically see themselves as having two choices when deciding whether to pursue disruptive technologies. One is to go downmarket and accept the lower profit margins of the emerging markets that the disruptive technologies will initially serve. The other is to go upmarket with sustaining technologies and enter market segments whose profit margins are alluringly high. (For example, the margins of IBM's mainframes are still higher than those of its PCs.)

Any rational resource-allocation process in companies serving established markets will choose going upmarket rather than going down. Managers of companies that have championed disruptive technologies in emerging markets look at the world quite differently. Without the high cost structures of their established counterparts, these companies find the emerging markets appealing.

Once the companies have secured a foothold in the markets and improved the performance of their technologies, the established markets above them, served by high-cost suppliers, look appetizing. When they do attack, the entrant companies find the established players to be easy and unprepared opponents because the opponents have been looking upmarket themselves, discounting the threat from below. It is tempting to stop at this point and conclude that a valuable lesson has been learned: managers can avoid missing the next wave by paying careful attention to potentially disruptive technologies that do not meet current customers' needs.

But recognizing the pattern and figuring out how to break it are two different things.
Although entrants invaded established markets with new technologies three times in succession, none of the established leaders in the disk-drive industry seemed to learn from the experiences of those that fell before them. Management myopia or lack of foresight cannot explain these failures. The problem is that managers keep doing what has worked in the past: serving the rapidly growing needs of their current customers.

The processes that successful, well-managed companies have developed to allocate resources among proposed investments are incapable of funneling resources into programs that current customers explicitly don't want and whose profit margins seem unattractive. Managing the development of new technology is tightly linked to a company's investment processes. Most strategic proposals (to add capacity or to develop new products or processes) take shape at the lower levels of organizations, in engineering groups or project teams. Companies then use analytical planning and budgeting systems to select from among the candidates competing for funds.

Proposals to create new businesses in emerging markets are particularly challenging to assess because they depend on notoriously unreliable estimates of market size. Because managers are evaluated on their ability to place the right bets, it is not surprising that in well-managed companies, mid- and top-level managers back projects in which the market seems assured. By staying close to lead customers, as they have been trained to do, managers focus resources on fulfilling the requirements of those reliable customers that can be served profitably.

Risk is reduced, and careers are safeguarded, by giving known customers what they want. Seagate Technology's experience illustrates the consequences of relying on such resource-allocation processes to evaluate disruptive technologies.
By almost any measure, Seagate, based in Scotts Valley, California, was one of the most successful and aggressively managed companies in the history of the microelectronics industry: from its beginning in 1980, Seagate's revenues had grown to more than $700 million by 1986.

It had pioneered 5.25-inch hard-disk drives and was the main supplier of them to IBM and IBM-compatible personal-computer manufacturers. The company was the leading manufacturer of 5.25-inch drives at the time the disruptive 3.5-inch drives emerged in the mid-1980s. Engineers at Seagate were the second in the industry to develop working prototypes of 3.5-inch drives. By early 1985, they had made more than 80 such models with a low level of company funding. The engineers forwarded the new models to key marketing executives, and the trade press reported that Seagate was actively developing 3.5-inch drives. But Seagate's principal customers, IBM and other manufacturers of AT-class personal computers, showed no interest in the new drives.

They wanted to incorporate 40-MB and 60-MB drives in their next-generation models, and Seagate's early 3.5-inch prototypes packed only 10 MB. In response, Seagate's marketing executives lowered their sales forecasts for the new disk drives. Manufacturing and financial executives at the company pointed out another drawback to the 3.5-inch drives. According to their analysis, the new drives would never be competitive with the 5.25-inch architecture on a cost-per-megabyte basis, an important metric that Seagate's customers used to evaluate disk drives. Given Seagate's cost structure, margins on the higher-capacity 5.25-inch models therefore promised to be much higher than those on the smaller products.

Senior managers quite rationally decided that the 3.5-inch drive would not provide the sales volume and profit margins that Seagate needed from a new product.
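The cost-per-megabyte metric that drove Seagate's analysis is simple division, but it illustrates why the 3.5-inch drives looked so unattractive on the established market's yardstick. A minimal sketch, in which the prices are hypothetical and only the capacities (40-60 MB incumbents versus 10 MB prototypes) come from the text:

```python
def cost_per_mb(price_usd, capacity_mb):
    """Price per megabyte, the metric Seagate's customers used to compare drives."""
    return price_usd / capacity_mb

# Hypothetical price points for illustration only: even at a lower unit
# price, a small-capacity drive can look far worse per megabyte.
print(cost_per_mb(400, 40))  # established 40 MB 5.25-inch model -> $10/MB
print(cost_per_mb(300, 10))  # early 10 MB 3.5-inch prototype    -> $30/MB
```

Judged this way, the disruptive product loses, which is exactly the point: the metric embeds the established market's priorities, while the new drive's real advantages (size, weight, ruggedness) never appear in the calculation.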
A former Seagate marketing executive recalled, "We needed a new model that could become the next ST412 [a 5.25-inch drive generating more than $300 million in annual sales, which was nearing the end of its life cycle]. At the time, the entire market for 3.5-inch drives was less than $50 million. The 3.5-inch drive just didn't fit the bill, for sales or profits." The shelving of the 3.5-inch drive was not a signal that Seagate was complacent about innovation. Seagate subsequently introduced new models of 5.25-inch drives at an accelerated rate and, in so doing, introduced an impressive array of sustaining technological improvements, even though introducing them rendered a significant portion of its manufacturing capacity obsolete.