Microsoft plans to place data centers on the ocean floor

Welcome, readers of the iCover blog! According to The New York Times, Microsoft Research, the company's research division, which brings together more than 1,000 scientists, is studying the possibility of placing data centers on the sea and ocean floor. A three-month field trial of the first working prototype of an underwater data center, part of the company's new global project "Project Natick", has been completed successfully.





High-performance server hardware installed in data centers (DCs) handles a vast range of operations, from keeping social networks and email services running to streaming video and serving the rapidly growing needs of the Internet of Things, and it generates an enormous amount of heat. This, in turn, requires expensive, continuously running cooling systems. That is why IT-market leaders such as Facebook and Google place their data centers in Nordic countries with cold climates. "When you are using your smartphone, you believe that everything happens by the will of this wonderful little computer, but actually you are using more than 100 computers in what is called the cloud," said Peter Lee, corporate vice president of Microsoft Research. "Multiply that by billions of people, and you can only vaguely imagine the enormous scale of the computational work."

The idea of building a data center on the ocean floor was proposed by company employees in 2014 during research work at Microsoft's data centers. Interestingly, one of the team members, Sean James, once served on a US Navy submarine. According to the project's authors, the waters of the ocean depths are quite capable of cooling server hardware effectively and absolutely free of charge, while a system of wave or tidal turbines could generate enough electricity to power the equipment fully. To provide the required level of insulation, Microsoft proposes placing the servers in special watertight steel containers.



The Natick team (left to right): Eric Peterson, Spencer Fowers, Norm Whitaker, Ben Cutler, Jeff Kramer.

According to the publication, the project has passed its first stage of testing. A steel capsule 2.4 meters in diameter, submerged to a depth of 9 meters a kilometer off the coast of central California near San Luis Obispo, was used for a series of tests. Testing was controlled from an office on the company's campus.

To gather data for modeling conditions as close to real operation as possible, where "...to send a repairman to fix the problem at night is physically impossible," the container was equipped with hundreds of sensors measuring pressure, humidity, motion and other essential parameters. The main concern of the specialists involved in the experiment was what they considered a high probability of hardware failures and data loss. Fortunately, their fears did not materialize, and the system passed all the pilot tests. This allowed the team to extend the originally scheduled duration of the experiment to 105 days, and even to successfully run some commercial data-processing workloads from the Microsoft Azure cloud during the extended test.

Importantly, the experiments showed that the capsule warmed "...very little" under load, i.e., "...at a distance of a few inches from its outer walls, no change in water temperature could be detected," Dr. Lee said. At the end of the first series of experiments, the prototype underwater data center, named Leona Philpot after a character in the Halo video game series, was successfully raised from the ocean floor and brought back to the campus, slightly overgrown with small shellfish.



According to The New York Times, Microsoft has to date invested more than $15 billion in its existing global network of over 100 data centers, which provide more than 200 different online services. The company is planning to deploy a whole network of data centers, especially in those European countries with national programs for alternative energy.



The successful completion of the first phase of the experiment has allowed the research team to begin planning the next version of the capsule, three times larger. At this stage, the division plans to bring in a group of alternative-energy experts to participate in the design.

It is clearly premature to draw conclusions about the feasibility of this ambitious project, let alone the timing of the first facilities entering operation. But if it succeeds, the company will be able to significantly reduce not only operating costs but also the time required to launch a data center.

Another less obvious advantage is the possibility of placing data centers much closer to users who live near the coastline than is possible in current practice with land-based data centers.

In brief:

The Natick project promises to improve the quality of service for customers in regions adjacent to the coastline. According to Microsoft, "...half of the world's population lives within 200 km of the ocean," a distance at which a Natick data center could be installed. Placing data centers on the coastal shelf will dramatically reduce latency and improve the quality of service.
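To put the latency claim in perspective, here is a minimal back-of-the-envelope sketch (our own illustration, not from the article): the best-case round-trip propagation delay over optical fiber, which travels at roughly 200,000 km/s, for a nearby coastal capsule versus a distant inland data center.

```python
# Rough estimate of best-case round-trip propagation delay over fiber.
# Light in optical fiber travels at roughly 200,000 km/s (about 2/3 of c),
# i.e. about 200 km per millisecond. The distances below are illustrative.

FIBER_SPEED_KM_PER_MS = 200.0  # ~200,000 km/s expressed per millisecond

def propagation_rtt_ms(distance_km: float) -> float:
    """Best-case round-trip propagation delay over fiber, in milliseconds.

    Ignores routing detours, switching and queuing delays, which add more
    in practice; this is only the physical lower bound.
    """
    return 2 * distance_km / FIBER_SPEED_KM_PER_MS

# A coastal capsule ~200 km from the user vs. an inland DC ~2,000 km away:
print(propagation_rtt_ms(200))   # 2.0 ms
print(propagation_rtt_ms(2000))  # 20.0 ms
```

Even this idealized lower bound shows a tenfold difference in round-trip time, which is why proximity to coastal population centers matters for interactive services.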

One advantage of underwater data center technology is the rapid deployment of a complex (up to 90 days instead of 2 years). In addition, such systems can respond quickly to market needs and be rapidly deployed in the event of natural disasters or special events.

The company's specialists are studying the possibility of using highly reliable server equipment with a maintenance interval of once a decade. At the same time, the company's engineers plan for each data center to operate for a five-year period, after which its hardware configuration will be replaced with whatever is current at the time. The target lifetime of a data center is at least 20 years, after which it will be sent for recycling.

Despite the successful start and first results, the participants of the experiment have not named any specific timeline for putting underwater data centers into operation: "...Project Natick is currently at the research stage. It's still early days in evaluating whether this concept could be adopted by Microsoft and other cloud service providers." Further details can be found on the project page.
Sources:
natick.research.microsoft.com
The New York Times
businessinsider.com

Dear readers, we are always happy to see you on the pages of our blog. We will continue to share current news, reviews and other publications with you, and will do everything we can to make the time you spend with us worthwhile. And, of course, don't forget to subscribe to our feed.


Source: geektimes.ru/company/icover/blog/270330/