"Big data" becomes a hot topic and a comparative analysis of the data center
How big is a petabyte, really? And where does an exabyte fit in? Questions of scale aside, big data is still big business. Although there is plenty of hype around "big data," I have to credit EMC's Chuck Hollis for capturing the scale and value of the "information factory."

Statistics

Plenty of blogs, articles, vendor pitches, and record-breaking announcements revolve around the amount of data being stored and its incredible growth rate. Looking at examples of sensor data, social and mobile data, and the storage technologies advancing to keep pace, there is no doubt that "big data" is a hot topic... and an opportunity. Some big data examples (a short sketch at the end of this article puts these units on one scale):

1) IDC predicted in 2007 that by 2010 we would have 601 exabytes of data (an exabyte is one billion gigabytes). It turns out we have about 1,200 exabytes, and IDC predicts 35,000 exabytes by 2020.
2) CERN's Large Hadron Collider generates 15 petabytes of data per year, roughly the amount of flash memory Fusion-io shipped in 2010.
3) The Internet Archive stores about 650 TB per crawl of the web, 5.8 PB in total.
4) IBM Watson won at Jeopardy! with less than 1 TB of stored data.

Analysis

If big data is big business, then analytics and data warehousing are huge business opportunities. There is certainly no shortage of news about advances in analytics, or of acquisitions in this space. Recently, data warehouse leader Teradata announced that it has acquired Aster Data Systems.

VMware

VMware on Tuesday announced vCenter Operations, which applies real-time analytics to cloud environments and applications to help customers automate their cloud operations; virtual environments demand a new management model. vCenter Operations is designed to abstract the underlying physical components (servers, storage, network) and deliver timely, relevant information that can be grasped at a glance through dashboards. VMware also recently acquired WaveMaker, which focuses on letting users build Java cloud applications without writing code.

DCIM analytics

Data center infrastructure management (DCIM) tools are, in effect, the analytics of the data center as an information factory: they mine automation data and present it on visual dashboards to help operate the facility efficiently. Analytics software provider Netuitive announced on Tuesday that its enhanced virtual data center dashboard is expected to be available in the second quarter of 2011. Its patented behavioral learning engine, which uses predictive analytics, is focused on solving enterprise virtualization management problems. "We are very pleased to deliver the new dashboard and look forward to announcing the full feature set," said Netuitive CEO Nicola Sanna. "Predictive analytics continues to grow as global enterprises manage virtualization and cloud infrastructure."

Data center intelligence software provider CIRBA announced its Efficiency and Risk Dashboard, shipping in CIRBA version 6.1. The dashboard plots efficiency against risk, giving a visual representation of the supply-level status of hosts and virtual machines (guests) in the environment. CIRBA 6.1 takes inventory of all components in an environment and provides an intuitive visualization of the infrastructure, along with specific recommendations.

Data center opportunities

Of course, the data center industry has not been absent from big data. GigaOM's Structure Big Data conference will be held in New York in two weeks, with Equinix as a gold sponsor and Rackspace also sponsoring.
Cloud storage is also a focal point of big data, given how much activity there is in this area.
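For readers wondering how the units quoted above relate (the article opens by asking exactly that), here is a minimal Python sketch, not part of the original article, that puts the quoted figures on a single byte scale. It assumes decimal (SI) units, where 1 PB = 10^15 bytes and 1 EB = 10^18 bytes.

```python
# Minimal illustrative sketch (not from the original article): put the
# storage units quoted above on one scale. Assumes decimal (SI) units.

UNITS = {"TB": 10**12, "PB": 10**15, "EB": 10**18}  # bytes per unit

def to_bytes(value, unit):
    """Convert a value in TB/PB/EB to raw bytes."""
    return value * UNITS[unit]

# IDC figures quoted in the article: 601 EB predicted for 2010,
# ~1,200 EB actually observed, 35,000 EB forecast for 2020.
predicted_2010 = to_bytes(601, "EB")
actual_2010 = to_bytes(1_200, "EB")
forecast_2020 = to_bytes(35_000, "EB")

print(f"2010 actual vs. prediction: {actual_2010 / predicted_2010:.1f}x")
print(f"2020 forecast vs. 2010 actual: {forecast_2020 / actual_2010:.1f}x")

# The LHC's annual output (15 PB) expressed in the article's smaller unit:
print(f"15 PB = {to_bytes(15, 'PB') / UNITS['TB']:,.0f} TB")
```

Running it shows 2010's actual volume came in at about twice IDC's prediction, the 2020 forecast is roughly a 29x jump over 2010, and the LHC's 15 PB per year is 15,000 TB.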