Category Archives: Cloud Computing

Suggestions On How To Manage Consumer Clouds While Protecting Your Privacy

1. Think Before You Upload

When you upload private information to a remote server, do not forget about the risks. Before uploading, think about whether you really want to store that particular information in the cloud. The cloud is great for storing email, photos, and entertainment, but I do not advise keeping private information there, such as birth certificates, tax returns, and other important documents.

2. Know Your Cloud Hosting Provider in India and Its Policies

Before you start using the service, read the terms of use and the privacy policy. Many services run on top of third-party hosting providers, so it also makes sense to learn about the policies of those underlying platforms. In its policies, the provider should answer the following questions:

  1. Who owns the data once it is uploaded to the server?
  2. What rights does the provider have to the data after it is uploaded?
  3. Under which jurisdiction's law is the contract governed?
  4. Is there a right to data portability? (How easy is it to transfer data from one service to another?)
  5. What happens if you decide to stop using the service? Does the provider retain any rights to the data, and for how long?
  6. Does the service deactivate your account and destroy the data after a certain period of inactivity?
  7. Can your relatives or designated individuals request access to the data, or ask to close your account, if you become incapacitated or can no longer access the account?
  8. Does the provider undergo independent third-party verification of its privacy and security compliance, to ensure that it adheres to its own policies?

3. Keep A Copy

If you choose to upload files to the cloud, always keep a spare copy, especially of family photos and home videos.
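
As a rough illustration, here is one way to keep a local spare copy before anything is pushed to a cloud folder; the paths and the `upload_to_cloud` callable are hypothetical placeholders, not any particular provider's API.

```python
import shutil
from pathlib import Path

def backup_then_upload(files, backup_dir, upload_to_cloud):
    """Copy each file into a local backup folder before handing it to an uploader."""
    backup_dir = Path(backup_dir)
    backup_dir.mkdir(parents=True, exist_ok=True)
    for f in map(Path, files):
        shutil.copy2(f, backup_dir / f.name)  # keep the spare copy first
        upload_to_cloud(f)                    # then send the file to the cloud service

# Example (hypothetical uploader): back up family photos before uploading them.
# backup_then_upload(Path("photos").glob("*.jpg"), "backup/photos", my_uploader)
```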

4. Keep It A Secret

It is reasonable to treat your accounts on these services the same way you treat your email account: user names and passwords must be kept secret and changed often. Do not share these details with anyone unless absolutely necessary.
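
Since strong, frequently changed passwords are the whole point here, a minimal sketch using Python's standard `secrets` module; the length and character set are arbitrary choices, not a mandated policy.

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Return a random password drawn from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # store the result in a password manager, not in plain text
```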


Cloud Computing and the Enterprise: What Slows Down Adoption

Enterprise cloud computing adoption continues to be sluggish, and the problem may lie with CIOs and a somewhat antiquated, legacy mindset about IT infrastructure management. There is talk of administrators losing control, lack of security, difficult data protection, uptime that is not guaranteed, and lock-in. But is it really so?

Although cloud computing is making remarkable strides and many organizations are actually beginning to look to the cloud with confidence, there are still cultural legacies among pre-cloud CIOs that slow down the adoption of cloud computing in the enterprise.

Because of these concerns, enterprise cloud adoption remains sluggish, yet these fears are often almost entirely unfounded.

To understand this, we can list at least six anxieties that CIOs complain about when it comes to enterprise cloud computing and the prospect of moving the entire corporate IT infrastructure to the cloud.

No to the enterprise cloud computing for fear of losing control

Who decides what hardware to buy in the cloud to host the IT infrastructure? What happens if one or more disks fail? Who intervenes physically? And what if there is a network problem?

These and many others are the questions CIOs ask themselves when they think about enterprise cloud computing. Behind these questions lies the fear of losing control of the IT infrastructure, of delegating responsibility for their own work to someone they do not even know. In reality, enterprise cloud computing does relieve CIOs of these kinds of responsibilities, but only with regard to the “dirty work”, leaving in their hands a form of control that is perhaps even greater, thanks to elastic resources, APIs, and flexible pricing.

No to the enterprise cloud for security-related fears

The loss of control is associated with another fear, this one linked to security: can other customers access our business data? Are the security patches up to date? Here too, it takes very little to dispel this anxiety by reasoning in terms of economies of scale. Security problems, in fact, are often due to a lack of time, expertise, and resources. A well-equipped cloud provider takes care of its customers' security every day with qualified staff, and with greater dedication than an overstretched corporate IT department can, where human resources are insufficient and budgets allocated to security are always tight.

No to the enterprise cloud computing for data protection

Hand in hand with security goes uncertainty about data protection. It has been pointed out that in many countries the privacy of data stored in third-party facilities is only weakly protected. This does not mean, however, that there are not many countries that guarantee a very high level of protection. The solution is therefore careful choice of provider, preferably relying on solutions hosted on the old continent.

No to the enterprise cloud computing for lack of performance and uptime

According to research, 43.5 per cent of IT managers worldwide are concerned about the loss of business, revenue, and reputation that a disruption of enterprise cloud services could cause. In fact, the mentality has to change: rather than assuming an infrastructure exists that guarantees the absence of failures, the enterprise cloud computing model should be organized so that it can cope with failures in an automated manner while still ensuring continuity of service. The elastic, immediately bootable resources of cloud computing are well suited to this approach, better than any other type of on-premise infrastructure.
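
To make "coping with failures in an automated manner" concrete, here is a minimal sketch of a supervision loop that probes each node over HTTP and asks a provisioning hook to replace the ones that stop answering; the `restart` callable stands in for whatever API your provider actually exposes.

```python
import time
import urllib.request

def is_healthy(url: str, timeout: float = 2.0) -> bool:
    """Simple HTTP health probe: a node is healthy if it answers with status 200."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False

def supervise(nodes: dict, restart, interval: float = 30.0) -> None:
    """Keep checking every node's health URL and replace the ones that fail."""
    while True:
        for name, url in nodes.items():
            if not is_healthy(url):
                restart(name)  # automated recovery instead of hoping failures never happen
        time.sleep(interval)
```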

No to the enterprise cloud computing for lock-in

Lock-in is another hesitation CIOs complain about, fearing they will find themselves tied to a single provider or to applications designed for one particular, non-transferable architecture. Here too, when choosing a provider it is important to look for APIs capable of interoperating with other infrastructures, support for recognized, standardized, and open-source platforms, and possibly a higher level of abstraction that allows moving between providers seamlessly. Once again, then, overcoming the difficulty is in the hands of those who make the decisions and correctly choose the provider best suited to implementing enterprise cloud computing.
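
One way to make that "higher level of abstraction" concrete is to code against a small interface of your own and write one adapter per provider, so switching backends never touches application code. A minimal sketch follows; the class names are illustrative, not any particular SDK.

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """Tiny provider-neutral interface that application code depends on."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(ObjectStore):
    """Stand-in backend; a real adapter would wrap one provider's SDK instead."""

    def __init__(self) -> None:
        self._blobs: dict = {}

    def put(self, key: str, data: bytes) -> None:
        self._blobs[key] = data

    def get(self, key: str) -> bytes:
        return self._blobs[key]

def save_report(store: ObjectStore, name: str, body: bytes) -> None:
    store.put(f"reports/{name}", body)  # application code never names a vendor
```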

Data traffic in the cloud will grow sixfold by 2016

Cisco has announced its second annual Cisco Global Cloud Index, which forecasts that cloud traffic will grow sixfold over the period evaluated. In the report, Cisco projects that worldwide data center traffic will grow fourfold and reach a total of 6.6 zettabytes annually by 2016.

The company also predicts that overall cloud traffic, the fastest-growing component of global data center traffic, will grow sixfold, a 44% compound annual growth rate (CAGR), from 683 exabytes of annual traffic in 2011 to 4.3 zettabytes in 2016.
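
A quick back-of-envelope check of those figures, assuming the 683-exabyte baseline is five years before 2016:

```python
start_eb, end_zb, years = 683, 4.3, 5      # 683 EB growing to 4.3 ZB over five years
end_eb = end_zb * 1000                     # 1 zettabyte = 1,000 exabytes
growth = end_eb / start_eb                 # ~6.3x, i.e. roughly "six times"
cagr = growth ** (1 / years) - 1           # ~0.44, i.e. about 44% per year
print(f"growth: {growth:.1f}x, CAGR: {cagr:.0%}")
```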

Note that the figure of 6.6 zettabytes is equivalent to:

  • 92 trillion hours of streamed music, or roughly a year and a half of continuous music streaming for the entire world population in 2016 (a rough check of this figure follows the list).
  • 16 trillion hours of business web conferencing, or about 12 hours of daily web conferencing for professionals worldwide in 2016.
  • 7 trillion hours of online high-definition (HD) video streaming, or about two and a half hours of daily HD video streaming for the world population in 2016.
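
A rough sanity check of the first equivalence, under an assumed streaming bitrate and world population (both figures are my assumptions, not Cisco's):

```python
ZB = 10 ** 21                                  # bytes in a zettabyte (decimal)
traffic_bytes = 6.6 * ZB
bitrate_bps = 160_000                          # assumed ~160 kbps music stream
bytes_per_hour = bitrate_bps / 8 * 3600        # ~72 MB of audio per hour
hours = traffic_bytes / bytes_per_hour         # ~9.2e13, i.e. ~92 trillion hours
years_per_person = hours / 7.3e9 / (24 * 365)  # assumed ~7.3 billion people in 2016
print(f"{hours:.1e} hours, {years_per_person:.1f} years of music per person")
```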

Most data center traffic is not generated by end users but by the data centers and cloud workloads themselves, carrying out virtual activities that are largely invisible to end users. Cisco expects that approximately 76 percent of data center traffic will remain within the data center itself, generated by storage, production, and development data. An additional 7 percent of data center traffic will flow between data centers, driven primarily by data replication and software/system updates. The remaining 17 percent of data center traffic will be fueled by end users accessing clouds for web browsing, email, and video streaming.

By region, the Cisco Global Cloud Index study predicts that the highest cloud traffic growth will be generated in the Middle East and Africa, while Asia Pacific will host the largest number of cloud workloads, followed by North America.

How Cloud Computing Is Transforming The Mega Datacenters

Lately, we have been reading a lot about the U.S. NSA (National Security Agency) accessing the servers of major Internet service providers. Much water will still flow under the bridge before the case is properly clarified and leaves the realm of speculation, but the subject has opened my eyes to a phenomenon these companies have established: the mega datacenters.

These huge mega datacenters are quite different from the traditional datacenters we know, even those of the largest banks and industries. A mega datacenter has at least 15,000 servers and costs at least $200 million to build.

They have a common characteristic: automation taken to the extreme. They can scale massively while employing very few people to manage them. This means a few thousand servers (or tens of thousands) can be added at an almost negligible increase in cost.

In my opinion, their operating model will spread to other datacenters and increase attention on the cloud computing model. The mega datacenters will not kill off traditional datacenters, at least within the foreseeable horizon, because they were not built for legacy-system workloads. They were designed for typical Internet workloads such as those of Facebook, Google, and others. Remember that cloud is essentially automation taken to its limit, with software managing the whole operation: detecting and recovering from faults, automatically handling the inclusion or exclusion of servers, automatically installing new versions of operating systems, and so on.

A new server is put into operation within a few hours, and the growth rate is fantastic: each delivery round may bring in a thousand servers. For example, Facebook had approximately 30,000 servers in 2009 and by late 2012 was already at around 150,000. Google, meanwhile, had about 250,000 servers in 2000, 350,000 in 2005, and reached 900,000 by 2010. Today it must have over one million servers! How many companies in the traditional world acquire servers in lots of a thousand at a time?

Automation (the cloud model) is essential to the operation of these mega datacenters. In traditional datacenters there is generally a ratio of one sysadmin for every 100 to 200 servers. At Facebook, one sysadmin covers roughly 20,000 servers. This means a sysadmin at Facebook does the job of 100 to 200 professionals in traditional datacenters. Since infrastructure management is automated, the main actor in this process is the software. Pinterest, for example, when it had 17 million unique visitors per month, employed just one sysadmin to manage its cloud. In mega datacenters, staff cost is not even among the top five cost items, unlike in traditional datacenters.

The services provided by these mega datacenters and their economies of scale allow the creation of new businesses that would be unviable under the traditional model of upfront investment. One example is Netflix, which serves more than 36 million video-streaming users on public cloud solutions and added over three million new users in the first quarter of this year. Other typical Internet companies, such as Instagram, Zynga, FourSquare, and Pinterest, could only grow the way they did by using the economies of scale provided by the mega datacenters.

Pinterest is an interesting case. It went from twenty terabytes of stored data to 350 in just seven months, using a cloud running on a mega datacenter. In the traditional model of purchasing physical servers, it would have been absolutely impractical to achieve this expansion in a timely manner.

The investments required to build them are off the charts. Some estimates suggest that the costs of building these datacenters are staggering. For example, Facebook's mega datacenter in the U.S. state of Oregon cost $210 million, and its two new ones in the state of North Carolina reach $450 million and $750 million. Apple's facility in Oregon cost $250 million. Google is not far behind: $300 million for its mega datacenter in Taiwan and $1.9 billion in the state of New York. Microsoft, in turn, spent $499 million on its mega datacenter in Virginia, and the NSA, the U.S. government's National Security Agency, is building a $2 billion mega datacenter in the state of Utah.

So why are these mega datacenters created? Basically, to support public cloud offerings and Internet services delivered directly to consumers (B2C). The capex investments of the mega cloud providers are immense. It is estimated, for example, that Google's capex reaches the level of $1 billion per quarter.

Interestingly, the mega datacenters are creating a new segment within the IT industry. In general, they do not buy servers from traditional suppliers, but instead rely on their own designs, with assembly carried out by Chinese companies such as Quanta Computer. Quanta, for example, supplies 80% of Facebook's servers and is a leading supplier to Google. This model leads to a paradigm shift. For the mega datacenters, it is much cheaper to swap a server out than to fix it, so instead of buying machines with a higher MTBF (Mean Time Between Failures), they prefer cheap, disposable machines. If there is a problem, the automation software simply takes the crashed server out of rotation and brings the service up on another one, without human intervention. On the other hand, this is one of the reasons the model cannot be applied to traditional datacenters: the software that runs on the mega datacenters, such as Google's public services, was designed to operate in seamless integration with the automation software. That is very different from a corporate environment, where an SAP system was not designed to work seamlessly with just any vendor's servers.
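
A toy sketch of the "swap, don't repair" idea: a crashed box is simply dropped from the serving pool and a cheap spare takes its place. None of this is Facebook's or Google's actual tooling; the names are invented for illustration.

```python
from collections import deque

def heal_pool(active: list, spares: deque, failed: set) -> list:
    """Drop crashed servers from the serving pool and promote spares in their place."""
    healthy = [s for s in active if s not in failed]  # take crashed boxes out of rotation
    while len(healthy) < len(active) and spares:
        healthy.append(spares.popleft())              # reconnect the service on another machine
    return healthy

pool = heal_pool(["web-01", "web-02", "web-03"], deque(["web-spare-01"]), failed={"web-02"})
print(pool)  # ['web-01', 'web-03', 'web-spare-01']
```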

The mega datacenters adopt the magic formula of the cloud: virtualization + standardization + automation. Standardization facilitates this kind of replacement, since every box is identical to every other: the same version of the same operating system and the same software stack runs on all machines.

Recently, Facebook opened up its black box and described five types of servers, one for each specific activity. Basically, it divides its servers into five types: those that handle Web, database, Hadoop, Haystack, and Feed loads. Each of these services demands a different configuration. For example, a server that handles photos and videos needs more memory than computing power.
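
As a sketch of how such role-specific settings might be expressed as data, here is a hypothetical mapping from server role to hardware profile; the resource numbers are invented for illustration and do not come from Facebook's published specifications.

```python
from dataclasses import dataclass

@dataclass
class ServerProfile:
    role: str
    cpu_cores: int
    ram_gb: int
    disk_tb: int

# Hypothetical profiles: each workload type emphasizes different resources.
PROFILES = {
    "web":      ServerProfile("web",      cpu_cores=32, ram_gb=64,  disk_tb=1),
    "database": ServerProfile("database", cpu_cores=32, ram_gb=256, disk_tb=8),
    "hadoop":   ServerProfile("hadoop",   cpu_cores=24, ram_gb=128, disk_tb=24),
    "haystack": ServerProfile("haystack", cpu_cores=8,  ram_gb=144, disk_tb=32),
    "feed":     ServerProfile("feed",     cpu_cores=32, ram_gb=256, disk_tb=2),
}

print(PROFILES["haystack"])  # photo/video servers lean on memory and storage, not CPU
```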

The mega datacenters bring the phenomenon of consumerization to the traditional data center model. They buy machines by the thousands and are redefining the Intel-based server industry. An Intel study shows that in 2008, 75% of its server processors went into machines from IBM, HP, and Dell. Today, that 75% is distributed across eight manufacturers, and interestingly, Google is already the fifth-largest “manufacturer” of servers on the market. We really are experiencing a disruption, and arguably the cloud computing model has much to do with these changes in the industry.

Importance of data management in the cloud

Cloud computing has been recognized as a platform that enables organizations to reduce high infrastructure costs by migrating their applications to cloud technology. Organizations are deploying cloud computing technology to save money and to increase business flexibility.

With the rapid growth of businesses, the amount of data being generated is also growing continuously. Businesses all over the world are gathering and storing more and more data every day. As data becomes more voluminous and the use of cloud-based services increases, it is very important for organizations to address the need for cloud data management.

Why is data management important?

For any cloud service provider, efficient management of the data in the cloud is very important for maintaining satisfactory service levels. A good cloud service promises to keep the data and applications it manages available at all times. For organizations adopting cloud computing, data management is extremely important. To avoid loss, the cloud system must provide data protection and resiliency. If loss does occur, the environment must be able to recover the data quickly in order to restore access to the cloud services.
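
As a small illustration of protection plus quick recovery, the sketch below writes every object to a primary and a replica and falls back to the replica when the primary loses data; both stores are plain dictionaries standing in for real storage services.

```python
class ReplicatedStore:
    """Keep two copies of every object so loss in one place is recoverable."""

    def __init__(self) -> None:
        self.primary: dict = {}
        self.replica: dict = {}

    def put(self, key: str, data: bytes) -> None:
        self.primary[key] = data
        self.replica[key] = data      # protection: the second copy is written up front

    def get(self, key: str) -> bytes:
        if key not in self.primary:   # resiliency: recover quickly from the replica
            self.primary[key] = self.replica[key]
        return self.primary[key]

store = ReplicatedStore()
store.put("invoice-42", b"important document")
del store.primary["invoice-42"]       # simulate data loss in the primary copy
print(store.get("invoice-42"))        # restored transparently from the replica
```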

Cloud computing allows organizations to store and efficiently manage data without maintaining costly in-house infrastructure. This immediately reduces the expense a company would otherwise bear to manage its own large volumes of data. Data management also includes data security: data shifted into the cloud is safeguarded from physical damage to local hardware. This helps ensure that the large amounts of data that financial companies deal with remain safe.

Cloud computing gives organizations a good opportunity to better manage and optimize their information processes through data management. Cloud computing offers various services, and one such service is cloud storage. Cloud storage is the most capable option for data storage, and many companies are opting for it; with the increasing volume of data and documents, storing data in the cloud seems to be the best option.

One of the great benefits of storing valuable data online is that companies only pay for the storage space they actually use. Another important advantage is ease of access: rather than being tied to a local storage unit, users can access their data and documents from any location at any time. For these reasons, more and more companies are turning to cloud data centers.
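
A trivial illustration of the pay-for-what-you-use point; the per-gigabyte price is an assumed figure, not any specific provider's rate.

```python
def monthly_storage_cost(used_gb: float, price_per_gb: float = 0.02) -> float:
    """Bill only the space actually used (price is an assumed $/GB per month)."""
    return used_gb * price_per_gb

print(monthly_storage_cost(500))  # 500 GB in use -> 10.0 dollars at the assumed rate
```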

Thus, we can say that proper data management in the cloud is important for the smooth functioning of the organization.