Cloud Computing Driving Outsourced Data Centre Market Up in Australia, says …

~ CAGR of 13.2% forecast to 2020 – managed hosting services to outpace co-location ~

SYDNEY, Sept. 2, 2015 /PRNewswire/ – As a result of increased adoption of cloud computing, driven by the consumer segment’s increased consumption of videos, social networks, mobile data and gaming, and the corporate sector’s use of data-intensive applications, the Australian outsourced data centre market continues to grow strongly.


In 2014, data centre services revenue in Australia totalled A$826 million, a growth of 18.3% over 2013. Co-location services accounted for approximately 69% of the total data centre services market. According to Frost & Sullivan’s new report, Australian Data Centre Services Market 2015, Australia’s high-growth phase of outsourced data centre adoption will peak in 2015 and ease off in 2016 and 2017 as the rate of new data centre capacity entering the market slows down.

Data centre services revenue for 2015 is predicted to grow by 18.2%, but whilst managed hosting continues to see strong revenue growth, co-location revenue growth is beginning to ease as an increasing proportion of data centre clients migrate their co-location and managed hosting services to cloud services. Phil Harpur, Senior Research Manager, Australia & New Zealand ICT Practice, Frost & Sullivan, said that wholesale data centre providers, and those that focus only on co-location services, face significant pressure because of this trend. However, the growth of cloud services has been a key factor in developing new business opportunities for data centre specialist providers.

Frost & Sullivan predicts that the Australian data centre services market will grow at a CAGR of 13.7% from 2015 to 2020. Managed hosting will experience stronger growth than co-location over this period, as co-location demand eases due to companies migrating to cloud services.

Cloud providers, especially larger global providers such as AWS, Microsoft and IBM SoftLayer, are driving strong growth in the market and rapidly expanding their cloud capacity, whilst the government sector continues to increase its use of third-party hosted data centres. Demand is also growing for disaster recovery and business continuity services. Connected, multi-tenanted data centres are best placed to provide these services. Most third-party data centre providers in Australia have multiple data centres in multiple locations.

The average power density requirement of data centres is now up to 40kW to 50kW per rack and continues to increase in line with the growing demand for high-performance computing applications. As rack densities increase, the physical data centre space needed declines. This trend affects data centre providers offering co-location services at both a retail and wholesale level.

Harpur said, “As the Australian data centre services market expands, diversifies and matures, there are growing opportunities for niche providers specialising in specific verticals to enter the market. For example, Canberra Data Centres and Australian Data Centres focus on the government sector in Canberra. The Australian Liquidity Centre (ALC), which is owned by the Australian Stock Exchange, services organisations in the financial services segment.”

“To cater to the growing demand for data centre services, specialist providers, including local providers such as NEXTDC, Metronode and Canberra Data Centres, and global providers such as Equinix, Global Switch and Digital Realty, have added data centre capacity, either by expanding their existing data centre facilities or building new ones. A growing trend for large IT service providers and telcos that own their own data centres is to consolidate their data centre footprint by shutting down older, less efficient data centres and leasing data centre space within the larger and newer facilities of these data centre specialists, as it is more cost effective,” added Harpur.

Specialist data centre service providers are carrier-neutral, which encourages the development of business ecosystems within their data centres. This is attracting both local and global cloud providers. Cloud providers in turn drive greater diversity as they attract a range of other companies, such as IT service providers, creating a virtuous cycle within these data centres.

The adoption of modular data centres is still in an early growth phase; however, momentum is beginning to build in the market, and stronger adoption will occur as prices fall further. Modular data centres cater to niche segments of the market where companies or government departments require their own purpose-built facilities. They carry a higher relative cost, and most are deployed in outdoor and often remote locations, in industries such as healthcare, education, construction, mining, defence, manufacturing, oil and gas, and renewable energy.

“Another growing trend over the last two years is for commercial property owners to acquire existing data centres or build new data centres and then lease them to data centre specialist providers, IT service providers or individual companies. Examples include Asia Pacific Data Centres (APDC) and Keppel DC Real Estate Investment Trust, both of which have purchased facilities from major local data centre providers,” said Harpur.

Data centre providers face several challenges. Significant new capacity has entered the market over the last few years, causing lower-than-average occupancy rates and placing downward pressure on pricing, although additional capacity is generally absorbed quickly. Securing sites in CBD locations and gaining access to sufficient power is increasingly challenging, and it is becoming harder for data centre owners to plan for additional capacity.

Frost & Sullivan’s Australian Data Centre Services Market 2015 forms part of the Frost & Sullivan Australia and New Zealand Cloud, Data Centre and Infrastructure 2015 research program. All research services included in this subscription provide detailed market opportunities and industry trends evaluated following extensive interviews with market participants. If you are interested in more information on these studies, please send an e-mail with your contact details to Donna Jeremiah, Corporate Communications, at [email protected]

About Frost & Sullivan

Frost & Sullivan, the Growth Partnership Company, works in collaboration with clients to leverage visionary innovation that addresses the global challenges and related growth opportunities that will make or break today’s market participants. For more than 50 years, we have been developing growth strategies for the global 1000, emerging businesses, the public sector and the investment community. Is your organization prepared for the next profound wave of industry convergence, disruptive technologies, increasing competitive intensity, Mega Trends, breakthrough best practices, changing customer dynamics and emerging economies? Contact us to start the discussion.


Donna Jeremiah
Corporate Communications – Asia Pacific
P: +61 (02) 8247 8927
F: +61 (02) 9252 8066
E: [email protected]


Can cloud computing improve new-business submissions?

Application submissions are one of commercial underwriting’s biggest time sinks. Most quotes are a jumble of formats and paperwork that keep underwriters from getting to more important tasks like risk analysis.

Automating new-business submissions can help underwriters save valuable time. With intelligent scanning and rules-based engines, cloud-based technologies can convert quote data into standard formats. There’s no rerouting, re-reviewing and rekeying. Instead of a burden, new-business submissions can become a competitive advantage.

There’s no question that while the cloud has evolved rapidly, carriers maintain real concerns regarding the technology’s performance, data integrity and reliability. Here are four steps to evaluate the cloud’s potential for an organization’s new-business submissions:

1. Understand the company’s current new-business process and the associated time and expense. Map out the firm’s current underwriting processes. Maps are powerful visual tools for tracing intake forms’ paths through the underwriting process. More importantly, they can highlight an organization’s strengths and weaknesses. Does the process allow the opportunity to differentiate among the agents who submit new business? Can it identify the high performers? Does the organization quote all new business or is there a filter for weeding out risks that are beyond the parameters of the company’s risk appetite? Mapping provides an invaluable bird’s-eye view of the new-business process.

2. Determine value opportunities for the organization. It is critical for underwriters to be able to spot their best prospects. Can the company identify the new-business submissions that align with the risk appetite and geography it wants to underwrite?

Cloud-based processing’s automated rules can boost the hit ratio. Using extraction and business rules to identify and prioritize submissions, for example, an organization can route the best submissions more quickly to underwriters.
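The kind of rules-based triage described above can be sketched in a few lines. Everything here is invented for illustration: the field names (`state`, `tiv`, `agent_hit_ratio`), the appetite thresholds, and the scoring weights are placeholders, not any carrier’s actual rules.

```python
# Minimal sketch of a rules-based triage engine for new-business
# submissions. Field names and thresholds are illustrative only.

TARGET_STATES = {"TX", "OK", "NM"}   # hypothetical appetite: geography
MAX_TIV = 5_000_000                  # hypothetical appetite: total insured value

def score_submission(sub):
    """Return a priority score; higher means route to an underwriter sooner."""
    score = 0
    if sub["state"] in TARGET_STATES:
        score += 2                   # in-appetite geography
    if sub["tiv"] <= MAX_TIV:
        score += 2                   # within risk appetite
    if sub["agent_hit_ratio"] >= 0.30:
        score += 1                   # historically productive agent
    return score

def prioritize(submissions):
    """Drop out-of-appetite risks (score 0) and sort the rest, best first."""
    scored = [(score_submission(s), s) for s in submissions]
    return [s for sc, s in sorted(scored, key=lambda t: -t[0]) if sc > 0]

subs = [
    {"id": "A1", "state": "TX", "tiv": 2_000_000, "agent_hit_ratio": 0.40},
    {"id": "B2", "state": "NY", "tiv": 9_000_000, "agent_hit_ratio": 0.10},
    {"id": "C3", "state": "OK", "tiv": 8_000_000, "agent_hit_ratio": 0.35},
]
queue = prioritize(subs)
```

The design point is that the rules run before any underwriter touches the file: out-of-appetite risks never enter the queue, and the best-fit submissions surface first.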

Automated rules also save agents valuable time. Instead of requiring them to manually re-enter data from their agency management system into a different form, intelligent process automation lets agents simply upload the forms they already use, then extract data directly from them.

3. Identify the business benefits. Don’t be wowed by technology; make sure it works for the organization. Tally the benefits to be realized. For example, what financial and competitive advantages will the company gain by providing timely service to its best agents? What’s the projected yield ratio of business quoted versus underwritten? Calculate the reductions in general expenses the organization could reap from more efficient processes.

In addition to enabling more underwriting and less processing, cloud-based intake propels an organization toward data-driven results. Decision-making becomes more objective when based on the analytics that digitized processes can generate. Analytics speak for themselves, so they can help ease the traditional tensions among underwriting, agents and sales distribution.

4. Build a business case and run a pilot. Engage a partner who can assist the organization with both steps. Build a business case that identifies pain points and opportunities for the organization and also establishes benchmarks for future benefits. Next, run a proof of concept. Cloud-based services can launch and be evaluated within short time periods. The pilots provide real-world demonstrations of how automation can impact the submission process, integrate with downstream software, and fit into employees’ work processes.

Technology is constantly changing and insurers are faced with the pressure to operate faster and more efficiently. Fortunately, there are many options available to help them find the solutions that will improve their processes and provide the data they need.

Dan Pitcher, CPCU, is a property and casualty transformation leader with Cognizant, responsible for bringing new operations solutions to clients. He has almost 30 years of experience in the P&C industry. He can be reached at [email protected]

Ajoy Kumar Palanivelu is an experienced insurance business architect and venture solution leader for Cognizant’s proprietary OptimaWrite solution suite. He has successfully led several consulting and transformational programs in the insurance P&C space with a focus on commercial underwriting and claims transformation programs.


Amazon, Microsoft ink $108M cloud computing deal with FAA

Amazon and Microsoft are reeling in some serious dough to help the Federal Aviation Administration with its cloud computing needs.

IT consultant and systems integrator Computer Sciences Corporation just inked a $108 million contract with the FAA and will use both Amazon Web Services and Microsoft Azure — two Seattle-based cloud rivals — to help “consolidate FAA data centers and migrate FAA data and systems to a hybrid cloud environment.” The contract is initially valued at $108 million but could reach $1 billion over its 10-year term.

It’s a key government agency-related deal for both Microsoft and Amazon. Business Insider notes that IBM was left out of the partnership, which is notable in part because the company has worked with the FAA in the past. IBM was also ousted by Amazon in 2013, when the Seattle-based company won a $600 million contract with the CIA.


Cisco: “Second Wave” of Cloud Adoption is Here

A majority of IT professionals are looking once again to the cloud as a new way to drum up profit, but this time, this “Second Wave” of cloud adoption could take the form of business innovation as well as revenue increases.

For a new study from Cisco (CSCO) and IDC called “Don’t Get Left Behind: The Business Benefits of Achieving Greater Cloud Adoption,” IDC surveyed more than 3,600 enterprise executives about their current and planned cloud usage and found that 53 percent of companies expect cloud to drive increased revenue over the next two years.


Of those surveyed, 64 percent of respondents are either using or planning to utilize a hybrid cloud strategy, with 44 percent using or planning to use a private cloud for their business.

So why the sudden renewed interest in cloud? Cisco’s Senior Vice President of Global Cloud and Managed Services Sales Nick Earle said customers realized private and hybrid cloud offer several distinct advantages, including increased security, better performance and cost reductions that can help businesses get a leg up on the competition.

However, low cloud adoption among many companies may put a damper on the proposed profitability of cloud computing, as only one percent of companies surveyed currently have an optimized cloud strategy in place, according to IDC. Thirty-two percent of companies do not have any cloud strategy at all.

Companies with the most underdeveloped cloud programs in place have the most work ahead of them but also the greatest opportunity for growth and expansion, according to the study. IDC said companies with little to no cloud maturity stand to increase their revenue by more than 10 percent as well as reduce IT costs by up to 77 percent. Even companies with the highest level of cloud adoption stand to gain an average of $1.6 million in revenue per application deployed on private or public cloud, with $1.2 million in savings per cloud-based application.

Cisco said private cloud adoptees are most likely to benefit from better resource use, greater scale and faster response time in addition to a set of dedicated resources. Hybrid cloud adoptees are bound to have a slightly trickier path ahead of them, but benefit from the ability to keep some of their resources in-house.

Currently, companies in the United States and Latin America exhibit the highest level of mature cloud adoption worldwide, with companies in Japan lagging the farthest behind. By industry, manufacturing, IT, finance and healthcare show the highest levels of mature cloud adoption, while government/education and professional services rank among the lowest.

Cisco used the study to launch its new Business Cloud Advisor tool, a free service that allows Cisco partners to benchmark themselves against other companies and help them determine which cloud deployment option is best for their needs. The tool will work in tandem with Cisco’s previously announced Partner Cloud Professional Services offerings to help partners build professional services based on the company’s portfolio.


Tech Savvy: How to Talk Cloud at Cocktail Parties

You know the deal. Business mixers: a fine line between work and pleasure. A very fine line.

So there you are, helping yourself to a few blue cheese stuffed olives and trying to balance your light beer without dropping your phone, and you overhear this conversation: “My private cloud needs rapid elasticity so I’m opting for PaaS as it enables me to transform my own enterprise applications into SaaS applications. How about you?”

Say what? Unfortunately, the hors d’oeuvres table hasn’t created an invisible force field, and you see that the conversation is headed in your direction.

Are you ready? If you are, excellent! If not, this may be a good time to get yourself another beer. Cloud technology is seemingly everywhere, yet it can still be somewhat mysterious to the non-techie crowd: while the term “cloud” is ubiquitous, few can pin down what it really means. The reason is that there are many flavors and components of cloud computing.

According to the National Institute of Standards and Technology (NIST) Information Technology Laboratory, “Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.” In other words, anything computer related. You can find NIST’s definition of cloud computing in its Special Publication 800-145, “The NIST Definition of Cloud Computing.”

Even when distilled down to its basics, it’s not so basic. Let’s see…there are three sets of fundamentals to digest: the characteristics of the cloud model, the deployment models, and the service models.


The 5 Essential Characteristics of the Cloud Computing Model

You’ll have the vague term “cloud” thrown at you many times in the future, and understanding these five characteristics will help you judge whether the latest offering you’re being shown is really cloud computing.

1. On-Demand Self-Service

You can quickly and easily configure the computing resources you need all by yourself, without filling out forms or emailing the service provider. An important point is that what you’re using is service-based (“I need 15 computing units”), not resource-based (“I need an HP ProLiant DL380 G6 with 32GB of RAM”). Your computing needs are abstracted from what you’re really being allocated. You don’t know, and in most cases you shouldn’t care. This is one of the biggest hurdles for IT departments that want to create their own internal private-cloud computing environment.
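The service-based vs. resource-based distinction can be sketched in code: the customer asks for abstract “computing units,” and the provider privately maps them onto real hardware that the customer never sees. All names and capacities below are invented for illustration.

```python
# Illustrative sketch: the customer requests abstract "computing units";
# the provider privately maps them onto physical hosts. Host names and
# capacities are made up.

CATALOG = {                     # provider-internal: units per physical blade
    "rack-42-blade-07": 8,
    "rack-42-blade-08": 8,
    "rack-51-blade-01": 4,
}

def fulfil(units_requested):
    """Greedily cover the request; return only an opaque reservation.

    The caller never learns which blades were used; that mapping stays
    on the provider side, which is the point of the abstraction."""
    remaining = units_requested
    for host, capacity in CATALOG.items():
        if remaining <= 0:
            break
        remaining -= capacity   # this blade absorbs part of the request
    if remaining > 0:
        raise RuntimeError("pool exhausted")
    return {"units": units_requested}   # opaque to the customer

reservation = fulfil(15)
```

Notice that the return value says nothing about HP ProLiants or RAM: the abstraction boundary is exactly the one described above.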

2. Broad Network Access

You can access these resources from anywhere you can access the Internet, and you can access them from a browser, from a desktop with applications designed to work with them, or from a mobile device. One of the most popular application models (such as iPhone apps) is a mobile application that communicates with a cloud-based back end.

3. Resource Pooling

The cloud service provider, whether it’s Amazon or your own IT department, manages all of its cloud’s physical resources; creates a pool of virtual processor, storage, and network resources; and securely allocates them between all of its customers. This means your data might physically reside anywhere at any moment, although you can generally make certain broad choices for regulatory reasons (e.g., what country your data resides in). But you won’t know whether it’s in the San Antonio or Chicago data center, and certainly not what physical servers you’re using.
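A toy model of resource pooling, assuming invented region names and capacities: one shared pool, per-customer allocations, and an optional broad region pin for regulatory reasons, while the specific data centre remains hidden.

```python
# Sketch of multi-tenant resource pooling: one physical pool, per-customer
# virtual allocations, with an optional region constraint for compliance.
# Region names and capacities are illustrative.

class Pool:
    def __init__(self, capacity_by_region):
        self.free = dict(capacity_by_region)   # e.g. {"us": 100, "eu": 50}
        self.allocations = {}                  # customer -> (region, amount)

    def allocate(self, customer, amount, region=None):
        # The customer may pin a broad region (regulatory), but never
        # learns which data centre or server actually serves them.
        regions = [region] if region else list(self.free)
        for r in regions:
            if self.free.get(r, 0) >= amount:
                self.free[r] -= amount
                self.allocations[customer] = (r, amount)
                return True
        return False                           # pool cannot satisfy request

pool = Pool({"us": 100, "eu": 50})
ok_eu = pool.allocate("acme", 40, region="eu")   # compliance-pinned to EU
ok_any = pool.allocate("globex", 80)             # provider picks the region
```

The customer-facing contract is just `allocate(amount)`; everything about physical placement is the provider’s private bookkeeping.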

4. Rapid Elasticity

You can grow and shrink your capacity (processing power, storage, network) very quickly, in minutes or hours. Self-service and resource pooling are what make rapid elasticity possible. Triggered by a customer request, the service provider can automatically allocate additional resources from the available pool, or return them to it.
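Elasticity decisions often reduce to simple arithmetic: scale the fleet so that load per instance returns to a utilisation target, bounded by what the pool can supply. The 70% target and the pool bound below are arbitrary assumptions for the sketch.

```python
# Toy autoscaler illustrating rapid elasticity: desired capacity tracks a
# utilisation target, clamped by the provider's free pool when growing.

TARGET_UTILISATION = 0.70   # assumed target load per instance

def desired_instances(current_instances, avg_utilisation, pool_available):
    """Scale so that average load per instance returns to the target."""
    needed = max(1, round(current_instances * avg_utilisation / TARGET_UTILISATION))
    # Growth is bounded by how much free capacity the pool actually has.
    if needed > current_instances:
        needed = min(needed, current_instances + pool_available)
    return needed
```

Running hot scales out, running cold scales in, and an empty pool caps the burst, which is exactly the "allocate from the available pool" behaviour described above.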


5. Measured Service

Also described as subscription-based, measured service means that the resources you’re using are metered and reported back to you. You pay for only the resources you need, so you don’t waste processing power like you do when you have to buy it on a server-by-server basis.
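Metering turns usage samples into a bill, which is the whole of the pay-for-what-you-use model. The rates below are invented for illustration; real providers publish their own price sheets.

```python
# Sketch of measured service: metered usage becomes an itemized bill,
# so you pay only for what you consumed. Rates are illustrative.

RATES = {"cpu_hours": 0.05, "gb_stored": 0.02, "gb_transferred": 0.09}

def bill(usage):
    """usage: dict of metric -> measured quantity for the billing period."""
    lines = {m: round(q * RATES[m], 2) for m, q in usage.items()}
    return lines, round(sum(lines.values()), 2)

lines, total = bill({"cpu_hours": 100, "gb_stored": 500, "gb_transferred": 20})
```

Contrast this with buying servers outright: idle capacity on an owned box still costs money, whereas a metered line item for zero usage is zero.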

Cloud Computing Service Models

A service model describes how the capability is provided to the customer. The easiest way to understand cloud service models is with a layered approach, very similar to the OSI networking model, with the infrastructure at the bottom and the upper layers the user sees at the top.

Infrastructure as a Service (IaaS)

This is the most straightforward of the service models.

  • IaaS is the virtual delivery of computing resources in the form of hardware, networking, and storage.
  • May also include the delivery of operating systems and virtualization technology to manage the resources.
  • Rather than buying and installing the required physical resources in your office, you rent them as needed.

Platform as a Service (PaaS)

A set of software and product development tools hosted on the provider’s infrastructure.

  • Developers create applications on the provider’s platform over the Internet.
  • PaaS providers may use APIs, website portals or gateway software installed on the customer’s computer.
  • (an outgrowth of and Google’s App Engine are examples of PaaS.

Software as a Service (SaaS)

The topmost layer of the service model. SaaS is when you use the provider’s applications running on a cloud infrastructure. The vendor supplies the hardware infrastructure, the software product and interacts with the user through a front-end portal. SaaS applications hide the entire IT infrastructure running in the cloud and present only the application to the user.

Typically, these applications can be accessed only through a web browser, although some SaaS applications require installing components on a user’s desktop or in the user’s IT infrastructure for full functionality. This is by far the most popular and best-known service model, with thousands of examples, from Gmail to hosted Exchange Server to Salesforce to Facebook to Twitter. Microsoft is using its Azure PaaS offering as a platform to transform its own enterprise applications into SaaS applications.

Because the service provider hosts both the application and the data, the end user is free to use the service from anywhere. SaaS is a very broad market; services can include web-based email, inventory control, and database processing.

Other “as a Service” Models

Cloud service models don’t have to necessarily follow the layered approach; practically any aspect of software can be abstracted into the cloud and provided to the customer as a service. For example, Federation as a Service (FaaS) takes the work of establishing federated trusts between an enterprise and various cloud service providers away from the enterprise. The FaaS provider establishes trusts with hundreds of cloud providers (usually SaaS), and the enterprise simply connects to a portal with all the providers represented in a menu. A good example is where you can use your Facebook login to identify yourself at a seemingly unrelated website. That website is trusting Facebook to manage the identity of its users instead of doing it themselves.
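The Facebook-login example boils down to token verification: the relying website keeps no password database and instead accepts a signed assertion from an identity provider it trusts. This is a toy sketch with invented names; real federation uses standards like SAML or OpenID Connect rather than a raw shared-key HMAC.

```python
# Toy sketch of federated identity: the website verifies a signed
# assertion from a trusted identity provider instead of managing its
# own users. HMAC stands in for a real signature scheme.
import hashlib
import hmac

IDP_KEY = b"shared-secret-with-idp"   # illustrative only

def idp_issue_token(user):
    """The identity provider signs an assertion about who the user is."""
    sig = hmac.new(IDP_KEY, user.encode(), hashlib.sha256).hexdigest()
    return {"user": user, "sig": sig}

def website_verify(token):
    """The relying website checks the IdP's signature, not a local password."""
    expected = hmac.new(IDP_KEY, token["user"].encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["sig"])

token = idp_issue_token("alice")
```

A tampered assertion fails verification, which is why the website can safely outsource identity to the provider.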

Cloud Computing Deployment Models

The other set of definitions for cloud computing relates to how these services are physically deployed for the customer to use.

Public cloud

A public cloud sells services to anyone on the Internet. This is the best-known cloud computing deployment model, and it’s what’s usually being referred to when “cloud” is used with no qualifiers. A public cloud is hosted by a service provider, and its resources are pooled across many customers (although the resources appear to be dedicated to each customer). Amazon Elastic Compute Cloud (EC2), Windows Azure, and are all public cloud providers. Note that although they share the same deployment model, they have different service models: Amazon is best known for its IaaS services, Microsoft provides PaaS, and Salesforce uses a SaaS model. Public cloud service providers represent the most mature technology and practices at this point.

Private cloud

A proprietary network or a data center that supplies hosted services to a limited number of people. NIST defines a private cloud simply as a cloud infrastructure that’s operated solely for an organization—in other words, it’s not shared with anyone. The major driving factors for private cloud are security and regulatory/compliance requirements; if you want to take advantage of cloud computing’s flexibility and cost savings, but you have strict requirements for where your data resides, then you must keep it private. Many of the big security concerns being voiced about cloud computing can be remedied with a private cloud.


Note that this definition makes no distinction for whether the private cloud is hosted on-premise by your company’s IT organization, or off-premise by a service provider; the erroneous assumption is often made that if it’s private, it must be in-house. Many companies are just beginning to explore what it takes to build their own private cloud, and technology companies are marketing hardware and software to make this enormous task easier.

Hybrid cloud

The hybrid cloud is pretty self-explanatory. It’s a combination of both public and private clouds, maintained separately but that have, for example, the same application running in both. The best-known use case for this deployment model is an application that runs in a private cloud but can tap into its public cloud component to provide burst capacity (such as an online toy retailer during the holiday season).
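The burst-capacity pattern is easy to express as a placement policy: requests run in the private cloud until it is full, and the excess overflows to the public side. The capacities below are illustrative.

```python
# Sketch of cloud bursting: fill private capacity first, overflow to
# the public cloud. Numbers are illustrative.

def place_requests(n_requests, private_capacity):
    """Return (private, public) request counts under a burst policy."""
    private = min(n_requests, private_capacity)
    public = n_requests - private
    return private, public

# Off-peak: everything fits in-house; holiday peak: the excess bursts out.
offpeak = place_requests(80, 100)
peak = place_requests(250, 100)
```

This is why the toy-retailer example works: the private cloud is sized for normal load, and the public cloud absorbs only the seasonal spike.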

This model is still in its infancy, but the hybrid cloud is the future of cloud computing for enterprises. It will eventually become the most widely used model because it provides the best of both public and private cloud benefits.

Community cloud

A relatively unknown variation of the public cloud, a community cloud is shared across several organizations, but these organizations have shared concerns or goals.


In case of an awkward pause, WOW them with some trivia!

  • The name cloud was inspired by the symbol that’s often used to represent the Internet in flowcharts and diagrams.
  • In 1997, the firm NetCentric tried to trademark the term “cloud computing,” but gave up the effort two years later. Dell tried the same stunt a decade later and failed.
  • Amazon Elastic Compute Cloud (EC2) was the first major cloud computing service to embrace the cloud terminology. Google’s Eric Schmidt stole Amazon’s thunder by using the term “cloud computing” in an August 2006 speech just weeks before EC2’s unveiling.

Now instead of seeking sanctuary at the buffet table when the topic of cloud computing rears its head, blow them away with your Cloud Speak!


DOD Issues Interim Rule Addressing New Requirements for Cyber Incidents and …



EMC Advances Case for Hybrid Cloud Computing


EMC is slowly, but surely pulling together the elements of its federated ecosystem to create a hybrid cloud computing platform.


Under the auspices of The EMC Federation, the company today at the VMworld 2015 conference previewed components that will be added to its Federation Enterprise Hybrid Cloud platform, which combines elements of products and services from EMC and its VMware and Pivotal sister companies.

Berna Devrim, director of EMC cloud solutions marketing, said the goal is to make it a lot simpler to set up, deploy and manage hybrid cloud computing environments than it is today.


At the core of the Federation Enterprise Hybrid Cloud platform will be new application lifecycle automation tools that will enable IT organizations to deploy applications on top of both VMware vSphere and OpenStack environments, as well as on instances of infrastructure-as-a-service (IaaS) offerings based on the Virtustream platform that EMC acquired in May.

Also included under this initiative will be additional “pay-as-you-grow” deployment options based on VCE VxRack Systems and configurations of those systems that support the EMC XtremIO all-flash array. In addition, EMC today revealed that SAP has joined The EMC Federation partner program alongside EMC sister company VMware.

Finally, The EMC Federation is announcing the first in a series of new Federation End-User Computing solutions, which promise to make it simpler to deploy virtual desktops across a hybrid cloud. Scheduled to be available under “directed availability” in the third quarter, the Federation End-User Computing offering can be deployed in the cloud or on premise.

The storage systems for this environment will be optimized so that virtual desktop images reside on EMC XtremIO all-flash arrays and end-user data resides on EMC VNX and EMC Isilon storage. The solution includes VMware’s Horizon Enterprise Suite along with a self-service catalog, IT automation and user-experience monitoring tools.

First launched earlier this year at the EMC World 2015 conference, the Federation Enterprise Hybrid Cloud platform is an effort to simplify one of the more complex undertakings that any solution provider and their customers are likely to attempt. In fact, that complexity is seriously hampering adoption of private clouds, with workloads being pushed into the public cloud at far greater rates than into private clouds. Of course, by definition a true hybrid cloud requires a private cloud, running on premise or with a hosting service, to exist in some form or another. But once that private cloud exists, it’s usually only a short time before it goes hybrid.

Upcoming Webcasts & Training

September 10: Profitable Providers: Selling File Sync to Medical Clients

September 15: The Seven Layer Security Burrito: How it All Stacks Up

September 17: Demystifying Compliancy to Explore New Revenue Opportunities in Healthcare IT

September 22: How to Generate Sales with Key Strategic Global Alliances

September 24: How to Position and Make Money with Disaster Recovery for SMBs

September 29: Delivering Better Managed Security Services at Higher Margins


MSP Mentor





Article source:

Cloud storage promising, but security concerns remain

  • A cloud rack containing servers and hard drives stands inside pod one of a data center in Dallas. A cloud relieves the burden of large local data storage, but it raises questions about data confidentiality and integrity protection. Photo: Ben Torres /Bloomberg / 2014 Bloomberg Finance LP




We are living in a digital world. People are producing a massive amount of data every hour — every minute.

It’s challenging for many organizations to manage such a huge amount of data, which requires high storage capacity and qualified personnel. In recent years there has been a drop in the cost of storage hardware, but managing this storage is costly and complex.

Going a step further, it is not only the amount of data that can be problematic but also its processing. New versions of operating systems and software applications are launched every year or so, with advanced features and functions that facilitate data processing. Organizations must periodically replace and update their servers and software to keep up with new technologies. This is not an overnight task; it is an expensive and time-consuming process.

Imagine if we could outsource massive data and update servers with just a few clicks. This is not a dream anymore. It’s the beauty of cloud computing, which represents a vision of providing computing services as a public utility, like water and electricity. Thus, organizations can concentrate on their core business and leave information technology operations to experts.

This technology has received considerable attention due to its cost-effectiveness, low management overhead, immediate access to a wide range of applications, flexible IT capacity and easy mobile accessibility.

Although cloud computing has obvious benefits, there are still challenges — namely security. The nature of the cloud computing model subjects cloud services to increased attacks.

The cloud relieves the burden of large local data storage, but it raises questions about data confidentiality and integrity protection. Once data has been put on remote servers, owners no longer physically possess it, creating concerns about the data being corrupted or accessed by unauthorized users.

In some practical applications, data confidentiality is not only a privacy concern but also a juristic issue. For example, eHealth applications in the United States should meet Health Insurance Portability and Accountability Act, or HIPAA, standards, making data privacy within remote storage sites imperative.

Encryption mitigates cloud confidentiality worries, but then the possibility of data corruption becomes an issue. Cloud customers seek assurance that their data will remain intact on remote servers.
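As an illustration only (not drawn from the article), the kind of integrity assurance described above can be sketched as a checksum comparison: the owner fingerprints the data before uploading and re-checks the fingerprint after retrieval. Real auditing schemes use keyed MACs or homomorphic tags rather than a plain hash, since the server could recompute a plain hash over altered data it substitutes wholesale.

```python
import hashlib

def digest(data: bytes) -> str:
    """Compute a SHA-256 fingerprint of the data before upload."""
    return hashlib.sha256(data).hexdigest()

# Owner fingerprints the file locally, then uploads it to the cloud.
original = b"patient-records contents"
fingerprint = digest(original)

# Later, after downloading from the cloud, recompute and compare.
retrieved = b"patient-records contents"
assert digest(retrieved) == fingerprint      # data is intact

tampered = b"patient-records CONTENTS"
assert digest(tampered) != fingerprint       # corruption is detected
```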

Mobile devices work hand in hand with the cloud but expose the user to authentication problems. Rather than storing data on the device itself, users will authenticate — often via password — to access data with cloud applications, making secure authentication more important than ever.

The use of passwords is a major point of vulnerability in computer security, as they are easy to crack with programs running dictionary attacks.
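One widely used mitigation, sketched here as a minimal example rather than a description of any particular cloud service, is a slow, salted key-derivation function: each dictionary guess then costs the attacker as much work as a legitimate login, which makes bulk guessing expensive.

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None, iterations=100_000):
    """Derive a slow, salted hash so each dictionary guess is expensive."""
    salt = salt if salt is not None else os.urandom(16)
    dk = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, iterations, dk

def verify_password(password, salt, iterations, expected):
    """Re-derive the hash and compare in constant time."""
    dk = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(dk, expected)

salt, iters, stored = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, iters, stored)
assert not verify_password("123456", salt, iters, stored)
```

The salt ensures two users with the same password get different hashes, and the iteration count can be raised over time as hardware gets faster.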

There’s a growing desire among resource-constrained clients to outsource operations such as big data analysis to powerful cloud servers. To save resources, a dishonest cloud service provider may ignore the computations, or execute just a portion of them. The ability to verify critical computations and validate the results is vital for cloud customers.
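A minimal sketch of that verification idea, using hypothetical function names, is probabilistic spot-checking: the client recomputes a random sample of the outsourced work locally and rejects the whole batch if any sample disagrees. Production schemes rely on cryptographic proofs instead of recomputation, but the sketch conveys why a lazy provider risks being caught.

```python
import random

def untrusted_sum(chunks):
    """Stand-in for an outsourced job; a dishonest provider might skip chunks."""
    return [sum(c) for c in chunks]

def spot_check(chunks, results, samples, seed=None):
    """Recompute a random subset locally and compare against claimed results."""
    rng = random.Random(seed)
    for i in rng.sample(range(len(chunks)), samples):
        if sum(chunks[i]) != results[i]:
            return False
    return True

chunks = [[1, 2, 3], [4, 5], [6], [7, 8, 9, 10]]
results = untrusted_sum(chunks)
assert spot_check(chunks, results, samples=2)   # honest results pass

results[2] = 0                                  # provider cheats on one chunk
assert not spot_check(chunks, results, samples=4)  # checking all 4 catches it
```

Checking a small random sample each round means a provider that skips even a modest fraction of the work is detected with high probability over repeated audits.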

Cloud computing is a promising computing paradigm that can potentially offer important advantages; however, there are many security challenges facing this computing model.

Recently, research studies have focused on remote data auditing as a technique that allows an entity to prove that data is in its possession for validating data integrity over remote servers. Moreover, implicit authentication is considered a solution to the user authentication problem. A model can be constructed for the user based on past behavior, and then recent behavior is compared with that model to authorize legitimate users. The area of verifiable computations has attracted many researchers to resolve the challenge of outsourcing computational tasks to cloud servers that might be untrusted.
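The remote data auditing idea can be sketched as a challenge-response protocol, shown here in a deliberately simplified form with hypothetical helper names: the auditor names a random block and a fresh nonce, and the server can only answer correctly if it still holds that block. (Real schemes let the owner discard the data and keep only small verification tags; here the owner's copy stands in for those tags.)

```python
import hashlib
import os

BLOCK = 64  # bytes per block

def store(data):
    """Split the file into fixed-size blocks, as the server would hold them."""
    return [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]

def challenge(num_blocks):
    """Auditor picks a fresh nonce and a random block index."""
    return os.urandom(8), int.from_bytes(os.urandom(2), "big") % num_blocks

def prove(blocks, nonce, index):
    """Server can only produce this value if it still possesses the block."""
    return hashlib.sha256(nonce + blocks[index]).hexdigest()

data = os.urandom(1000)
owner_blocks = store(data)    # stands in for the owner's verification metadata
server_blocks = store(data)   # copy held by the cloud provider

nonce, idx = challenge(len(server_blocks))
proof = prove(server_blocks, nonce, idx)
assert proof == prove(owner_blocks, nonce, idx)  # server proved possession
```

The fresh nonce prevents the server from precomputing or replaying answers, so each audit forces it to touch the actual data.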

Ayad Barsoum, Ph.D., is assistant professor of computer science and director of the cybersecurity program at St. Mary’s University.

Article source:

Healthcare Cloud Computing Industry 2015 Global Market Research Report


Access Report @ Healthcare Cloud Computing Industry 2015 Global Market Research Report


The Global Healthcare Cloud Computing Industry 2015 Market Research Report is a professional and in-depth study on the current state of the Healthcare Cloud Computing industry.

The report provides a basic overview of the industry, including definitions, classifications, applications and industry chain structure. The Healthcare Cloud Computing market analysis is provided for the international markets, including development trends, competitive landscape analysis, and the development status of key regions.

Development policies and plans are discussed, and manufacturing processes and cost structures are analyzed. The report also states import/export consumption, supply and demand figures, cost, price, revenue and gross margins.

The report focuses on the major global industry players, providing information such as company profiles, product pictures and specifications, capacity, production, price, cost, revenue and contact information. Upstream raw materials and equipment and downstream demand analysis are also carried out. The Healthcare Cloud Computing industry's development trends and marketing channels are analyzed. Finally, the feasibility of new investment projects is assessed and overall research conclusions are offered.

With 185 tables and figures the report provides key statistics on the state of the industry and is a valuable source of guidance and direction for companies and individuals interested in the market.

Avail Sample Report @ Healthcare Cloud Computing Industry 2015 Global Market Research Report

……. Continued


1 Industry Overview
1.1 Definition and Specifications of Healthcare Cloud Computing
1.2 Classification of Healthcare Cloud Computing
1.3 Applications of Healthcare Cloud Computing
1.4 Industry Chain Structure of Healthcare Cloud Computing
1.5 Industry Regional Overview of Healthcare Cloud Computing
1.6 Industry Policy Analysis of Healthcare Cloud Computing
1.7 Industry News Analysis of Healthcare Cloud Computing

2 Manufacturing Cost Structure Analysis of Healthcare Cloud Computing
2.1 Raw Material Suppliers and Price Analysis of Healthcare Cloud Computing
2.2 Equipment Suppliers and Price Analysis of Healthcare Cloud Computing
2.3 Labor Cost Analysis of Healthcare Cloud Computing
2.4 Other Costs Analysis of Healthcare Cloud Computing
2.5 Manufacturing Cost Structure Analysis of Healthcare Cloud Computing
2.6 Manufacturing Process Analysis of Healthcare Cloud Computing

3 Technical Data and Manufacturing Plants Analysis
3.1 Capacity and Commercial Production Date of Global Key Manufacturers in 2014
3.2 Manufacturing Plants Distribution of Global Key Healthcare Cloud Computing Manufacturers in 2014
3.3 R&D Status and Technology Source of Global Healthcare Cloud Computing Key Manufacturers in 2014
3.4 Raw Materials Sources Analysis of Global Healthcare Cloud Computing Key Manufacturers in 2014

View complete TOC @ Healthcare Cloud Computing Industry 2015 Global Market Research Report

Contact Us:

Norah Trent

[email protected]

+1 646 845 9349 / +44 208 133 9349

Article source:

Three Pakistani experts write first-ever book on cloud computing

ISLAMABAD: Three Pakistanis have authored a cloud computing certification book, the first of its kind in the world.


The book titled ‘Deploying and Managing a Cloud Infrastructure’ offers help to those IT professionals who want to become cloud administrators.


One of the authors of the book describes cloud computing as ‘services which are hosted on multiple servers for instance Gmail, Google Maps, Yahoo etc’.


The authors are Zafar Gilani, Abdul Salam and Salman-ul-Haq. Zafar Gilani is an IT specialist at GfK, Nuremberg.


He was an Erasmus Mundus scholar at Polytechnic University of Catalonia and Kungliga Tekniska högskolan (KTH), with postgraduate studies focused on distributed computing. He is also a PhD candidate at the University of Cambridge.


Abdul Salam is a senior consultant with Energy Services, and the author of numerous blogs, books, white papers, and tutorials on cloud computing. He earned his bachelor’s degree in Information Technology, followed by an MBA-IT degree and certifications from Cisco and Juniper Networks. He is a frequent contributor. Salman-ul-Haq is co-founder and CEO of TunaCode Inc., which delivers GPU-accelerated computing solutions to time-critical application domains. He is also a frequent contributor, and holds a degree in computer systems engineering.


Zafar Gilani, while talking to The News, said that the book is the first of its kind in the world, as it helps prepare candidates for the CompTIA Cloud+ (CV0-001) cloud computing certification exam. Designed for IT professionals with 2-3 years of networking experience, this certification provides validation of cloud infrastructure knowledge, the author maintained.


It is worth mentioning here that Zafar Gilani, along with five foreign colleagues, won the Best Short Paper Award at the 9th edition of the 2013 conference of the Association for Computing Machinery (ACM), the largest and most prestigious scientific and educational computing society for the advancement of computing.


The ACM publishes computing journals, has special interest groups (SIGs), holds conferences (where conference papers are presented and published, CoNEXT being one of them) and offers its members access to a huge digital library. ACM CoNEXT stands for “Conference on emerging Networking EXperiments and Technologies”.


It is an international conference held every year for the past nine years (Santa Barbara, CA, in December 2013 was the 9th edition). It is a platform that offers presentations and discussions of novel networking technologies that will shape the future of the Internet. The CoNEXT conferences focus on stimulating exchanges between various international research communities.


The conference acceptance rate is around 18%. The conference also offers two awards: a best short paper award and a best full paper award. Zafar Gilani and his five colleagues presented their short paper, “Is there a case for mobile phone content pre-staging?”, in December 2013.


The authors of the papers were Alessandro Finamore, Marco Mellia, Zafar Gilani, Konstantina Papagiannaki, Yan Grunenberger, and Vijay Erramilli.


Their paper was awarded the best ACM CoNEXT short paper, with the citation: “A great example of a short paper: a first step that investigates an interesting and timely idea. Although the results are not definitive either way, this just indicates there is more to be done.”


Gilani commented that the book has been published by John Wiley & Sons, Inc., also referred to as Wiley, a global publishing company that specializes in academic publishing and markets its products to professionals and consumers, students and instructors in higher education, and researchers and practitioners in scientific, technical, medical, and scholarly fields.


Article source: