Xbox One Would “Absolutely” Benefit From Cloud Computing

Speaking recently on The Inner Circle Podcast, Stardock CEO Brad Wardell discussed how the Cloud can bring benefits to the Xbox One. The Cloud was one of the most talked-about features of the Xbox One, and it has yet to be used in any meaningful way.

Wardell thinks that the Cloud would “absolutely” benefit the Xbox One, given that it is apparently faster to draw information from the Cloud than it is from the Xbox One’s DVD drive. Solid-state drives (SSDs) would be a far better solution, but that is an unlikely dream.

Wardell gave a simple example of how the Cloud could benefit games: if a player were taking on the computer in a game of chess, the AI could compute its moves in the Cloud while the console itself would only have to handle rendering. Because of this, Wardell thinks the Cloud has “endless numbers” of applications in the gaming space.
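
As a rough illustration of that division of labor, the client sends the board state to a cloud service and gets a move back, so the local machine only has to draw the result. This is a minimal sketch: the endpoint, message format and move notation are hypothetical, not an actual Xbox or Azure API.

```python
# Minimal sketch of offloading a chess AI to a cloud service.
# The endpoint, payload shape and move notation are hypothetical.
import json
from urllib import request

CLOUD_AI_URL = "https://example.com/chess/move"  # hypothetical endpoint

def request_move(board_fen: str) -> str:
    """Send the current board state to the cloud AI and return its move."""
    payload = json.dumps({"board": board_fen}).encode("utf-8")
    req = request.Request(
        CLOUD_AI_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req, timeout=5) as resp:
        return json.load(resp)["move"]  # e.g. "e7e5"

def render(board_fen: str) -> None:
    """The console's only job in this model: draw the board."""
    print("rendering:", board_fen)

start = "rnbqkbnr/pppppppp/8/8/8/8/PPPPPPPP/RNBQKBNR w KQkq - 0 1"
render(start)
# move = request_move(start)  # the heavy search would run server-side
```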

Cloud computing was, and still is, a viable solution to many problems faced by video game developers and programmers, so it remains a mystery why it has yet to be properly implemented, even in an optional way.

Article source: http://gamingbolt.com/xbox-one-would-absolutely-benefit-from-cloud-computing

Myth Busting the Open-Source Cloud Part 1

Open-source cloud computing offers compelling potential—cost savings, innovation, low barriers to software deployment, avoiding vendor lock-in and a broad community of support, to name a few.

Despite this, CIOs and IT executives still have misconceptions about the challenges surrounding open-source cloud technologies, such as a perceived lack of security or the potential inability to handle business-critical applications.

We’re here to address these perceptions in a five-part series to show that these myths are just that: misconceptions about open-source cloud.

First up, the lack of security. The idea here is that there’s inadequate security with cloud computing in general. For several years, concerns about security have kept a large number of organizations from moving data and workloads to the cloud, particularly public cloud services.

The fact that some services use infrastructure that’s shared among a number of customers creates a concern that data from one organization will somehow be exposed to or accessed by others. And the idea of storing information on a service provider’s servers, vulnerable to whatever weaknesses the service provider’s infrastructure might have, makes some companies think this is too much of a corporate risk.

IT departments have multiple cloud deployment options to choose from – private clouds, including managed private clouds, public clouds and hybrid clouds. IT organizations are best served by embracing multiple deployment models, with the appropriate security level in mind. For example, mission-critical workloads containing sensitive data are best suited for private and managed private clouds.

With open-source cloud computing, there is a stereotype that these products and services are being created by amateur developers who are not skilled enough to build enterprise-grade security into the software they are developing.

On the contrary, open-source cloud computing products are designed from the outset with security in mind. For example, there are features such as identity management to monitor who has access to content, and data encryption to safeguard information while it’s at rest or in transit.
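
As a small illustration of the encryption-at-rest idea, here is a sketch using the widely used `cryptography` Python package, rather than any particular open-source cloud product’s API:

```python
# Minimal at-rest encryption sketch using the `cryptography` package
# (pip install cryptography). Illustrative only: a real deployment also
# needs key management, rotation and access auditing.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in practice, kept in a key manager
fernet = Fernet(key)

token = fernet.encrypt(b"customer record: account 42")
print(token)                         # ciphertext, safe on shared storage

plaintext = fernet.decrypt(token)    # only key holders can recover this
print(plaintext)
```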

Furthermore, open-source cloud software is peer-reviewed by community participants, leading to continuous improvements in the quality of security features and mechanisms. This community also monitors and rapidly discloses vulnerabilities and issues, and provides security updates to address them.

It’s important to keep in mind that much about security involves using common sense and ensuring that users follow security policies and procedures. For instance, application developers should use Transport Layer Security, which employs cryptographic protocols designed to provide security over networks.
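
A minimal sketch of what that looks like in practice, using only Python’s standard library (with `ssl.create_default_context()`, certificate verification and hostname checking are on by default):

```python
# Minimal TLS client sketch using only the Python standard library.
import socket
import ssl

host = "example.com"
context = ssl.create_default_context()  # verifies certs and hostnames

with socket.create_connection((host, 443), timeout=5) as sock:
    with context.wrap_socket(sock, server_hostname=host) as tls:
        print("negotiated:", tls.version())           # e.g. "TLSv1.2"
        print("peer subject:", tls.getpeercert()["subject"])
```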

Managers need to emphasize that information security is the responsibility of everyone in the organization, including developers, network administrators, support personnel, managers and end users. Cloud development and use today involves virtually everyone across the enterprise. In other words, security is everyone’s job.

That also extends to the relationship between the cloud provider and the buyer. CIOs and their vendors must understand the security roles and responsibilities and know to whom each belongs.

Research from International Data Corp. (IDC) released earlier in 2015 showed that many organizations expect to rely on hybrid cloud architectures, and that open hybrid cloud will become the de facto enterprise IT architecture.

Article source: http://www.cio.com/article/2912233/cloud-computing/myth-busting-the-open-source-cloud-part-1.html

NetApp takes cloud computing to a higher level through flexibility | #AWSSummit

“Cloud computing,” by definition, refers to the on-demand delivery of IT resources. Cloud computing provides a simple way to access servers, storage, databases and a broad set of application services over the Internet with pay-as-you-go pricing. Amazon Web Services, Inc. is one of the main cloud providers; however, customers have suggested that the freedom to change providers would give them access to the cloud’s features with greater flexibility.

During an interview on theCUBE at AWS Summit 2015, Phil Brotherton, VP of the Cloud Solutions Group at NetApp, Inc., said he is seeking to satisfy customers’ requests. “I like the Southwest Airlines slogan: Set Yourself Free. That is what we are trying to do, set the data free,” he said.

Innovating in the cloud

 

Innovation is what the cloud is about, and based on one customer’s review, “Cloud is a winner because it is quicker to build, easier to adapt and update, and of a lower cost.” Cloud services in combination with NetApp make users better able to withstand the pressures of an always-evolving marketplace and to respond rapidly to changing business and customer needs.

NetApp gives users flexibility in the operating systems they work on, while most other providers are not so customer-friendly. By integrating NetApp’s flexibility with cloud computing, customers will have more control and choice and will also be able to take advantage of its innovation capabilities.

Reducing risk in the enterprise cloud environment

 

theCUBE co-host Marc Farley directed an important question to Brotherton during the interview, asking, “Trying to satisfy customers who want to do more on the cloud, are you developing services, and what are the services that you’ve got?”

Brotherton responded, “Yes we do. An example is Cloud ONTAP that was launched last November.” According to Brotherton, Cloud ONTAP is proven to improve operational efficiency, lower cost and reduce risk in the enterprise cloud environment. Apart from flexibility and data management, NetApp delivers secure multi-tenancy, nondisruptive operations and proven storage efficiency with Cloud ONTAP.

Working with customers, NetApp is building on its proven portfolio of cloud solutions so that organizations can take the cloud to the next level.

Watch the full interview below, and be sure to check out more of SiliconANGLE and theCUBE’s coverage of AWS Summit 2015.

Article source: http://siliconangle.com/blog/2015/04/20/netapp-takes-cloud-computing-to-a-higher-level-through-flexibility-awssummit/

Microsoft Charts Course Toward Containers, Cloud Computing’s Next Frontier

Perhaps you’ve heard the buzz about containers and how they’re poised to become the next big thing in cloud computing.

Containers, long used as a way of providing a secure sandbox that separates an application from the OS and in some cases network infrastructure, are poised to take on computing tasks deemed unachievable by today’s virtual machines (VMs) and cloud infrastructures.

In less than a year, a once-little-known startup provider of open source software called Docker put containers on the map. The company’s open container platform is designed to let developers build and systems administrators manage distributed applications on any OS, VM and cloud. Among those supporting Docker are Amazon Web Services (AWS), Google, IBM, Rackspace, Red Hat, VMware and even Microsoft.


Docker last month released the first beta versions of its forthcoming orchestration software designed to let organizations build and deploy this new class of distributed application components. For its part, Microsoft is supporting the new Docker Machine, the orchestration tool that automates the development, provisioning and management of Docker containers on Linux or Windows Servers; and Docker Swarm, a tool that lets developers select infrastructures for their apps, including Microsoft Azure.

Docker Machine lets administrators select an infrastructure on which to deploy an application built in the new environment. Microsoft has contributed an Azure driver that allows for the rapid and agile creation of Docker hosts on Azure.

“There are several advantages to using Docker Machine, including the ability to automate the creation of your Docker VM hosts on any compatible OS and across many infrastructure options,” said Corey Sanders, director of Program Management, Azure, in a post on the Microsoft Azure blog. “Additionally, Docker Machine provides you the ability to manage and configure your hosts from a single remote client.”

With Docker Swarm, which turns a pool of Docker hosts into a single virtual host, an administrator can deploy container-based apps and workloads using Docker’s native clustering and scheduling functions, according to Sanders. It also lets customers select cloud infrastructure such as Azure, enabling them to scale according to their needs. Using the Docker command-line interface (CLI), customers can deploy Swarm to enable scheduling across multiple hosts. While this will initially be useful for dev and test, it portends a day when partners can build and deploy apps that are OS-, hardware-, VM- and cloud-provider-independent.
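
As a rough sketch of that workflow, the snippet below shells out to the docker-machine CLI from Python to provision an Azure host. The Azure driver flags reflect the beta-era tool and are assumptions to verify against `docker-machine create --help` for your installed version:

```python
# Sketch of provisioning a Docker host on Azure via the docker-machine CLI.
# The --azure-* flags are assumptions based on the beta-era Azure driver.
import subprocess

def create_azure_host(name: str, subscription_id: str, cert_path: str) -> None:
    """Shell out to docker-machine to create a Docker host on Azure."""
    subprocess.run(
        [
            "docker-machine", "create",
            "--driver", "azure",
            "--azure-subscription-id", subscription_id,  # assumed flag
            "--azure-subscription-cert", cert_path,      # assumed flag
            name,
        ],
        check=True,
    )

if __name__ == "__main__":
    create_azure_host("swarm-node-0", "<subscription-id>", "azure-cert.pem")
```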

Docker isn’t the only container solution Microsoft is embracing. Sphere 3D last month said its Glassware 2.0 Microvisor can virtualize infrastructure components and the application stacks from both Windows and non-Windows-based systems and claims it can “outperform” any existing hypervisor-based infrastructure. Furthermore, Sphere 3D said it can be used for systems and cloud management, orchestration and clustering.

It’s still early days for containers in the world of cloud computing, but if they live up to their promise, they can raise the bar for performance and portability.

About the Author


Jeffrey Schwartz is editor of Redmond magazine and also covers cloud computing for Virtualization Review’s Cloud Report. In addition, he writes the Channeling the Cloud column for Redmond Channel Partner. Follow him on Twitter @JeffreySchwartz.

Article source: http://rcpmag.com/articles/2015/04/01/containers.aspx

Amazon to reveal revenue from cloud computing business for the first time this …

We’ll learn more about how much revenue Amazon’s cloud computing business generates for the company later this week.

For the first time, Amazon will break out Amazon Web Services financial results starting with Q1 2015 in its earnings call on Thursday.

“We just think it’s an appropriate way to look at our business for 2015,” Amazon CFO Tom Szkutak said earlier this year in regard to showing AWS financials.

Amazon has never disclosed AWS financials and lumps them under an “other” category for North American sales — which also includes advertising services and co-branded credit card agreements — that generated $1.67 billion during the last quarter, up from $1.17 billion a year ago.

However, analysts from Deutsche Bank estimate that yearly revenues from AWS are about $6 billion — ten times that of Microsoft’s own cloud computing arm. And last month, analysts at Robert W. Baird said it was valuing Amazon Web Services at $40 to $50 billion, or $95 per share, on a standalone basis, TheStreet reported.

This past November, AWS chief Andy Jassy showed the chart below, which revealed AWS revenue growth of more than 40 percent year-over-year. He also noted that AWS is the “fastest-growing enterprise IT company in the world.”

[Chart: AWS year-over-year revenue growth]

Previously, Amazon has said that it has more than one million active AWS customers, and that usage grew 90 percent in Q4 2014 compared to the year earlier.

Other tech giants like Microsoft and Google also do not specify exact financials from their cloud services in earnings reports.

Seeking Alpha notes that “if AWS revenues are higher than expected, Amazon shares will likely hit an all-time high.” We’ll be reporting on Amazon’s financials on Thursday afternoon, so check back then for more details.

Article source: http://www.geekwire.com/2015/amazon-to-reveal-revenue-from-cloud-computing-business-for-the-first-time-this-week/

ISC Cloud & Big Data Keynote to Focus on IT Transformation at DZ Bank

Dr. Jan Vitt, DZ Bank

Today the ISC Cloud & Big Data conference announced that Dr. Jan Vitt of DZ Bank will keynote the event in September. The conference will take place Sept. 28-30 in Frankfurt and is expected to attract over 250 attendees in the fields of cloud computing and big data.

As Head of IT Infrastructure at DZ Bank, Dr. Vitt will describe how a conservative institution like his is effectively adopting cloud computing to address the IT needs of its various business divisions. As the fourth-largest cooperative bank in Germany, DZ Bank supports the business activities of over 900 other cooperative banks in the country.

Founded 150 years ago, DZ Bank started as a regional cooperative bank in Frankfurt to support the entrepreneurial ventures of its members. Today it is recognized as an established financial institution, having demonstrated its resilience during the 2008 financial crisis.

Jan Vitt has been overseeing the internal IT operations of DZ Bank for the last seven years. He is in charge of the bank’s data center facilities, server operations and computing platforms, particularly the infrastructure for desktop services. Under his leadership, the bank’s IT division operates several private cloud services internally and also uses the services of some public clouds.

DZ Bank was reluctant to use the services of professional cloud providers because it was very concerned about data security. Regulatory authorities such as the European Central Bank and Germany’s BaFin (Federal Financial Supervisory Authority) have very strict regulations regarding IT outsourcing, making it doubly hard to make the move into cloud computing.

In his keynote, titled “Cloud Computing in a Conservative German Bank – Still a Long Way to Go?”, Vitt will share how he and his team surmounted these hurdles to successfully install the bank’s own private clouds. He says that private and public clouds are just the starting point, and the bank will be evaluating other opportunities to employ cloud computing in the future.

This year’s conference offers a unique program consisting of business, technology and research tracks.

The conference organizers will be announcing more keynote talks in the near future.

Article source: http://insidehpc.com/2015/04/isc-cloud-big-data-keynote-focus-transformation-dz-bank/

Google Boosts Cloud Services To Tackle Big Data

At the Hadoop Summit in Brussels on Thursday, Google announced significant updates to two of its cloud services. The changes will facilitate the processing of large quantities of data and open up the company’s services to European data centers.

Big data promises to give companies faster and better insight, Google Product Manager William Vambenepe said Thursday on the Google Cloud Platform Blog. “Big data the cloud way means being more productive when building applications, with faster and better insights, without having to worry about the underlying infrastructure,” he said.


Google Cloud Dataflow

Since it was first announced at Google’s developers conference last summer, the Cloud Dataflow managed processing service for live data has been in private alpha testing. With today’s launch of the beta version, the resource is now available to any software developer interested in testing it. However, because Cloud Dataflow is just in beta, no service-level agreement is currently available.

In his blog post, Vambenepe offered three key benefits of the new service: noOps (no operations in-house), cost effectiveness, and safe and easy collaboration. He also noted that the Cloud Dataflow service will automatically scale to meet the needs of its customers.

“Today, nothing stands between you and the satisfaction of seeing your processing logic, applied in your choice of streaming or batch mode, executed via a fully managed processing service. Just write a program, submit it, and Cloud Dataflow will do the rest,” Vambenepe said.
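
To make the “write a program, submit it” model concrete, here is a conceptual Python sketch of a pipeline of transforms applied to a collection. It is not the Dataflow SDK (which was Java-based at the time); the `Pipeline` class and word-count example are invented for illustration:

```python
# Conceptual sketch of a Dataflow-style pipeline: a chain of transforms
# applied to a collection, with execution deferred until run(). Not the
# actual Dataflow SDK; illustrates the programming model only.
from collections import Counter
from typing import Callable, Iterable, List

class Pipeline:
    def __init__(self, source: Iterable):
        self.source = source
        self.transforms: List[Callable[[Iterable], Iterable]] = []

    def apply(self, transform: Callable[[Iterable], Iterable]) -> "Pipeline":
        self.transforms.append(transform)
        return self

    def run(self) -> list:
        data = self.source
        for transform in self.transforms:
            data = transform(data)
        return list(data)

words = ["cloud", "data", "cloud", "flow"]
counts = (
    Pipeline(words)
    .apply(lambda ws: ((w, 1) for w in ws))                     # map step
    .apply(lambda pairs: Counter(w for w, _ in pairs).items())  # group/sum
    .run()
)
print(counts)  # [('cloud', 2), ('data', 1), ('flow', 1)]
```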


Google BigQuery

Google’s other big announcement consisted of a number of upgrades to its five-year-old BigQuery service, which offers users a SQL interface for very large unstructured data sets.

The most significant change, given the recent regulatory issues in Europe, is that users can now host their data sets on Google servers located within Europe. While not directly addressing the search preference concerns raised by regulators, the move could be seen as an attempt to minimize European complaints.

Google also took steps to make BigQuery more secure. The company introduced what are known in the SQL community as row-level permissions. This will enable designers and users of SQL tables to implement access controls with greater levels of granularity. For instance, a sales manager can be given access to certain information within a database without necessarily having access to information relevant to human resources or accounting.
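
Conceptually, row-level permissions work like the sketch below: each row carries a label, and a user sees only the rows whose label appears in that user’s access list. This illustrates the idea only, not BigQuery’s actual syntax or API; all names and data are hypothetical:

```python
# Conceptual sketch of row-level permissions: filter rows by a per-user
# access list. Not BigQuery's actual API; names and data are hypothetical.
ACCESS = {
    "sales_manager": {"sales"},
    "hr_admin": {"sales", "hr"},
}

ROWS = [
    {"dept": "sales", "record": "Q1 pipeline"},
    {"dept": "hr", "record": "salary bands"},
]

def visible_rows(user: str) -> list:
    """Return only the rows this user is allowed to see."""
    allowed = ACCESS.get(user, set())
    return [row for row in ROWS if row["dept"] in allowed]

print(visible_rows("sales_manager"))  # only the sales row
```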

Finally, Google turned up the speed of BigQuery, giving it the capability of processing 100,000 rows from a database table in just one second.

Article source: http://www.cio-today.com/article/index.php?story_id=103001PHLJ8I

Alibaba Intensifies Expansion into Global Cloud Computing Market, Launches …

Alibaba founder and chairman Jack Ma during the CeBIT trade fair in Hanover, Germany, on March 16, 2015. (Photo: Reuters)

Chinese e-commerce giant Alibaba has stepped up its efforts to increase its presence and attract more potential clients in the booming global cloud computing market, as Aliyun, its cloud computing arm, recently opened a data center in the United States.

Alibaba’s first overseas data center opened in March in Silicon Valley, home to the world’s largest tech firms and thousands of startup companies.

An article by technologynewschina.com said that the new data center would initially offer cloud computing services to Chinese companies in the U.S. before taking on international clients.

Yu Sicheng, Aliyun’s vice president and head of its international business, explained how the data center works.

“We have a professional team called ‘IDST’ in the United States. It researches our capacity to process big data so as to help us apply cutting-edge technology to our big data platforms. The team has recruited both Chinese American and local American big-data talents in order to build the world’s largest data exchange platform,” Yu was quoted as saying.

Yu added that the expansion into the U.S. is only an initial step in the company’s international strategy, as it is planning to set up more overseas data centers in Europe, the Middle East and Japan.

The report said that the company’s effort to take advantage of the global cloud computing market has earned it nearly $150 billion as of the end of 2014.

In March, the company also demonstrated its new data ventures and enhanced capacity in cloud computing when it unveiled its facial recognition payment technology, “Smile to Pay,” which enables users to make payments on mobile phones using selfies.

“Online payment to buy things is always a big headache. You forget your password. You worry about the security. Today we show you a new technology in the future how people buy things online,” Chairman Jack Ma said during his keynote speech at the CeBIT trade fair in March.

Alibaba’s online retail platform Taobao, supported by Aliyun cloud services, was also able to handle 80,000 orders per second at last year’s annual online shopping spree, dubbed “Double 11.”

The report said that Alipay managed to handle 2.85 million payments every minute using the cloud service.

Article source: http://en.yibada.com/articles/27761/20150418/alibaba-cloud-computing-market-u-s-data-center.htm