Data Center Knowledge | News and analysis for the data center industry
Monday, December 14th, 2015
| 1:00p |
How Microsoft Plans to Win in Enterprise Hybrid Cloud

While Microsoft is behind Amazon in public cloud, it has no need to play catch-up inside the enterprise data center. That, combined with the second-largest public cloud business, puts it in a good position to dominate in hybrid cloud, which is touted overwhelmingly as the cloud strategy of choice among enterprises.
The only other players with existing presence in enterprise data centers similar in scope are VMware and IBM. Of the two, IBM may be the harder one for Microsoft to compete with in hybrid cloud, since it also has made massive investments in public cloud, while the scale of VMware’s public cloud infrastructure is quite small in comparison to the other players.
The hybrid cloud opportunity is enormous. Cloud was the number-three 2016 investment priority for CIOs who participated in Gartner’s latest global survey, following business intelligence and analytics (their first priority) and infrastructure and data center (their second).
The survey included nearly 3,000 CIOs from 84 countries, together representing $11 trillion in revenue and $250 billion in IT spend. A quarter of them said cloud was third on their list of top investment plans for next year.
According to Microsoft and many others, hybrid cloud is the approach most enterprises choose, blending on-premises infrastructure with cloud services so they can get the control and performance of on-premises systems along with the flexibility and scale of public cloud. “They’ve got existing investments in their on-premises environments, but they’re all wanting to have a public cloud strategy, and many of them are already using it,” Mike Schutz, general manager of cloud platform product marketing at Microsoft, said in an interview.
Thousands of applications and services running on-prem at large enterprise data centers need to be bridged to the cloud, and Microsoft’s strategy is to help them do it, he said. Microsoft has invested heavily in building a consistent platform for both internal environments and Azure, its infrastructure services cloud, and invested more than $15 billion in the global data center infrastructure that supports those public cloud services.
Much of the company’s hybrid cloud strategy will start to come together next year, when it releases Windows Server 2016 and new capabilities in Azure Stack, its infrastructure offering for on-prem deployments.
Its server virtualization platform Hyper-V already runs both in customers’ private clouds and in Azure, so VMs can be moved between the two types of infrastructure if needed. The upcoming Windows Server 2016 will have software-defined networking capabilities – an SDN controller and virtual network functions – “cut from the same cloth as the Microsoft Azure SDN capabilities,” Schutz said.
The next release of Windows Server will also support distributed object storage – a type of storage common for applications running in the cloud. It will have a robust set of capabilities around application containers, including Docker-compatible containers.
The next wave of on-premise Azure products, called Azure Stack, will have VM orchestration capabilities that are similar to public Azure. These capabilities will be in preview next quarter.
With consistency across private and public cloud on the back end, there will be the same self-service portal for both public Azure and private Azure Stack.
Realizing that users may want to use non-Microsoft clouds, the company has built into its operations management suite capabilities that work with AWS and other public clouds, as well as with VMware and OpenStack private clouds for both Windows Server and Linux.
The idea is not to try to force users to lock themselves into an all-Microsoft environment or replace entire systems they have already built. “We’re trying to meet them where they are,” Schutz said.
Once the new hybrid cloud capabilities are out, Microsoft will have to prove that it can deliver on that promise of seamless compatibility between on-prem and public cloud. With IBM, VMware, and the multitude of OpenStack offerings, there are many ways to set up private cloud, and extensibility to public cloud is something all vendors have invested in.
Microsoft’s vision of the future of enterprise cloud differs from that of its main public-cloud rival Amazon, for whom the role of hybrid cloud is to help companies eventually transition from on-premises infrastructure to AWS completely. Amazon has been investing a lot in giving enterprises the same capabilities as cloud services that they have in-house.
Another contender is Google. While its enterprise cloud play hasn’t been on par with those of its rivals, Google’s VP of technical infrastructure Urs Hölzle recently said a lot of changes were coming that would shift the perception of Google as a not entirely serious enterprise cloud player. He stopped short of sharing what those changes would be, but said Google Cloud Platform would replicate the success the company had with Android, which came to the mobile market much later than Apple’s iOS but eventually became the number-one operating system.
| 5:51p |
From Server Sprawl to Scale

Todd Pavone is the Executive Vice President of Product Strategy and Development at VCE.
IT professionals don’t like to admit it, but networks don’t always grow by intent or design. The infrastructure expands organically as new technologies find an audience, either with outside partners or internal users. In other words, steady innovation hampers routine operation.
Imagine a data center that was once considered massive but quickly runs out of space and capacity because it relies on tape backup. Picture an enterprise resource planning (ERP) infrastructure that can’t keep up with growth and bogs down the supply chain. These aren’t horror stories—they’re real.
For example, Wake Forest Baptist Medical Center in Winston-Salem, NC, has always sought to stay ahead of the curve in its critical field. Yet it eventually found itself with an IT infrastructure plagued by space constraints that caused project delays and couldn’t respond promptly to service requests. At the same time, it had considerable levels of unused capacity – a result of unplanned sprawl. In a very different market, Fox Sports Australia made its reputation broadcasting more than 11,000 hours of live programming annually, but then found itself grappling with a business model that blended one-third broadcast with two-thirds new-media streaming. Reaching fans on their phones and tablets mandated a very different kind of scaling.
So what does it take to move from organic sprawl to optimal scale? What’s the best way to achieve operational excellence in a hyper-growth environment?
To be clear, there’s no magic bullet: It takes a systematic strategy that encompasses specific elements from solution design and implementation to training and support. For their part, CIOs want flexible and secure data centers that can adapt to changing technology and business requirements. However, enterprises increasingly have new types of applications—think “third-platform” apps for social and mobile computing, cloud services, Big Data, and the Internet of Things—that can grow to touch hundreds of thousands of virtual servers. This multi-tiered growth puts heavy pressure on the infrastructure.
Still, the innovation isn’t on the app side alone: Advances in infrastructure management, such as converged and hyper-converged solutions, help break down infrastructure silos and manual processes by sharing resources and leveraging software-defined approaches like cloud computing and on-demand business models. They deliver greater business value through simple automation, centralized management, and policy-based processes that are attuned to the dynamic requirements of complex business applications. Even as an application scales up or down, the built-in intelligent software enforces the preset policies while maintaining a superior user experience.
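The policy-based scaling described above can be sketched in a few lines of Python. This is a minimal, hypothetical illustration, not any vendor’s implementation: the function names, thresholds, and policy fields are all assumptions made for the example.

```python
# Minimal sketch of policy-based scaling: the platform watches a utilization
# metric and enforces preset min/max bounds as the application scales.
# All names and thresholds here are illustrative assumptions.

def desired_capacity(current_vms: int, utilization: float, policy: dict) -> int:
    """Return the VM count that brings utilization toward the policy target."""
    target = policy["target_utilization"]
    # Scale proportionally to the gap between observed and target utilization.
    proposed = round(current_vms * utilization / target)
    # The preset policy is enforced regardless of what the metric suggests.
    return max(policy["min_vms"], min(policy["max_vms"], proposed))

policy = {"target_utilization": 0.60, "min_vms": 2, "max_vms": 100}

print(desired_capacity(10, 0.90, policy))  # overloaded -> scale out to 15
print(desired_capacity(10, 0.30, policy))  # underused -> scale in to 5
print(desired_capacity(10, 0.01, policy))  # policy floor enforced -> 2
```

The point of the third case is the one Pavone makes: the preset policy, not the raw metric, has the final word on capacity.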
Going back to the examples cited earlier: Fox Sports Australia knew that staying with its longtime IT approach – which had previously been so successful – meant disassembling and reassembling dozens of disparate hardware systems. So, instead, it built a private cloud with a converged infrastructure system, ensuring lower risks, a faster move to its new facility, and greater business agility. In fact, the company now delivers video clips and sports statistics to mobile and other online devices in a fraction of the time. System reliability and IT efficiency are higher, and IT has evolved from a reactive, break-fix-focused organization to a proactive service provider.
Wake Forest Baptist Medical Center purchased and deployed no fewer than three converged infrastructure solutions to build out its new software-defined data center. The environment has been designed to support not only a major medical center upgrade, but also more than 750 other applications – from storing and offering access to the medical center’s imaging studies to supporting its virtual desktop infrastructure. It delivers near-100 percent availability, enhances performance by 30 percent, speeds provisioning, and slashes maintenance costs.
As next-generation applications become more intelligent and bring the resiliency to manage dynamic workloads built into them, businesses can focus on a hybrid approach to managing their data center, and benefit from cost efficiencies of scale. Ultimately, innovation can’t flourish in a vacuum—in this circumstance, it needs to be balanced with existing realities, such as the type of application, the number of users and the level of workload. A highly scalable software-defined converged infrastructure system helps businesses find the right blend of technology advances, market shifts and user preferences to help drive greater innovation throughout the organization.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
| 6:11p |
IT Innovators: A Journey Toward Bolstering Security in the Private Cloud 
Article courtesy of WindowsITPro
As a provider of cloud-based solutions in the document outsourcing industry, Novitex recognizes that the success of its business hinges on the cloud. “The cloud is enabling us to work in ways that were never possible before, collaborating more smoothly and accessing data from anywhere, anytime,” says Anthony Dupree, CIO of Novitex. However, while Novitex uses the cloud to bring solutions to clients more quickly and solve problems more seamlessly, the company recognized early on that there is also more at stake: “Working in the cloud raises critical security concerns,” Dupree explains. “When we work with clients’ data, our top priority is that their information is protected at all times.”
Novitex recognized that, internally, it needed better security for its private cloud for a number of reasons. First and foremost, the company knew its clients needed an integrated solution that was secure, since many of them were facing strict regulatory requirements. Second, as an outsourcing provider, it knew that a lapse in security could be costly, disruptive, and detrimental to its reputation. “We knew we needed to take a very proactive approach to security, especially when dealing with clients’ data,” Dupree explains.
While the company knew that creating the ideal security solution would take a lot of work, they began the process by conveying their end goal to all involved in the project. Dupree says the team was aware that they needed to deploy a “defense-in-depth” approach that was “based on the military principle that it is more difficult for an enemy to defeat a complex and multilayered defense system than to penetrate a single barrier.” They then dove into a “very collaborative effort,” Dupree says, by polling clients, IT security experts and the IT team at Novitex.
Once the objectives were clearly stated, Dupree says the next challenge involved getting company leadership on board to help facilitate the process. His major takeaway: the importance of a gap assessment. “Understanding where your gaps are is critically important, and then you must report it to the company leadership or board so they buy into your strategy and make the funding available to fix those gaps,” Dupree says. Only once you help leadership understand the risks can you get the funding necessary to close those gaps, Dupree explains.
Dupree said that once they had the support necessary to embark on this journey, they set out to find a way to collapse all the types of regulations and security requirements that clients need to adhere to into a single structure. By doing this, they were able to develop a framework to manage all security controls from one view – which made everything much more seamless.
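The idea of collapsing overlapping regulations into one control framework can be sketched as a simple mapping: each internal control is implemented once and tagged with every client requirement it satisfies. The control names and regulation mappings below are hypothetical examples, not Novitex’s actual framework.

```python
# Sketch of a unified control framework: each control is implemented once
# and mapped to every regulation it helps satisfy. Control IDs and
# regulation mappings are illustrative, not from any real program.

controls = {
    "encrypt-data-at-rest":    {"HIPAA", "PCI-DSS", "SOC 2"},
    "quarterly-access-review": {"SOX", "SOC 2"},
    "network-segmentation":    {"PCI-DSS", "HIPAA"},
}

def coverage(regulation: str) -> set:
    """All controls contributing to a given regulation - the 'one view'."""
    return {ctrl for ctrl, regs in controls.items() if regulation in regs}

print(sorted(coverage("PCI-DSS")))
# -> ['encrypt-data-at-rest', 'network-segmentation']
```

A client subject to PCI-DSS then inherits both controls from the shared framework rather than requiring a separate, parallel security program.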
The next challenge involved deploying a diverse set of tools – including firewalls, detection systems, tools that look at email phishing, and anti-virus software – to fully protect the infrastructure. Finally, Dupree and his team conducted many evaluations and head-to-head tests to determine which tools were best to implement.
Even though Dupree says his team arrived at a solution that they are currently satisfied with, he explains that the process is far from complete. “You need to maintain your level of cyber security by understanding that it’s a continuous process and that you must be very proactive in knowing your security program,” Dupree explains, adding that this also involves anticipating what your security program might need next.
The IT Innovators series of articles is underwritten by Microsoft, and is editorially independent.
This first ran at: http://windowsitpro.com/it-innovators/it-innovators-journey-toward-bolstering-security-private-cloud-2
| 7:36p |
Bitcoin Mining Data Center Update

Much is happening in the bitcoin mining data center world, and not all of it is good. While two major players in the space, BitFury Group and KnC Miner, announced big expansion projects, another player, GAW Miners, is going down in flames, the latest chapter in its story being official fraud accusations by the SEC, which called GAW a “Ponzi scheme.”
While some bitcoin mining data center capacity is leased from traditional data center providers, most of the world’s blockchain servers run in massive warehouses quickly outfitted with high-capacity power and cooling systems but not nearly as much redundancy as is designed into regular data centers.
If data center providers were somewhat wary of leasing space to bitcoin mining companies that offer mining services or host mining hardware before the shakeout that started last year, caused by a sharp drop in the value of the digital currency, they are much warier now.
C7 Data Centers has sued mining company CoinTerra, which defaulted on debt and stopped paying the data center provider for services. CoinTerra also had a sizable deployment with CenturyLink, but CenturyLink has been quiet about the 10 MW of capacity the mining firm leased from it.
Here’s a roundup of this month’s developments in the bitcoin mining data center market:
SEC Accuses GAW of Running Ponzi Scheme
Homero Joshua Garza, also known as Josh Garza, and his companies GAW Miners and ZenMiner have been accused of defrauding investors via a simple Ponzi scheme, according to the SEC. Garza allegedly promised investors high returns from his cloud mining business but never built the scale of computing power he was describing to them, paying returns to existing investors using money raised from new ones.
Garza told us in an interview last year that GAW, also known as Geniuses at Work, was operating 12 data centers at the time and that it was on track to making $150 million in sales annually. The company has been sued by the utility Mississippi Power for unpaid electrical bills, and a group of investors and customers have been pursuing legal action against GAW since earlier this year.
KnC Building Fourth Sweden Data Center
KnC Miner announced plans to build a fourth bitcoin mining data center on its campus in Boden, Sweden – a small town about 20 miles north of Luleå, home to Facebook’s massive European data center.
KnC’s new data center will have 30 MW of capacity. The company has been expanding capacity at the site rapidly, first announcing a 10 MW data center in Boden in 2014, and then unveiling plans to build out another 20 MW the same year.
Boden is home to another bitcoin mining data center operated by a company called MegaMine, which leases the facility from data center provider Hydro66.
BitFury to Launch Liquid-Cooled Facility in Republic of Georgia
BitFury Group said it will launch its third data center in the Republic of Georgia this week. This will be a 40 MW facility, where servers will be cooled using immersion-cooling technology from Allied Control, a company BitFury acquired earlier this year. The cooling system is based on a design by 3M, the company that supplies the dielectric fluid used in the system.
The cooling technology enables the company to pack up to 250 kW per rack while substantially reducing the energy consumption of the cooling system, the company said in a statement.
| 9:56p |
Cloudyn Raises $11M to Expand Private Cloud Monitoring Chops

For every dollar large enterprises spend on public cloud services, they will spend $11 on private cloud, Cloudyn CEO Sharon Wagner said.
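Taken at face value, Wagner’s ratio implies that private cloud dominates combined enterprise cloud spend. A quick sanity check, using only the stated 1:11 ratio as input:

```python
# What Wagner's 1:11 public-to-private spend ratio implies about
# private cloud's share of combined enterprise cloud spend.
public_share, private_share = 1, 11
total = public_share + private_share
print(f"{private_share / total:.0%}")  # prints "92%"
```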
This is why expanding private cloud monitoring capabilities is the primary R&D focus for the Tel Aviv-area-based cloud monitoring startup that this morning announced an $11 million Series B funding round led by Israeli VC Carmel Ventures. Cloudyn’s focus is helping companies manage capacity and cost of their cloud infrastructure.
Wagner believes the size of the private cloud market is hundreds of billions of dollars. Cloudyn works with large enterprises in aviation, ecommerce, travel, and other verticals. These companies run from 10,000 to 30,000 VM instances concurrently, and “they mostly use private cloud,” he said.
The company’s platform monitors a massive number of public and private cloud instances, and only about 30 percent of them run in the major public clouds operated by Amazon, Microsoft, or Google, according to Wagner.
In private cloud environments, Cloudyn wants to manage VMware or Hyper-V VMs but has no plans to extend further down into bare-metal server management, Wagner said. He also expects Cloudyn to support microservices capabilities enabled by application containers on top of VMs.
In addition to R&D, Cloudyn plans to use the new capital for marketing and to expand its North American presence. “We don’t have boots on the ground yet in North America,” the CEO said.
The plan is to launch an office on the East Coast first, followed by one on the West Coast.
Carmel Ventures, the latest funding round’s lead arranger, is an $800 million fund. It is a member of the Viola Group, a technology-focused private equity group with $2 billion under management.
As part of the funding deal, Carmel general partner Ronen Nir is joining Cloudyn’s board of directors.
Cloudyn raised $4 million in a Series A funding round last year.