Data Center Knowledge | News and analysis for the data center industry - Industry's Journal
 

Monday, November 10th, 2014

    Time Event
    4:30p
    A Case for Compromise: Growth of the Hybrid Cloud

    Rick Delgado is an enterprise tech commentator and writer.

    Many companies have begun investing in cloud computing, but that doesn’t mean their employees are on board. When it comes to IT departments, a fairly large number of workers would prefer to keep data right where they can see it. They’re called “server huggers,” and they’ve been some of the staunchest defenders of in-house data centers for years.

    A recent study shows that roughly 46 percent of business IT workers consider themselves server huggers, preferring to maintain total control over their data. Many of their peers, however, believe that migrating to the cloud is a better option. While both sides in the debate make valid points, there may be a solution that addresses the concerns of everyone involved.

    A case for server hugging

    Data is valuable for businesses. Being able to control that data can be equally valuable. Put simply, by choosing to move to the cloud, businesses would be giving up at least some of their control over data. That is one reason server huggers are insistent that data be kept in-house at all times. A move to the cloud also means having less control over hardware and software updates, which could affect business decisions.

    Related to the issue of control, another problem server huggers point to is the increasingly complicated landscape involving security. With headlines of security breaches and privacy leaks appearing all over the news, putting that kind of trust in a cloud vendor for storing valuable data may seem like an unnecessarily risky move. This is seen in industries such as healthcare and financial services, where a data breach may not only hurt the business but could have legal ramifications. By keeping data in on-site servers, IT workers can respond to these challenges directly and immediately tackle any problems that may arise.

    A case for cloud adoption

    There are good reasons why cloud storage has gained so much momentum over the past few years. One major reason is that the amount of data being generated and collected has grown tremendously in size. Big data is now driving many business decisions, and that requires data storage facilities that were previously out of reach of all but the largest corporations. Processing that information is made easier through cloud solutions, as many providers have the tools and software needed to make the most of the data being generated.

    On the business side of things, supporters of cloud computing contend that using the cloud will mean less reliance on local hardware, which will significantly save on costs. There would also be less need to spend precious time and resources on managing and maintaining servers and updating crucial software. Without vast data centers to house data, there would also be less energy consumption, which would result in a lower energy bill and even more savings. As for security concerns, cloud defenders say even on-site servers need or will soon need some connectivity applications, meaning many of the same security risks in moving to the cloud would still be present if businesses choose to keep everything in-house.

    A case for compromise

    Both sides have good reasons to defend their positions, but as technology advances, solutions usually emerge. The most promising is the growth of the hybrid cloud, an option that combines a public cloud with in-house data centers via a split infrastructure.

    Support for the hybrid cloud is showing signs of growth in the IT sector as well. In the same study mentioned earlier, 37 percent of those who responded have embraced a hybrid strategy, with more planning to introduce the cloud soon. In many ways, hybrid clouds feature the advantages of both solutions, with lower costs and greater efficiency, while still keeping the most sensitive, confidential information in-house and away from a third party where ownership and security remain at issue.
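    The split infrastructure described above can be sketched in a few lines of Python. This is purely an illustrative toy, not any vendor's API: a router that writes records flagged as sensitive to an in-house store and everything else to a public cloud store, with both tiers modeled here as plain dictionaries.

```python
# Toy sketch of a hybrid-cloud "split infrastructure": sensitive records
# stay in-house, everything else goes to the public cloud tier. Both
# backends are simple dictionaries standing in for real storage services.

class HybridStore:
    def __init__(self):
        self.on_premises = {}   # sensitive data stays in-house
        self.public_cloud = {}  # everything else goes to the cloud tier

    def put(self, key, value, sensitive=False):
        target = self.on_premises if sensitive else self.public_cloud
        target[key] = value

    def get(self, key):
        # Check the in-house tier first, then fall back to the cloud tier.
        if key in self.on_premises:
            return self.on_premises[key]
        return self.public_cloud.get(key)

store = HybridStore()
store.put("patient-123", {"ssn": "..."}, sensitive=True)  # kept in-house
store.put("homepage-banner", "welcome.png")               # sent to the cloud
```

    Real hybrid deployments make this routing decision at the infrastructure level rather than per record, but the trade-off is the same: control over the sensitive tier, elasticity for the rest.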

    In the case of embracing the cloud, the movement will likely continue to grow though the pace of that growth may slow. More businesses are likely to adopt a hybrid cloud, which may accelerate the transition into a full-fledged adoption of cloud computing. Time will tell if every industry will make this change, but as confidence in the cloud grows and technology advances, businesses will likely be more comfortable making the switch.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    4:36p
    TIA Data Center Workshop

    The Telecommunications Industry Association (TIA) will hold its 2014 Data Center Workshop November 11-12 at TIA Headquarters in Arlington, Virginia.

    TIA’s Data Center Workshop will address cybersecurity and other critical issues facing the entire data center value chain – from cloud service providers and equipment vendors to end-users involved in enterprise, government, finance, and more.

    The event kicks off with a tour of CenturyLink Technology Solutions’ data center in Sterling, Virginia. Day two will feature lively discussion with experts from Google, Microsoft, Dell, Intel, CenturyLink, the U.S. Chamber of Commerce and others. The group will debate emerging cybersecurity solutions for enterprise and government, developments in cross-border data flows and trade negotiations, technological advances in software-defined networking, architecture and cabling for “Beyond-100G” data centers, and much more.

    For more information – including speakers, sponsors, registration and more – follow this link.

    To view additional events, return to the Data Center Knowledge Events Calendar.

    5:18p
    Equinix to Build Fifth Tokyo Data Center

    Equinix is building its fifth Tokyo data center roughly a year after it built the fourth one there. The $43 million data center will be about 55,000 square feet, providing total capacity of 725 cabinets in two phases. The first phase will provide 350 cabinets and is scheduled to open in the first quarter of 2016.

    The data center is a response to strong demand from financial services, cloud, and content providers for interconnection services in Tokyo. TY5 will occupy a single-tenant facility being built near TY3; both sit close to Tokyo’s financial district and the major financial exchanges. TY3 is about 80,000 square feet and was reported to cost $70 million.

    Financial services are attracted to the location thanks to low-latency connectivity to exchanges and an established ecosystem of financial services customers. Equinix has about 60 customers in the financial services space in Tokyo.

    The appeal extends beyond financial services, however. As it does elsewhere in the world, Equinix also caters to cloud and content providers in Japan because of rich connectivity options at its facilities. Equinix Tokyo has access to more than 1,000 domestic and international network providers.

    “TY5 will be built in response to the various demands being driven by the digital economy across the cloud, mobile, and financial services industries,” said Kei Furuta, managing director for Equinix Japan. “As Tokyo is a major international finance center that houses some of the world’s largest investment banks, trading platforms, and insurance companies, it is important that we provide the necessary interconnection services to meet these increasing demands.”

    According to research firm Forrester, the data center market in Japan is forecast to reach US$14.2 billion by 2016.

    Equinix continues to expand in Japan and across Asia Pacific in general. Shortly after announcing its fourth Tokyo data center in 2013, the company announced another one in Osaka, Japan’s second-largest market. The first Osaka data center was opened in partnership with K-Opticom, a large access provider, with support from O-BIC, an Osaka government agency.

    The company recently launched a service that provides private network links to Google’s cloud in 15 markets, including Tokyo.

    5:47p
    Survey: Nearly Everybody Will be Using DevOps by End of 2015

    DevOps adoption is expected to reach 93 percent by the end of 2015, according to a new survey commissioned by Rackspace. A big portion of that number consists of respondents who have already implemented DevOps practices (66 percent), with more than a quarter of respondents planning to implement DevOps by the end of the year. A significant majority (close to 80 percent) of respondents are using some sort of outsourced DevOps services, a key finding for providers, such as Rackspace, that offer services helping customers use DevOps tools and practices.

    DevOps combines many of the roles of systems administrators and developers. It is a set of principles driving greater collaboration between the different groups responsible for taking a product or service to market. IT automation products are core DevOps tools, and agile software development is its cornerstone, as IT environments become more dynamic. Major technology shifts, including Internet business and collaboration technologies, open source software, and cloud computing, are prompting the change.
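    As a rough illustration of the automation idea behind such tools, the toy pipeline below runs stages in order and stops at the first failure, the way a CI/CD system gates a release. The stage names and stage functions are hypothetical, not taken from any real product.

```python
# Minimal sketch of an automated delivery pipeline: each stage must pass
# before the next one runs, so a failed staging deploy blocks production.

def run_pipeline(stages):
    """Run (name, func) stages in order; return (succeeded, log)."""
    log = []
    for name, stage in stages:
        ok = stage()
        log.append((name, ok))
        if not ok:
            return False, log  # fail fast: later stages never run
    return True, log

stages = [
    ("unit-tests", lambda: True),
    ("build-image", lambda: True),
    ("deploy-staging", lambda: False),  # simulated failure
    ("deploy-production", lambda: True),
]
succeeded, log = run_pipeline(stages)
# succeeded is False and "deploy-production" never ran
```

    Real DevOps toolchains add rollbacks, notifications, and parallel stages, but the fail-fast gating shown here is the core pattern.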

    Rackspace has reason to both commission the survey and tout the results. It has been building up its DevOps services, and the survey supports this strategy. The Texas company has focused on managed services for cloud in order to distance itself from other cloud infrastructure service providers.

    It’s known for its “Fanatical Support,” a message and philosophy the company is evolving for the cloud services world. The company extended Fanatical Support to DevOps and automation last year and launched 18 new services this year. DevOps Automation is the highest tier of its managed services.

    Vanson Bourne performed the study, interviewing around 700 IT decision makers across the U.S., U.K., and Australia. The key takeaways, according to Chris Jackson, CTO of DevOps at Rackspace, are that the operations team is the primary driving force behind the change to DevOps, and customer satisfaction is the biggest benefit.

    Given that DevOps is still in its infancy, such high penetration is almost hard to fathom. It is also hard to capture the maturity and stage of a company’s DevOps transition. The survey does examine which DevOps practices have been implemented, with half saying that development and operations teams have been fully integrated, suggesting mature DevOps adoption:

    A granular breakdown of how DevOps is being executed across the organization, courtesy of Rackspace

    The survey shows that DevOps is important, with almost all companies strategizing how to move to this approach. The key priority for future implementations is to align DevOps goals with business goals.

    “The results of the DevOps Adoption Study validate that there is significant recognition among global businesses that DevOps is fundamental to fully exploiting the cloud in the pursuit of driving rapid innovation,” said Prashanth Chandrasekar, general manager of Rackspace’s DevOps Business Segment.

    What’s the big deal about DevOps?

    The biggest business benefit of switching to DevOps is the increase in customer satisfaction (over 60 percent), according to the survey. Other benefits include reduced spend on IT infrastructure and a reduction in application downtime or failure rates, each cited by about half of respondents. One-third reported increased sales and employee engagement as benefits.

    Over 700 IT decision makers break down the IT benefits of DevOps (Source: Rackspace)

    “Through cultural alignment, automated deployments, and agile infrastructure, businesses are using DevOps methodologies to reduce time to market by responding rapidly to customer feedback — ultimately driving significant business value and efficiency,” said Chandrasekar.

    The technical benefits of DevOps are faster delivery of new features, a more stable operating environment, increased innovation, and better collaboration.

    6:18p
    Stratoscale Raises $32M to Build Docker-Supporting OpenStack Clouds on Commodity Servers

    Data center software startup Stratoscale announced it has raised $32 million to take on VMware and other cloud giants with its data center operating system that creates a converged infrastructure atop commodity x86 servers that supports both Docker containers and traditional virtualization.

    Attracting some big industry names, the Israeli vendor’s Series B funding round was led by Intel Capital, with participation by Cisco and SanDisk, among others. Its Series A investors Battery Ventures and Bessemer Venture Partners participated as well, bringing the total funding amount to $42 million.

    Stratoscale’s software automatically distributes physical and virtual assets in real time. The hardware-agnostic solution supports hypervisor-based virtualization, OpenStack, and Docker containers.

    “With over $40 million total funding since inception, we are positioned to deliver the first pure-play software architecture for hyper-convergence,” said Stratoscale co-founder and CEO Ariel Maislos. “This additional funding will allow us to complete our product development as well as build out global sales and marketing.”

    “Stratoscale is onto something foundational in enterprise IT,” said Scott Tobin, general partner at Battery Ventures. “Today’s data centers are looking for solutions which help increase overall efficiency. Stratoscale’s technology delivers this in a software-based solution which allows customers to select their own hardware platforms to run on.”

    7:39p
    Enterprise Hadoop Startup Hortonworks Files for IPO

    Hortonworks has become one of the first startups built around the open source big data software framework Apache Hadoop to file for an IPO.

    The company submitted IPO documents to the U.S. Securities and Exchange Commission Monday. The amount and price of shares it expects to sell has not been determined, according to a Hortonworks statement.

    Created by Yahoo, Hortonworks was one of the first companies to go after the enterprise Hadoop market, turning the open source technology into a software business. The framework, which enables users to turn cheap commodity servers into powerful compute clusters that can crunch through a lot of data using parallel processing techniques, has grown in popularity in recent years as companies look to monetize the massive amount of user data in their possession.
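    The parallel processing model behind Hadoop is MapReduce. The sketch below illustrates the pattern in plain Python, without any Hadoop APIs: a map phase emits (word, 1) pairs, a shuffle groups them by key, and a reduce phase sums each group. On a real cluster, each phase is spread across many commodity machines.

```python
# Plain-Python sketch of the MapReduce pattern (word count); on Hadoop,
# map and reduce tasks run in parallel across the cluster's nodes.
from collections import defaultdict

def map_phase(documents):
    # Emit a (word, 1) pair for every word occurrence.
    for doc in documents:
        for word in doc.split():
            yield word, 1

def shuffle(pairs):
    # Group all values by key, as the framework does between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Sum the counts for each word.
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["big data big clusters", "big servers"]
counts = reduce_phase(shuffle(map_phase(docs)))
# counts == {"big": 3, "data": 1, "clusters": 1, "servers": 1}
```

    Because each (word, 1) pair can be produced and each group summed independently, the same job scales from one laptop to thousands of cheap servers, which is the property the article describes.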

    Some of the biggest Hortonworks competitors in the enterprise Hadoop space are Intel-backed Cloudera, MapR (which has close ties to Google), and Pivotal, an EMC subsidiary led by former VMware CEO Paul Maritz.

    Revenue grows quickly, but so do losses

    While its revenue has been growing quickly, Hortonworks has not yet been able to turn a profit.

    Its platform is open source and available free of charge. The company makes money by selling support contracts and professional services.

    Hortonworks reported about $7.7 million in support subscription revenue and about $3.3 million in professional services revenue for the fiscal year ended in April 2013. Total revenue of nearly $11 million for that 12-month period was a significant increase over the preceding fiscal year’s revenue of about $1.6 million.

    The company’s revenue has been growing much faster recently, however. It made about $33.4 million in sales for the nine months ended on September 30 this year, including about $19.2 million in support subscriptions and $14.2 million in professional services.

    After subtracting the cost of providing support and professional services, as well as operating costs such as R&D, sales and marketing, and administrative expenses, Hortonworks’ total loss for the nine months ended September 30 was about $86.7 million.

    In the documents it filed with the SEC, the company acknowledged that it has incurred losses every year it has been in business. It also warned potential investors that because of the uncertainty of the market it is operating in, it may never become profitable.

    “Because the market for our solution is rapidly evolving and has not yet reached widespread adoption, it is difficult for us to predict our future results of operations,” the document read. “We may not achieve sufficient revenue to attain and maintain profitability.”

    The company said it expects its operating expenses to continue growing as it hires more people, expands distribution channels, and invests more in R&D.

    Hundreds of paid customers; high-profile partnerships

    Hortonworks was founded in 2011 and launched its core offering, a Hadoop platform for enterprises, one year later. It had about 230 support subscriptions and nearly 300 customers total as of the end of September of this year.

    The company has integration partnerships with a handful of IT vendor and service provider heavy-hitters, including HP, Microsoft, Red Hat, SAP, Teradata, Yahoo, and Rackspace. HP also holds a $50M equity stake in the company.

    Yahoo got dibs on 6.5M very cheap shares

    Hortonworks’ entire founding team came out of Yahoo, where they worked on developing and deploying Hadoop and MapReduce at the Internet company.

    As the company that created the startup, Yahoo holds the right to buy 6.5 million shares of preferred stock at $0.005 per share when the offering commences. Yahoo secured the “preferred stock warrant” when it spun Hortonworks out in 2011.

    9:00p
    At Least 18 Election Websites Offline During the U.S. Midterm Elections

    logo-WHIR

    This article originally appeared at The WHIR

    On the day of the U.S. midterm elections, the Contra Costa County Department of Elections website was inaccessible starting at 7:20 a.m. local time.

    And it wasn’t alone: the Bay Area News Group reported that 18 election websites run by Florida-based SOE Software across the country were down for most of election day.

    According to local news reports, Contra Costa County officials said the hosting of the website was contracted to SOE Software, whose own site was also offline at the time. Election officials said SOE Software was working to fix the problem, and the sites were back online this week.

    The main function of election websites is to provide information on where voters can find polling stations, but they also provide features such as Vote by Mail ballot registration.

    Officials recommended that voters needing to find their polling station visit Get to the Polls, a website sponsored by the Pew Charitable Trusts and others.

    It’s possible that the election websites were unprepared for the amount of traffic they would get on election day, but it’s also likely that a Distributed Denial of Service attack flooded SOE Software’s servers with requests, blocking legitimate traffic from reaching the websites it hosts.

    SOE Software did not respond to a request for comment from The WHIR.

    This article originally appeared at: http://www.thewhir.com/web-hosting-news/least-18-election-websites-offline-u-s-midterm-elections

    9:16p
    Joyent Wants to Provide Bare-Metal Cloud for Docker Containers

    Cloud infrastructure service provider Joyent is getting behind Docker containers in a big way.

    The San Francisco-based company is working to give users the ability to run Docker container images directly on the hardware in its data centers. Joyent uses application containers of its own to deliver its Infrastructure-as-a-Service offering but now wants the industry to standardize on Docker.

    Although it has been around for less than two years, Docker has enjoyed widespread support from developers and many heavyweight IT vendors and service providers, such as Google, IBM, Microsoft, and Red Hat. Led by the eponymous San Francisco-based company, the open source technology makes it possible to deploy an application quickly on any type of infrastructure, be it a laptop, a bare-metal server, a VM, or a cloud.

    Joyent’s operating system SmartOS uses a concept similar to Docker, but the company has not advertised that fact until recently, when it saw all the hype about application containers Docker has created.

    Late last month, Joyent raised a $15 million funding round and said it would work to make Docker containers part of its service portfolio, but the details of that integration were scant. Last week, however, Joyent CTO Bryan Cantrill wrote a blog post fleshing out the company’s strategy around Docker, indicating a lot of support.

    “We see the great promise of Docker, and we look forward to working with the community to develop (and upstream!) the abstractions that will make Docker the de facto standard for application containers for developers,” he wrote.

    Joyent has what Cantrill described as a “nascent Docker API endpoint” for its SmartDataCenter orchestration software. The company’s engineers are working to combine that API with an ability to execute Linux binaries on SmartOS natively and enable users to run Docker images directly on its hardware. That effort is still “primordial,” Cantrill wrote.

    Earlier this month, Joyent open sourced SmartDataCenter and Manta, its object storage platform. SDC is the container-based orchestration software the company credits with the high performance of its cloud, since it circumvents the server virtualization layer usually present in cloud infrastructure stacks.

    9:30p
    Google Reveals Alarming Success Rates For Manual Hijacking of Accounts

    logo-WHIR

    This article originally appeared at The WHIR

    A Google study released Thursday found hackers located mostly in China, Ivory Coast, Malaysia, Nigeria, and South Africa are much more successful at obtaining account information than expected. Requests for personal and login information made through fake websites work up to 45 percent of the time. The researchers examined Google data from 2011 to 2014 and found people entered information into such sites at an alarming average rate of 14 percent.

    “Online accounts are inherently valuable resources—both for the data they contain and the reputation they accrue over time. With the advent of the cloud, the most intimate details of our lives are contained on remote servers in a single account,” according to the study. “This makes account theft, or account hijacking, a lucrative monetization vector for miscreants.”

    Despite public awareness of phishing tactics and other cyber attacks, hackers are still able to get enough information to access email accounts and, eventually, bank accounts. Using phishing, malware, or simply guessing the account password, hackers are able to gain control of an email account. Within 30 minutes of obtaining the target’s login, they are already changing passwords to lock the owner out and looking for financial account details they can exploit.

    With a number of recent high-profile hacks at JP Morgan, Home Depot, Kmart, and Dairy Queen, it’s not surprising that a recent Harris poll found Americans’ concern over cybersecurity is even higher than worries over national security.

    While most previous studies have focused on attacks by automated botnets or professional spamming infrastructure, Google chose to focus on manual hijacking. “Manual hijackers spend significant non-automated effort on profiling victims and maximizing the profit—or damage—they can extract from a single credential,” the report explains. “In contrast to automated hijacking, manual hijacking is exceedingly rare. We observe an average of 9 incidents per million Google users per day. However, the damage manual hijackers incur is far more severe and distressing to users and can result in significant financial loss.”

    If the information they find isn’t lucrative enough, they quickly move on. “The existence of this profiling phase is one of the most surprising insights we gained by fighting manual hijackers,” said the researchers. “Instead of blindly exploiting every account, hijackers take on average 3 minutes to assess the value of the account before deciding to proceed.”

    The study was able to link manual hijacking with phishing, which has been anecdotally perceived as the main way hackers steal user credentials. Although app stores and social networking logins are sometimes the focus of hackers, they usually concentrate their efforts on the victims’ email (35 percent of the time) and bank information (21 percent).

    Fortunately, there is a high rate of account recovery when backup systems are in place. When a phone number is given, SMS was used to recover accounts at a rate of 81 percent, while using a secondary email is successful three-quarters of the time. Without these systems in place, account recovery drops to 14 percent when secret questions or manual review are used.

    Google has several strategies in place to detect suspicious activity on its side of the equation. For users, they recommend the best strategies to prevent and mitigate this type of hack are two-factor authentication and account recovery strategies. “Iron tight Account recovery–Finally we can’t stress enough how important it is to invest into having a very secure and reliable account recovery system. We continuously improve our recovery process to ensure that it is easy for legitimate users to get their account back while keeping hijackers out,” said the researchers. “Developing novel ways to validate user identity both for login challenge and account recovery purpose is something that we view as critical and we would love to see more research done in this space.”

    This article originally appeared at: http://www.thewhir.com/web-hosting-news/google-reveals-alarming-success-rates-manual-hijacking-accounts

