Data Center Knowledge | News and analysis for the data center industry - Industry's Journal
 

Monday, August 22nd, 2016

    12:00p
    Peak 10 and GI Find Winning Formula in Secondary Data Center Markets

    While secondary data center markets don’t often garner headlines, they can certainly be profitable.

    Since 2000, Charlotte-based Peak 10 has evolved with its customers to provide infrastructure, cloud, and managed services. The company has national partner alliances and continues to expand service and support offices. However, Peak 10 remains focused on serving small and medium-sized enterprises with 27 data centers located in 10 secondary markets.

    In 2014, the company was sold to GI Partners, a private equity firm that specializes in companies that lead in “fragmented or overlooked markets” and can grow via acquisition. In other words, strong consolidators in secondary markets.

    While Peak 10 doesn’t disclose its revenue publicly, it has grown to more than 400 employees and more than 2,600 customers, its senior VP and chief operating officer, Jeff Spalding, told Data Center Knowledge in an interview. The company recently opened a sales and engineering office in Chicago to support a key channel partner, Avant Communications. It operates similar support offices in Philadelphia, Orlando, Phoenix, Salt Lake City, Memphis, and Austin. Small and medium businesses looking for high-touch colocation, cloud, and managed solutions are the company’s bread and butter, which is why local staff presence in its data center markets is important.

    Peak 10 offers “tailored solutions” rather than custom solutions for customers, Spalding explained. It is a subtle difference, but it allows for faster implementation and standardization to make support easier across markets.

    Over the years, the company has added more “building blocks” to construct these tailored solutions, including: Infrastructure-as-a-Service, Disaster Recovery-as-a-Service, cloud and managed hosting, and network services. Two of the latest building blocks are Encryption-as-a-Service, and Object Storage.

    See also: How TierPoint Quietly Built a Data Center Empire in Secondary Markets

    Measured Growth

    Peak 10 began in 2000 with a single Jacksonville data center. Charlotte, Tampa, and Raleigh followed between 2001 and 2003, one market per year.

    Over the following five years Peak 10 grew primarily by acquisitions, including: Xodiax in Louisville in 2004, Intercerve in Charlotte in 2006, bayMountain in Richmond in 2007, and 1Vault Networks in Ft. Lauderdale in 2009.

    There were no acquisitions from 2010 to 2014 while private equity firm Welsh, Carson, Anderson, & Stowe was in control. In June of 2014, GI Partners acquired Peak 10 from the private equity ownership group, reportedly for $800-$900 million.

    At that time, Peak 10 operated 24 data centers, totaling 270,000 square feet, in 10 markets throughout the southeast region of the US.

    Breaking New Ground

    The summer of 2014 was a pivotal time for Peak 10. That June, when GI bought the company, Peak 10 was also breaking ground on its first ground-up data center development in the Tampa market.

    Read more: Peak 10 Building Tampa Data Center From Scratch

    David Jones, founder and current CEO of Peak 10, said at the time, “Supporting our mission to continue expanding our robust national presence, we are excited to see the construction of our newest generation of data centers underway.”

    Once it completes construction of its latest data centers in Richmond and Tampa, Peak 10 will have close to 30 facilities in the same 10 data center markets.

    GI Partners: Buy It, Grow It, Sell It

    When it comes to having an eye for talent and opportunity in the cloud computing and data center industry, it would be hard to come up with a more impressive resume than that of San Francisco-based GI.

    The firm manages over $12 billion in institutional assets and is currently the sole investor in Peak 10.

    GI is well-known for acquiring and growing portfolio companies and then harvesting the profits by selling, or taking them public. Over the years, its data center and cloud computing investments included:

    • Digital Realty Trust: IPO of the first data center REIT in 2004; GI Partners fully exited by 2007
    • Softlayer Technologies: sold to IBM in 2013
    • The Telx Group: GI sold Telx to ABRY/Berkshire Partners in September 2011; Digital Realty acquired Telx in October 2015

    Read more: Digital Realty Closes $1.9B Telx Acquisition

    This past July, GI entered into a JV with Corporate Office Properties Trust (COPT) to buy half of an existing portfolio of six single-tenant data centers totaling nearly 1 million square feet.

    As a privately held company, Peak 10 has the advantage of being able to play its cards close to the vest. The executive team has been together for over a dozen years, and Peak 10 has a distinct culture.

    While Peak 10 has a marketing presence in the Northeast, Midwest, and Southwest, there have not been any new market expansions since 2010. Instead, the company has invested in building up its portfolio of solutions.

    Spalding mentioned that providing capital and resources to support large national partners could become a stepping stone leading to new data center markets. However, when it comes to expansion into other markets through more acquisitions, they would have to be a good fit “on the people side, as well as the technology,” he explained.

    The priority remains supporting growth of existing Peak 10 customers with data centers and tailored solutions adjacent to their local and regional business locations. Spalding said, “We started our company in 2000 with a clear focus on the customer. Even though the marketplace has changed dramatically since then, how we operate has not. We have remained grounded in providing the best customer experience possible. It is why we remain successful.”

    3:00p
    HIPAA Breach Case Results in Record $5.5M Penalty
    Brought to you by MSPmentor


    The costs for mishandling electronic protected health information (ePHI) continue to skyrocket.

    Advocate Health Care Network has agreed to pay a record $5.5 million to settle claims that it violated the security rule of the Health Insurance Portability and Accountability Act (HIPAA), resulting in data breaches that compromised the records of roughly 4 million people.

    The Aug. 4 settlement – the largest in the history of HIPAA enforcement actions – stemmed from three separate data breaches that occurred within months of each other in 2013.

    Federal authorities said Advocate failed to conduct mandatory risk assessments, properly safeguard laptops containing ePHI or obtain a required business associate agreement with a third-party contractor that handled medical billing.

    See also: What HIPAA Means for Data Centers

    “We hope this settlement sends a strong message to covered entities that they must engage in a comprehensive risk analysis and risk management to ensure that individuals’ ePHI is secure,” said Jocelyn Samuels, director of the U.S. Department of Health and Human Services’ Office of Civil Rights. “This includes implementing physical, technical, and administrative security measures sufficient to reduce the risks to ePHI in all physical locations and on all portable devices to a reasonable and appropriate level.”

    Security of ePHI has become a growing concern for managed services providers (MSPs) with customers in health care.

    MSPs with expertise in HIPAA compliance can realize a huge market opportunity by managing sensitive patient data for health care entities.

    But the lucrative vertical also carries substantial financial risks in the form of penalties and legal costs if ePHI is mishandled.

    See also: IT Services Provider Pays $650,000 HIPAA Breach Fine

    Under HIPAA rules, MSPs are considered “business associates,” and must sign agreements with the health care customer assuring they will abide by all data security requirements.

    One of the three Advocate breaches involved Blackhawk Consulting Group, which provided billing services.

    In that case, the ePHI of more than 2,000 Advocate patients was compromised when an unauthorized third party gained access to Blackhawk’s network.

    “Advocate failed to obtain satisfactory assurances in the form of a written business associate contract that its business associate would appropriately safeguard all ePHI in its possession,” federal officials said in a statement.

    The other two breaches involved two separate thefts of laptop computers, containing private information of nearly 4 million people, from Advocate facilities.

    Advocate Health Care Network is the largest fully integrated health care system in Illinois, authorities said.

    The latest penalty brings the total amount of settlements for HIPAA security violations to $20.3 million this year, up sharply from $6.2 million in all of 2015.

    This first ran at http://mspmentor.net/msp-mentor/hipaa-breach-case-results-record-55-million-penalty

    3:30p
    The Best Tools to Predict and Manage Cloud Costs
    Brought to you by IT Pro


    Cloud pricing can be a frustrating experience. Everything is charged by different metrics. Some of the prices are spelled out, some are hidden behind paywalls or aren’t clear until you get your monthly bill and realize you forgot to turn off an instance that is chewing up your wallet. Some are charged by usage, others by the month. All this means that keeping track of cloud costs can be nearly a full-time job.

    Thankfully, there is software that can help predict your bill and figure out whether you would be better off moving your VMs to another provider. Indeed, there is a wide choice of these cloud cost comparison services. Keep in mind that these tools are meant to estimate your monthly cloud bills and don’t really get into application performance monitoring itself. For that you will need other tools, such as ManageEngine’s Applications Monitor, Scalr (which offers some costing and management features), or Datadog, to name just a few.

    Before you use these services, first look at the individual costing pages that each cloud provider offers for free: here they are from Rackspace, Amazon, Azure, Google, and IBM/SoftLayer (the latter annoyingly requires registration). The issue with these free costing pages is that they can be more confusing than helpful, because they assume you know your way around specifying the exact instance size and parameters. For example, the simple question of how many compute hours are included in your monthly bill isn’t as easy as it seems. Some of these calculators assume a 720-hour month, no matter what the calendar says.
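    The 720-hour assumption is easy to sanity-check yourself. A minimal sketch in Python (the hourly rate below is a made-up illustration, not any provider’s actual price):

    ```python
    from calendar import monthrange

    # Hypothetical hourly rate for illustration only.
    HOURLY_RATE = 0.10  # USD per instance-hour

    def month_hours(year: int, month: int) -> int:
        """Actual number of hours in a calendar month."""
        return monthrange(year, month)[1] * 24

    def estimated_bill(year: int, month: int, rate: float = HOURLY_RATE):
        """Compare a flat 720-hour-month estimate with the real calendar."""
        flat = 720 * rate
        actual = month_hours(year, month) * rate
        return flat, actual

    # August 2016 has 31 days, i.e. 744 hours -- a 720-hour calculator
    # undercounts a full month of usage by 24 hours.
    flat, actual = estimated_bill(2016, 8)
    print(f"720-hour estimate: ${flat:.2f}, calendar estimate: ${actual:.2f}")
    ```

    For a 31-day month, a calculator that assumes 720 hours will quietly underestimate an always-on instance by a full day of charges.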

    Second, look at the free comparison reports of the cloud providers available from third parties. For example, one of CloudSpectator’s reports shows that Microsoft Azure provides the lowest-cost block storage, while SoftLayer is the least costly for large Windows instances. Download these reports and study them before you pony up any hard cash for the paid services. If your cloud computing bill is less than $1,000 a month, you are probably not going to get much further, as most of the paid services are aimed at managing larger and more costly cloud installations.

    Figuring out the price tag for these costing services is also an exercise in patience. Some of them are completely free, such as SolarWinds and CloudHarmony. Some have freemium models: you get a small collection of providers for free, with more in-depth reports and analyses as you pay. For example, Cloudyn has a free tier with limited analysis, a paid service at $229 a month, and custom reports that cover more and cost more.

    Third, figure out whether you should go wide (a service that covers as many providers as possible) or deep (diving into just a few cloud providers and examining many of their particular service offerings). Sadly, no one service does both well.

    Finally, check out some typical questions that you might want to have answered by each costing service or through your own research:

    • Who is the cheapest cloud provider for a particular workload size and duration? Most of these services can answer this question adequately. When you add other cloud services beyond CPU, disk, and memory, though, it can be difficult to compare one provider with another.
    • What can you save by using AWS reserved instances (and similar offerings from other providers) if you plan ahead? Both Cloudability and PlanForCloud have special tools to help with your planning here. CloudHealth also includes an item in its reports that shows the impact of reserved instances on your overall bill.
    • Should you be using burstable CPUs? This is another way to cut costs. Some of the players will let you overcommit your workload for a limited time, and some do so for no additional charge. While none of these costing services can do the math, it is something to keep in mind.
    • Should you distribute your workloads? It might make sense to split your workload into smaller instances that are cheaper to run, in some cases significantly cheaper. Cloudorado can perform this kind of sensitivity analysis. One nice feature of PlanForCloud is that it lets you assemble deployment scenarios by mixing and matching across its six supported providers, an interesting twist that none of the other costing services offer.
    • How much support do you really need? The level of support you pay for could be costing you money. However, you may not know what you actually need until you have been using a particular platform for a couple of months, so take a closer look down the road and see if you can reduce your support bill. Both PlanForCloud and Cloudorado (to a lesser extent) offer ways to analyze the effect of scaling back your support plan if you don’t need all the hand-holding.
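    The reserved-instance question above boils down to a break-even calculation. A rough sketch with entirely hypothetical prices (real rates vary by provider, instance type, region, and term, so plug in numbers from the provider’s own calculator):

    ```python
    # Hypothetical prices for illustration -- not actual AWS rates.
    ON_DEMAND_HOURLY = 0.10    # USD per hour, pay as you go
    RESERVED_UPFRONT = 500.00  # USD, one-year term, all upfront
    TERM_HOURS = 365 * 24      # hours in a one-year reservation

    def break_even_utilization(on_demand: float, upfront: float,
                               term_hours: int) -> float:
        """Fraction of the term the instance must run before the
        reservation becomes cheaper than paying on-demand."""
        return upfront / (on_demand * term_hours)

    util = break_even_utilization(ON_DEMAND_HOURLY, RESERVED_UPFRONT, TERM_HOURS)
    print(f"Reservation pays off above {util:.0%} utilization")
    ```

    With these made-up numbers, the reservation only saves money if the instance runs more than about 57 percent of the year, which is why the planning tools mentioned above matter: the answer flips entirely depending on how steady your workload really is.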

    One other thought: the cloud is a dynamic place. Instance characteristics come and go, and each player changes prices almost continuously; AWS alone has made dozens of price changes over the years, mostly reductions. Any costing analysis you perform could be outdated in a few months’ time, so run it repeatedly to check whether anything has changed.


    This first ran at http://windowsitpro.com/cloud/best-tools-predict-and-manage-cloud-computing-costs

    4:49p
    Equinix Launches Its Fifth Australia Data Center in Sydney

    Equinix announced the launch of its latest Australia data center. The company said it has invested $97 million in the SY4 facility in Sydney, which already has 20 customers on board, including Servers Australia, one of the country’s largest dedicated hosting firms, and Infrastructure-as-a-Service cloud provider Zettagrid.

    This is Equinix’s fifth data center in Australia. The company has four facilities in Sydney, including the new one, and one in Melbourne. Equinix, the world’s largest data center provider, has close to 30 data centers in Asia Pacific and close to 150 globally.

    According to Equinix, its Sydney campus hosts equipment for more than 600 companies, including 140 network providers and more than 225 cloud and IT service providers. There, companies get access to key Australian peering points and multiple submarine cable systems, such as the Australia-US Southern Cross and PIPE Pacific Cable, which runs from a coastal town close to Sydney to Guam.

    The facility will also provide access to the Hawaiki cable. Amazon Web Services is one of four anchor customers of the cable – currently in the early stages of construction – which will connect the US (landing both in Hawaii and on the mainland), Australia, and New Zealand.

    Read more: Amazon’s Cloud Arm Makes Its First Big Submarine Cable Investment

    The data center’s first phase has capacity for 1,500 IT cabinets. The second and final phase will double that capacity, bringing the facility’s total usable floor space to about 135,000 square feet.

    Here’s a cool timelapse video of construction of Equinix’s SY4 data center in Sydney:

    5:20p
    Ascending Tech Dominates S&P 500 Like No Time Since Dot-Com Bust

    (Bloomberg) — Don’t look now, but technology companies are exerting more control over the US stock market than at any time since the internet bubble.

    Fueled by three-year rallies in which Microsoft and Alphabet doubled, Amazon tripled and Facebook surged fivefold, computer and software stocks have increased to almost 21 percent of the S&P 500 Index’s value, near a 15-year high. The distance between tech and the next-biggest group, banks, is close to the widest ever.

    While the divergence rings warning bells for anyone who lived through the crash of 2000, tech’s ascent has its virtues, and is in some ways a sign of the market’s health. For one, it reflects the diminishing influence of banks, which held a much larger share of the S&P 500 in the years before the financial crisis. It’s also evidence of rationality: tech is one of the only industries where earnings continue to expand.

    “The underlying economy is moving more toward technology, so to have it make up a bigger part of the market is probably not a disconnect,” said Brent Schutte, Milwaukee-based chief investment strategist of Northwestern Mutual Life Insurance Co.’s wealth-management unit, which oversees $89 billion. “This is not me saying that technology is cheap,” he said. “It’s just taking away that argument that this is a bubble waiting to happen.”

    DCK: Two of the largest US-based data center providers, Equinix and Digital Realty Trust, are included in the S&P 500. Digital Realty was included this past May.

    Market Anxiety

    Swelling in the market’s largest group comes amid warnings from bears such as billionaire investor George Soros that stocks are at risk for a repeat of the 2008 crisis. While widening valuations and demand for safety trades such as utilities and low-volatility shares have stirred anxiety, the resurgence in tech shows one cornerstone of the seven-year bull market is behaving as it normally does.

    The Nasdaq Composite Index and S&P 500 Information Technology Index have both rallied 23 percent since markets bottomed in February. Semiconductor companies like Nvidia, Applied Materials and Micron Technology have led the S&P tech gauge, while only one of its 68 members, First Solar, is down over the stretch. The Nasdaq closed at an all-time high on Aug. 15, and has climbed for nine consecutive weeks, the longest since 2009.

    Stocks rose at 12:30 p.m. in New York, with the Nasdaq rising 0.1 percent.

    Tech companies are extending leadership at the fastest rate in four years. Their representation in the S&P 500 has increased by more than 1 percentage point to 20.9 percent this quarter, about 5 percentage points higher than financial shares. From Apple to Microsoft, mega techs now occupy half of the top 10 spots in ranks of the most valuable American companies, matching the number at the peak of internet mania.

    Profit Growth

    Unlike the dot-com era, when investors snapped up web companies with promise but little profit, today’s gains are built on earnings, driven by demand for products such as Apple’s iPhone and Google’s web ads. Their strength was on display during the earnings season, when companies delivered the biggest beat among industries. While third-quarter growth estimates just turned negative for the S&P 500, the group is expected to expand profit by 2.8 percent.

    “The phenomenon is as much of a function of tech being a strong growing sector in terms of earnings as it is financials fading the scene,” said Rich Weiss, the Los Angeles-based senior portfolio manager at American Century Investments, which oversees about $154 billion. “It’s healthy growth. I don’t believe we need to worry about a tech bubble here or in the near future.”

    Longer Reign

    Computer and software makers have been the biggest industry in the S&P 500 for the duration of the bull market that began in March 2009, with their influence widening at the end of 2015 and again now. That’s a far longer reign than the 3 1/2-year stretch that began at the end of 1998, which saw their share of the index reaching almost 35 percent. This time around, the weighting has generally held below 21 percent.

    “Does the fact alone that the largest sector in the index eclipses a weighting threshold make the market vulnerable? At a high level, the answer is no, but with a caveat,” said David Kahn, managing director at Convergent Wealth Advisors in Los Angeles, where the firm oversees about $4.5 billion. “What concerns us is when we see a sector commanding far more in market cap than in profit participation relative to history. That is an indication of a momentum-based bubble.”

    By that measure, a bubble does not appear to be in the making. Information technology accounted for about one-fifth of the full index’s operating earnings in the 12 months through March, almost precisely its market weight. In 2000, when technology stocks commanded about a third of the gauge, the group’s profits were 15 percent of the total.

    Expanding earnings have helped keep valuations in check. While faltering profits have driven up the S&P 500’s multiple to 18.6, the highest level in more than a decade, computer and software companies are trading at a ratio that’s 11 percent below their 20-year average. At 18 times forecast income, the industry is valued at a discount to the market.

    “If tech gets to the point where they’re not growing at all, then it would be really a red flag — that would signal one of the strongest and fastest growing areas of the economy is stalled out,” said Curtis Holden, a senior investment officer in Houston at Tanglewood Wealth Management, which oversees $870 million. “There is probably some realization in the market that ultimately for stock values to go up, there’s got to be some growth and tech has a little bit of edge.”

    7:17p
    Report: Beijing Bans Data Centers with High PUE

    A recent push by the Chinese government to clean up the environmental footprint of the country’s rapidly growing data center industry is focused on two fronts: energy efficiency and renewable energy.

    While there are no national-level data center efficiency regulations yet, the City of Beijing, one of the world’s most polluted cities, recently issued an outright ban on data centers whose Power Usage Effectiveness, or PUE, is above 1.5, according to a report in Environment 360, a Yale School of Forestry and Environmental Studies publication.
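    PUE itself is a simple ratio — total facility energy divided by the energy delivered to IT equipment — so checking a facility against a cap like Beijing’s reported 1.5 is straightforward. A minimal sketch with illustrative numbers, not any specific facility’s measurements:

    ```python
    # Reported Beijing cap, per the Environment 360 article.
    BEIJING_PUE_CAP = 1.5

    def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
        """Power Usage Effectiveness: total facility energy over IT
        equipment energy. 1.0 is the theoretical ideal (zero overhead)."""
        return total_facility_kwh / it_equipment_kwh

    # Illustrative annual energy figures (kWh) for two hypothetical sites.
    for total, it in [(1_400_000, 1_000_000), (2_200_000, 1_000_000)]:
        ratio = pue(total, it)
        status = "within" if ratio <= BEIJING_PUE_CAP else "over"
        print(f"PUE {ratio:.2f}: {status} the reported cap")
    ```

    At the 2.2 average the article cites for Chinese facilities, cooling and power overhead consume more energy than the IT gear itself, which is what a cap like this is designed to squeeze out.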

    There are no official statistics on the total number of data centers in China, but the government estimates that the country’s data centers consume more energy than Hungary and Greece combined.

    About 675 million people, or half of China’s population, were connected to the internet in 2014, according to Internet Live Stats. For the sake of comparison, about 280 million internet users lived in the US that year, while the country’s internet penetration was 87 percent.

    A recent US government study estimated that all data centers in the US consumed about 70 billion kilowatt-hours of electricity in 2014. Growth in the industry’s total energy consumption has slowed down substantially since 2010, which was due primarily to data center efficiency improvements.

    Read more: Here’s How Much Energy All US Data Centers Consume

    Most data centers in China are grossly inefficient, operating at a PUE of 2.2, according to Environment 360. An official with the Beijing-based China Green Data Center Advancing Federation told the publication that some companies have already left the city because of its PUE restrictions.

    Perhaps the biggest national-level data center energy program in China is the green data center pilot program by the Ministry of Industry and Information Technology. The program enlists companies that are building data centers with innovative energy efficiency design features and serves as a hub for information sharing and establishment of data center efficiency and environmental standards.

    See also: Pollution in China Makes Free Cooling Difficult for Baidu

    Read the full Environment 360 article here

