Data Center Knowledge | News and analysis for the data center industry
Tuesday, January 13th, 2015
12:46a
C7 Sues Bitcoin Mining Firm CoinTerra for Unpaid Colo Services
C7 Data Centers has filed a lawsuit against its customer CoinTerra, a provider of bitcoin mining services, alleging that CoinTerra has not been paying its bills for services it receives from the Bluffdale, Utah-based data center provider.
The lawsuit, in which the plaintiff is seeking to recover about $5.4 million in damages, comes at a difficult time for all players in the bitcoin ecosystem. The digital currency’s value has been falling steadily since hitting a high of nearly $1,200 at one point in 2014.
It is currently below $280, prompting one cloud mining service provider, CEX.io, to halt mining altogether, saying it was no longer profitable. The company's CIO, Jeffrey Smith, told CoinDesk that CEX.io will start mining again if the price exceeds $320.
CoinTerra manufactures and sells bitcoin mining hardware and hosts and manages mining hardware for customers as a service. A Redditor who claims to be a customer of CoinTerra's bitcoin mining services said in a Reddit post two days ago that the company did not send them their weekly payout.
The Redditor pasted a statement they received from CoinTerra’s legal department, saying the company had defaulted on debt, and that its lenders had liens on all of its assets, including servers.
CoinTerra has been a C7 customer since April 2014, using its colocation, hosting, monitoring, and management services. The 18-month contract is for services in three separate C7 data centers, according to a summary of the allegations in the lawsuit.
As of early December, CoinTerra’s past due balance was about $1.4 million, but C7 is seeking $5.4 million or more in damages, which includes total cost of services for the duration of the contract and other damages.
“Upon information and belief, CoinTerra chose to pay C7 less than the amounts due in bad faith,” the allegations read.
CoinTerra CEO Ravi Iyengar did not return a call seeking comment in time for publication. We’ll update this post once we hear back from him.
Power is the biggest cost of providing data center services. C7 said it has been paying more than $12,000 per day just to supply power to CoinTerra. C7’s latest invoice issued to the client was for about $430,000, and about $350,000 of the total was for electricity.
The data center provider also claims it has incurred losses by turning down other potential customers to accommodate CoinTerra’s needs.
C7 is not the only data center provider CoinTerra uses. CenturyLink Technology Solutions announced in July 2014 a big colocation deal with the bitcoin mining company. Iyengar told Data Center Knowledge at the time that the deal was for north of 10 megawatts of capacity.
A CenturyLink representative declined to comment, citing the confidentiality of the contractual agreement between the two companies.

1:00p
Demand in Secondary Data Center Markets Spurs Birth of New Players
A lot of building and leasing activity continues in core data center markets like Ashburn, Virginia; New York; and Santa Clara, California, but activity is also on the rise in smaller U.S. markets, in cities like Cleveland; Tampa Bay, Florida; and Pittsburgh, Pennsylvania.
A sizable enterprise base looking to operate fewer on-premises data centers, combined with content and cloud providers' desire to host content and infrastructure close to users, means unprecedented activity in cities not traditionally known for their data center industry.
Several providers that focus on finding these non-core markets where demand outpaces supply have emerged. They are using different strategies to tackle the opportunity.
As more applications and services are being outsourced to the Web, data centers are being built closer to end users, Gillis Cashman, managing partner at M/C Partners, explained. The private equity firm recently invested $50 million in Involta, a data center provider that has strong market positions serving enterprise customers in places like Akron, Ohio.
“In smaller markets, people have built their own data centers,” Cashman said. “They are coming to the end of life, and there is a new wave of demand. [In] many of these smaller markets, the requirements are exactly the same as Tier I markets. The interesting dynamic is the lack of quality, highly redundant enterprise-grade infrastructure.”
A Switch & Data Legacy in Secondary Markets
vXchnge is another company that focuses on high-growth markets. The company exists thanks in part to the acquisition of Switch & Data by Equinix in 2009. Several Switch & Data executives joined a number of others to form the company. Switch & Data operated in many secondary markets.
“Knowing where we operated, there’s a lot of obsolescence out there,” said Ernie Sampera, vXchnge chief marketing officer. Sampera was formerly with Switch & Data, and believes high-density needs are particularly underserved. “There are a lot of legacy data centers that can’t accommodate [high-density infrastructure], and it’s too expensive to build new or retrofit.”
Many of the former Switch & Data facilities ended up in the hands of another player now working in Tier II markets. In 2012, 365 Main co-founders Chris Dolan and James McGrath resurrected the brand by acquiring 16 data centers from Equinix, seeing an opportunity to create a “national player with a local focus.”
John Scanlon, 365 CEO, sees a big opportunity in markets not traditionally known for data centers. 365 looks for underserved cities and locates in carrier-rich convergence points rather than building new facilities. He cites demographics and ongoing investment in backbone infrastructure as two reasons for these markets’ potential.
“We like that downtown data center that is highly interconnected with adequate battery and backup – downtown sits on the ‘battle grids’,” said Scanlon. “These cities have gone through a lot and they persevered.”
He said the trend was a rebirth of Tier II cities. “There’s a demographic trend occurring in these cities. There are millennials who want to live in or around where they work, so there’s a lot of construction around the cities. There needs to be access to broadband, to have the consumption and use of technology.”
Another example is ByteGrid, a wholesale player that goes into markets where it believes there's healthy demand but low supply or a complete lack of wholesale options. The company said it sold out the first phase of its Cleveland facility before it even opened.
“Our business model is predicated on the fact that there’s demand and opportunity in secondary markets,” said Ken Parent, CEO of ByteGrid. “A lot of people think that everything is happening in primary markets.”
Moving Up the Stack in Growth Markets
A lot of business in secondary markets comes from small and medium-size businesses, and many providers are finding that these customers need a variety of services in addition to space and power. This is what was behind 365’s introduction of cloud storage services, and ByteGrid’s acquisition of managed hosting provider NetRiver.
“These markets have the best of both worlds,” said Parent. “In Cleveland, there are good large enterprises but you also have a lot of smaller customers. With more of the SMB market, you need to deliver a product that has services atop. It’s a different kind of customer and you need to adapt your business model. It’s why we started moving up the stack and looking at managed services as acquisition targets.”
Living on the Edge
Bringing content and cloud services closer to customers is an important driver of data center growth in Tier II markets. Instead of having 50 servers in one data center in the middle of nowhere, Cashman said, there are now 50 servers in 50 data centers very close to the end user.
Key inhibitors of cloud computing are application performance and security, he said. “It’s not a broad statement that applies to every app, but for the most part, proximity to the data center is increasingly more important.”
“A movement to the edge coupled with increasing metro investment in infrastructure makes these areas promising data center markets,” said Sampera. “There's good connectivity, 4G and LTE investment, good Wi-Fi. You'll see the secondary, growth markets growing. There's enough population density. Why would you long-haul from Reston when you can implement in a metro?”

4:30p
Preparing for the Data Center of the Future
Steve Carlini is Senior Director of Data Center Global Solutions and Soeren Jensen is Vice President of Software and Managed Services for Schneider Electric.
Data center operators are under more pressure than ever to provide the fastest, most reliable data possible while balancing demands for higher computing power and efficiency. Meanwhile, their use of virtualization, cloud architectures, and security techniques, as well as software-defined networking and storage has given rise to increasingly complex environments.
Given this already challenging environment, the term “future-proofing” is often chalked up to vendor-speak, meant to scare customers into buying oversized equipment for just-in-case scenarios that may actually never happen.
However, future-proofing, or the attempt to anticipate future demands, is an important element of data center management and planning. New IT devices are coming to market at record volume and pace, with unforeseen capabilities and ever-growing appetites for data, making it seemingly impossible to anticipate what the future will bring and how it will affect the data center.
Future-Proof Without the Cost
There are ways to future-proof the data center without having to make costly investments, while still ensuring that IT infrastructure can adapt and change over time to meet evolving business needs – even in a rapidly changing, unpredictable landscape.
Through data center infrastructure management (DCIM) software and prefabricated, modular infrastructure, data center operators can build adaptability and flexibility into existing and new facilities. They can anticipate and respond to current and future needs by:
- Accounting for increasing demands for processing power and storage capacity
- Applying a more sophisticated level of monitoring, analysis and management
- Enabling system management integration between facilities and IT
- Providing smart energy management and increased control capabilities
This allows data centers to meet evolving company needs, future technologies, and new environmental factors, while extending the service life of existing infrastructure.
Taking Advantage of Prefabricated Data Centers
In January 2014, we looked at the reasons why prefabricated, modular data center infrastructure can help owners and operators meet challenges related to traditional data center builds, such as having too many parties involved, the complexity of long-duration builds, quality and cost inconsistencies and incompatibility of equipment.
However, the biggest advantage of prefabricated, modular data center infrastructure in terms of future-proofing is its ability to scale capacity up or down easily. Not only does this reduce both upfront capital and ongoing operational expenses (CAPEX and OPEX), but owners and operators can quickly add power and cooling capacity to meet increasing demands and actual business needs.
Compare this to the traditional method of installing power and cooling infrastructure as part of the data center building and sizing the facility for its potential maximum future needs: an almost impossible (and costly) task that uses up valuable real estate, increases utility bills, and decreases efficiency, without truly guaranteeing that estimates will ever match actual requirements.
What if facility power, power distribution, cooling, and IT physical infrastructure were not built into the building, but instead were prefabricated building blocks that could be deployed and changed as needed throughout the life cycle of the data center? Owners and operators could deploy prefabricated IT building blocks and raise density or availability levels by adding matching power and cooling building blocks, or even swap AC-powered building blocks for DC-powered ones, which may become the standard in the future.
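To make the sizing trade-off concrete, here is a minimal sketch in Python with entirely hypothetical figures (the module size, demand ramp, and cost per megawatt are invented for illustration, not Schneider Electric numbers), comparing the up-front capital of a build-to-maximum facility against prefabricated modules deployed only as demand materializes:

```python
# Hypothetical illustration: capital outlay for a day-one full build vs.
# phased prefabricated modules. All figures are invented for this sketch.

FULL_BUILD_MW = 10          # capacity installed up front in the traditional model
MODULE_MW = 2               # capacity added per prefabricated building block
COST_PER_MW = 7_000_000     # assumed installed cost per megawatt (illustrative)

def traditional_capex() -> float:
    """Entire maximum capacity is bought in year zero."""
    return FULL_BUILD_MW * COST_PER_MW

def modular_capex(demand_mw_by_year: list[float]) -> list[float]:
    """Deploy just enough modules each year to stay ahead of demand;
    returns cumulative installed cost at the end of each year."""
    spend, modules = [], 0
    for demand in demand_mw_by_year:
        while modules * MODULE_MW < demand:
            modules += 1
        spend.append(modules * MODULE_MW * COST_PER_MW)
    return spend

demand = [2, 3.5, 5, 6.5, 8]  # hypothetical demand ramp in MW
print(f"traditional, year 0: ${traditional_capex():,.0f}")
print("modular, cumulative by year:",
      [f"${s:,.0f}" for s in modular_capex(demand)])
```

Under these invented numbers the modular approach never buys the last megawatts the traditional build paid for on day one; the point is the shape of the spending curve, not the specific dollar amounts.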
Monitor and Manage the Data Center of the Future
Intelligent, informed data center planning with an eye toward future needs can help owners and operators avoid being caught off-guard by unanticipated changes within the IT environment or a shifting business landscape. Planning is most effective when decisions are informed by historical and real-time data collected from IT and facility systems; DCIM solutions turn that data into actionable insight.
The ability to use DCIM data as a benchmarking tool in the planning process is perhaps the most effective method for preparing for future data center needs. This is because DCIM solutions can aggregate data collected from both the physical building structure and IT equipment within the facility – from the building down to the server level – not only bridging the all-too-common gap between facilities and IT, but also allowing owners and operators to identify trends, develop action plans and prepare for potential problems or needs down the line.
DCIM solutions also show how adding, moving or changing physical equipment can affect operations, thereby providing accurate insight for common planning questions and optimizing existing infrastructure capacities.
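As a toy illustration of the kind of trend analysis described above, the sketch below fits a linear trend to monthly facility power readings and projects when installed capacity would be exhausted. The readings and the capacity ceiling are hypothetical, and the code does not reflect any particular DCIM product's API:

```python
# Minimal sketch of DCIM-style trend analysis: fit a linear trend to monthly
# power readings and project when the facility hits its capacity ceiling.

def fit_line(ys: list[float]) -> tuple[float, float]:
    """Ordinary least squares over evenly spaced samples; returns (slope, intercept)."""
    n = len(ys)
    xs = range(n)
    x_mean, y_mean = (n - 1) / 2, sum(ys) / n
    slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) / sum(
        (x - x_mean) ** 2 for x in xs)
    return slope, y_mean - slope * x_mean

monthly_kw = [620, 640, 655, 680, 700, 730]  # hypothetical facility power draw
capacity_kw = 900                            # hypothetical installed capacity

slope, intercept = fit_line(monthly_kw)
months_left = (capacity_kw - intercept) / slope - (len(monthly_kw) - 1)
print(f"trend: +{slope:.1f} kW/month; capacity reached in ~{months_left:.0f} months")
```

A real DCIM deployment would feed this kind of projection with thousands of sensor streams and seasonality-aware models, but even a straight-line fit shows how collected data becomes a planning signal.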
By increasing data center flexibility through prefabricated, modular infrastructure and DCIM software, data center operators can transition their facility from a cost center to a business driver, enabling organizations to better mitigate risk and prepare for the future.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

6:30p
MongoDB Raises $80M to Challenge Relational Database Tech
MongoDB has secured about $80 million in new funding, bringing the total the company has raised from investors to $311 million. The NoSQL database startup's previous funding round of $150 million came in October 2013.
MongoDB touts a strong community of developers and supporters and is considered a leader in the space. The company recently acquired open source storage engine provider WiredTiger to improve performance. The acquisition reportedly spurred the round, as the company initially looked to raise just enough to cover the purchase.
MongoDB competitors include Couchbase and DataStax, among others. Couchbase recently raised $60 million and noted solid momentum and partnerships. Another notable NoSQL database startup is Basho, which announced a $25 million funding round just this morning.
These companies have built their offerings around open source software, making money by providing enterprise support and features. MongoDB added a paid support level for the free version of the database in the summer.
MongoDB has over 2,000 customers and has acted as the foundation for companies like Compose, which built its initial offering around MongoDB. In 2013, Rackspace acquired ObjectRocket, a Database-as-a-Service company that uses MongoDB.
The relational database market may be in trouble: open source NoSQL databases are the fastest-growing category, according to DB-Engines rankings.
DB-Engines named MongoDB the NoSQL database of the year. Gartner named MongoDB the only “Challenger” in its Magic Quadrant for operational database management systems.
“The market has reached a tipping point where most developers and IT organizations realize that modern applications cannot continue to be built on relational database technologies,” Dev Ittycheria, president and CEO of MongoDB, said in a statement. “They are shifting to MongoDB in a big way. MongoDB was designed to make it easy to develop applications that require rapid change, massive scale, always-on operation, and support for a large variety of unstructured and semi-structured data, all at significantly lower costs.”
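As a small illustration of the schema flexibility Ittycheria describes, the sketch below stores two differently shaped documents in one collection using the PyMongo driver. It assumes a locally running MongoDB server, and the database, collection, and field names are made up for the example:

```python
# Minimal sketch of MongoDB's flexible document model via PyMongo.
# Assumes a mongod listening on localhost; all names are illustrative.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
events = client.demo_db.events  # database and collection are created lazily

# Documents in one collection need not share a fixed schema: new fields can
# appear without a migration, unlike an ALTER TABLE in a relational database.
events.insert_one({"user": "alice", "action": "login"})
events.insert_one({"user": "bob", "action": "purchase",
                   "items": [{"sku": "A12", "qty": 2}], "total": 19.98})

print(events.count_documents({"action": "purchase"}))
```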
Venture capital continues to find its way to the database startup world. A sovereign wealth fund led the round, with participation from Goldman Sachs and from existing investors Altimeter Capital, NEA, Sequoia, and funds managed by T. Rowe Price Associates.

8:00p
China and US Top Sources of Attack Traffic: Akamai Report
This article originally appeared at The WHIR
Global broadband adoption reached 60 percent of all connections for the first time in the third quarter of 2014, according to Akamai’s latest State of the Internet report. The average connection speed remained above the broadband threshold for the second consecutive quarter, despite modest decreases in overall average and peak average speeds.
The report outlines the continuation of several trends in broadband adoption, connection speed, and attack traffic. While there were significant regional differences within those trends, the general patterns continued: increases in broadband and “high broadband” adoption and in “4K readiness,” as well as China's major lead in attack traffic.
China remains the top source of attack traffic, at 50 percent, but attack traffic observed from Indonesia fell from 15 percent in Q2 to 1.9 percent. The US continued to be the second most common source of attack traffic.
Akamai customers reported 270 DDoS attacks in Q3, the same number as the previous quarter and a 4 percent reduction from a year ago, despite a 25 percent increase in attacks in the Asia Pacific region.
All but nine of the countries measured had increased broadband adoption rates, and several countries not traditionally associated with fast speeds made major strides in connectivity in Q3 2014. Broadband adoption in Indonesia increased by over 1,800 percent to 35 percent, while Slovakia joined South Korea with a “high broadband” mobile connection speed (over 10 Mbps).
“One need only look to the sheer number of connected device- and smart home-related announcements that came out of the 2015 International CES to see that consumers are continuing to adopt and expect more from connected technology and services,” said David Belson, editor of the report. “The strong year-over-year growth trends illustrated in this quarter’s report show that the Internet is evolving and expanding to meet the growing demands of our increasingly connected lifestyles.”
The growing demands Belson refers to helped buoy a pair of strong stock market entries by cloud and big data service providers in December, amid speculation about a looming tech bubble.
That evolution should make things easier for hosts and other service providers, as faster connections and more consistent speeds between countries and regions make end-user experiences richer and more predictable.
This article originally appeared at: http://www.thewhir.com/web-hosting-news/china-us-top-sources-attack-traffic-akamai-report

8:09p
Report: 2014 Best VC Fundraising Year Since 2007
Last year was the strongest fundraising year U.S. venture capital investors have seen since 2007, indicating that many VC-backed companies have made successful exits in recent years.
Together, VCs raised $29.8 billion in 2014, a 69 percent improvement over 2013, according to a joint report by Thomson Reuters and the National Venture Capital Association. With 254 funds raised, the year was also the busiest for U.S. VC fundraising since 2001.
In a statement, Bobby Franklin, president and CEO of NVCA, said fundraising levels were starting to catch up with the level of investment of the last few quarters. “As a result of the strong exit market for venture-backed companies over the last couple of years, more money is being distributed back to investors who have chosen to redeploy that capital to the venture ecosystem,” he said.
Menlo Park, California-based VC firm Andreessen Horowitz raised the largest fund of the year: $1.7 billion. The firm invests in both consumer and enterprise technology startups.
Its enterprise portfolio includes numerous startups in the data center products and services space. These include Cumulus Networks, which has a Linux OS for bare metal network switches; DataGravity, a data-aware storage vendor; DigitalOcean, an Infrastructure-as-a-Service provider popular with developers; Mesosphere, a startup whose “data center OS” software simplifies management of disparate IT resources; and many more.
Andreessen Horowitz led a $10 million Series A round for Mesosphere in June.
Last year was a strong VC funding year for startups in the enterprise IT space, especially companies doing work in IT automation, big data, databases, OpenStack clouds, and application containers.
Docker, the leading app container startup, raised $40 million in a Series C round led by Sequoia Capital. A company called Stratoscale raised $32 million for its IT automation software that supports Docker and OpenStack.
Mirantis, one of the biggest OpenStack distribution providers, raised $100 million in a Series B led by Insight Venture Partners. A smaller company, Metacloud, which provides both on-premise and hosted OpenStack private clouds, raised $15 million.
DevOps-style IT automation software company Puppet Labs took in $40 million in additional venture capital. Its smaller rival, Moogsoft, raised $11.3 million.
It was a big year for fundraising for major players in the open source NoSQL database space. DataStax, a Cassandra startup, raised $106 million. Another NoSQL heavyweight, Couchbase, raked in a $60 million round.
Many more companies that deal in intelligent storage management, big data analytics, data warehousing, and cloud management announced new VC funding in 2014.
The year saw 48 follow-on VC funds and 27 new funds raised, according to the Thomson Reuters and NVCA report. The largest new fund raised in the fourth quarter was San Francisco-based Presidio Partners' inaugural fund, Presidio Partners 2014, which raised $140.4 million.

8:30p
US Military Social Media Accounts Hacked by ISIS Sympathizers Cyber Caliphate
This article originally appeared at The WHIR
The Twitter and YouTube accounts of US Central Command (CENTCOM) were hacked on Monday by Cyber Caliphate, a hacker group sympathetic to the Islamic terrorist organization ISIS. CENTCOM oversees US military operations in the Middle East.
Cyber Caliphate threatened additional attacks in the US after hacking two US news outlets a week ago. The group claimed to have already hacked the FBI at that time.
US Central Command posted to its Twitter account at 7:07 pm saying, “We're back! CENTCOM temporarily suspended its Twitter account after an act of cybervandalism. Read more: http://t.co/hiwvSp3uWt” According to the statement, US Central Command's Twitter and YouTube sites were compromised for about 30 minutes. Both accounts, hosted by commercial sites rather than military networks, were taken offline so the agency could investigate the incident.
Beginning at 12:29 pm EST, the group posted threatening messages, propaganda videos, and military documents, opening with the statement, “AMERICAN SOLDIERS, WE ARE COMING, WATCH YOUR BACK. ISIS.” under the hashtag #CyberCaliphate. Another tweet said, “ISIS is already here, we are in your PCs, in each military base.”
“Later tweets included images of what were apparently spreadsheets labeled as containing the contact info and home addresses of retired US army generals,” according to CNBC. “Other tweets claimed to include military plans from Pentagon networks. One such image showed a map of China with labels of different military assets. Another supposed Pentagon image featured a map of North Korea with labels for nuclear facilities.”
It's unclear what level of security measures were active on the CENTCOM social media accounts. Often, hacks such as those in the last year at JP Morgan, Kmart, Dairy Queen, Home Depot, Xbox, ICANN, and Sony are the result of simple security measures being ignored. For example, the hack at JP Morgan was due to two-factor authentication missing from one of its servers.
Two-factor authentication is a simple security measure that can thwart many attacks. A stolen password alone is not enough when this measure is in place, since a second one-time password is needed to gain access. Hackers stole an employee password at JP Morgan; had two-factor authentication been enabled on all of its servers, the breach could have been prevented.
Two-factor authentication is available on Twitter accounts and on YouTube through the Google sign-in process.
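For context, the one-time passwords behind such login verification schemes are typically generated with the TOTP algorithm (RFC 6238), which the server and the user's device each derive from a shared secret. Here is a minimal sketch using only the Python standard library; the shared secret shown is a common documentation example, not a real credential:

```python
# Minimal TOTP (RFC 6238) sketch: both sides derive the same short-lived code
# from a shared secret, so a stolen password alone is not enough to log in.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // period           # time steps since the epoch
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                     # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # example base32 secret; prints a 6-digit code
```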
CENTCOM said no military networks were compromised and that no classified information was posted. The agency notified the Department of Defense and law enforcement. The FBI said on Monday that it was working with the DoD to investigate.
“Let’s remember this is a social media account,” Peter Singer, a strategist and analyst with the New American Foundation in Washington, told the Washington Post on Monday. “This is not a military command and control network. This is not a network that moves classified or even non-classified internal information back and forth. Essentially what they did is for several minutes take control of the megaphone.”
A US Department of Defense official told NBC News “this is clearly embarrassing, but not a security threat.”
The Obama administration is “examining and investigating the extent of the incident,” White House Press Secretary Josh Earnest told reporters on Monday. “This is something we are obviously looking into and something we take seriously.” However, he made the distinction that the hacking of a Twitter account is much different than a large data breach.
This article originally appeared at: http://www.thewhir.com/web-hosting-news/us-military-social-media-accounts-hacked-isis-sympathizers-cyber-caliphate

10:54p
Which IaaS Provider is Cheapest? Which One Has More Configurations, Locations?
Cloud pricing comparison is often an exercise in comparing apples to oranges. Cloud Spectator, an Infrastructure-as-a-Service consulting and research firm, has published a cloud vendor report that attempts to take a nuanced look at the IaaS market and find each provider's advantage depending on use case.
Comparing on price alone is difficult and can be short-sighted. It’s easy to boil cloud down to pricing, given the regular price cut announcements by the big providers, but lots of other factors are involved.
Other recent efforts at benchmarking the IaaS market include 451 Research’s attempt to create a standard metric and a cloud pricing benchmark in November, and CloudHarmony’s report on cloud uptime, which said Amazon Web Services was one of the most reliable providers.
The Cloud Spectator report doesn't declare a single winner, but rather shows where each cloud holds an advantage in terms of contract length or instance size. It covers only 10 vendors and doesn't take performance into account, but it provides a good jumping-off point for customers, and for Cloud Spectator to delve deeper, which it said it plans to do.
“General pricing comparisons for cloud infrastructure are incredibly difficult because of the number of variables involved,” said Cloud Spectator analyst Anne Liu. “For example, different users will have different configuration requirements, IOPS usage, customer support needs, and desired data locations. Certain features may be included on some providers or cost extra on others. Users looking for a more personalized pricing comparison should take those factors into consideration.”
Cost Depends Greatly on Usage
The report found that DigitalOcean, a rapidly growing IaaS startup, Microsoft Azure, and IBM SoftLayer provided the lowest cloud pricing overall, but Amazon's EC2 shows cost advantages over longer terms when reserved instances are employed. SoftLayer is the least expensive for larger Windows-based offerings across terms of different lengths.
The most expensive block storage offerings are Rackspace's SSD and CenturyLink's Premium block storage; however, both are premium offerings focused on performance rather than cost. Azure block storage was the least expensive and employs a non-linear pricing model.
There are a few caveats that would potentially change the results. Pricing examined reflects only U.S. data centers. Four separate server configurations were used to compare and contrast each cloud. These configurations were treated as minimum requirements, so the picture gets foggier outside of those selected configurations.
Often, cost comes down to the pricing structure employed by a given cloud provider. Different providers offer everything from sub-hour usage pricing to three-year commitments. The report said Google, Microsoft, and Rackspace excelled in sub-hour subscriptions, while several other providers turned out to be more cost-effective with volume pricing and long-term discounts.
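As a toy example of why billing structure can matter more than the hourly sticker price, the sketch below uses invented rates (not any provider's actual pricing) to compare a steady three-year workload under on-demand versus reserved pricing, and a short job under sub-hour versus hour-rounded billing:

```python
# Hypothetical illustration of billing-structure effects. All rates invented.

HOURS_PER_YEAR = 8766  # average, accounting for leap years

def on_demand_cost(hourly_rate: float, hours: float) -> float:
    return hourly_rate * hours

def reserved_cost(upfront: float, hourly_rate: float, term_years: int) -> float:
    """Total cost of a reserved instance running continuously for the term."""
    return upfront + hourly_rate * HOURS_PER_YEAR * term_years

# A steady 24x7 workload over three years:
od = on_demand_cost(0.10, HOURS_PER_YEAR * 3)          # $0.10/hr on demand
rsv = reserved_cost(1000, 0.04, 3)                     # $1,000 up front, $0.04/hr
print(f"on-demand: ${od:,.0f}  reserved: ${rsv:,.0f}") # reserved wins here

# A 10-minute batch job: per-minute billing vs. rounding up to a full hour.
per_minute = 0.10 / 60 * 10
full_hour = 0.10
print(f"10-min job: ${per_minute:.4f} sub-hour vs. ${full_hour:.2f} rounded up")
```

The same instance can be cheapest or most expensive depending on whether the workload is steady or bursty, which is why the report evaluates providers across contract lengths and instance sizes rather than on a single rate.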
The Other Big Factors
The report also compares service level agreements and reimbursement, geographic availability, and configurability. Amazon, Microsoft, and Verizon are the only three providers available in all four regions (South America was the only region missing from three of the other clouds).
The bare metal offerings excel at configurability, with CenturyLink listed as the only fully configurable option.
Cloud pricing, feature sets, and data center locations are not static, and providers continue to retool billing models. AWS recently tweaked its reserved instance pricing model, and Google changed its sustained-use discount schema. Amazon's EC2 alone has changed prices 44 times since 2006, according to the report.
Cloud Spectator said it is now turning to performance and price performance analysis, with a report expected in a few months.
“Although price is extremely important when comparing providers, it is only part of the story,” said Liu. “Therefore, part two of this report series will incorporate performance to show the price-performance value for each instance. Since an application may require less infrastructure on high-performance providers, this can have a substantial impact on overall cost that cannot be captured with only a price comparison.”
Comparison of the big ten is available here (registration required).