Data Center Knowledge | News and analysis for the data center industry

Thursday, July 17th, 2014
Apple Data Center Energy Use Grows but Remains 100 Percent Renewable

Earlier this month, a few days after news came out that Apple was going to build its third massive solar farm in North Carolina, the company released its latest environmental responsibility report, which covers its fiscal year 2013 (October 2012 through September 2013).
The report includes a lot of information on energy use by the company’s data centers and their sources of energy. Notably, while its data center capacity and power consumption grew tremendously year over year, Apple invested in enough renewable energy to continue claiming that its data centers are powered entirely by renewables.
Data center energy consumption is something the public has become somewhat more aware of since 2011, when Greenpeace started its campaign to pressure well-known high-tech brands to do something about the preponderance of coal energy in the power supply of their massive and growing data centers. Greenpeace included Apple, along with companies like Amazon, Facebook, Google, Microsoft and Twitter, among others, in its data center energy scorecards, which it has been releasing annually since.
Apple went from receiving very poor grades for energy transparency, carbon mitigation and site selection on the first 2011 report to becoming one of the three leaders who, according to Greenpeace, are “helping to build the green Internet” on this year’s scorecard. The other two leaders are Facebook and Google.
Data centers become Apple’s largest energy sink
Fiscal 2013 was the first year Apple’s data center energy use surpassed energy use of the company’s other corporate facilities. Total energy consumed by corporate facilities was around 230,000 megawatt-hours in fiscal 2012 and stayed at the same level in 2013, while data center energy use went from about 220,000 megawatt-hours in 2012 to more than 300,000 megawatt-hours in 2013.
The company substantially ramped up its Prineville, Oregon, data center in 2013, brought online some initial capacity at its newest data center in Reno, Nevada, and increased energy use at the Newark, California, data center. The biggest increase in energy use that year, however, was at the company’s Maiden, North Carolina, site, which consumed about 60,000 megawatt-hours of electricity more than it did in fiscal 2012.
Maiden, Prineville, Newark and Reno are the four data centers Apple has built, owns and operates. The company also leases space from data center providers in numerous locations but does not disclose names of the providers, the locations or the amount of power it consumes there.
Its environmental responsibility reports focus on its own facilities, since those are the only facilities whose power sources it has total control of.
The fine print
To provide reassurance that statements in its sustainability reports are correct, Apple has been using Bureau Veritas North America as an independent reviewer of the information. The bureau is a global testing, inspection and certification services company headquartered in France.
Apple included a statement from Bureau Veritas in its report, confirming that the company’s reported energy consumption, renewable energy credits and greenhouse gas emissions for fiscal 2013 were accurate.
Making renewable ends meet in Maiden
Apple gets to claim that its largest data center, the one in Maiden, is fully powered by renewable energy because it has built a massive biogas-powered fuel-cell plant and two solar farms there. The project to build the third one, as mentioned above, is already underway. The site generates nearly 170,000 megawatt-hours of renewable energy a year for the local grid, exceeding the 162,000 megawatt-hours the data center consumed in fiscal 2013.
Each of the photovoltaic arrays in Maiden generates about one-quarter of the total, while the fuel cells (supplied by Bloom Energy) generate about half. The company did buy some renewable energy (five percent of the total) from the grid in 2013 to make sure it was fully covered while some of the generation capacity was under construction.
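For readers who want to follow the arithmetic, here is a rough back-of-the-envelope check of the Maiden figures in Python. The numbers are the approximate values from the report; the exact generation split is an assumption based on the "about one-quarter each" and "about half" description above, and the five-percent grid purchase is assumed to be measured against the data center's consumption.

```python
# Back-of-the-envelope check of the Maiden figures reported above.
# Assumptions: generation split per the "about one-quarter each" / "about half"
# description; 5 percent grid purchase measured against consumption.
annual_consumption_mwh = 162_000      # data center consumption, fiscal 2013
onsite_generation_mwh = 167_000       # "nearly 170,000 MWh" fed to the local grid

solar_share_each = 0.25               # each of the two existing solar farms
fuel_cell_share = 0.50                # Bloom Energy biogas fuel cells
onsite_mix = 2 * solar_share_each + fuel_cell_share

purchased_mwh = 0.05 * annual_consumption_mwh   # renewables bought from the grid

print(f"on-site surplus: {onsite_generation_mwh - annual_consumption_mwh:,} MWh")
print(f"share of on-site output accounted for: {onsite_mix:.0%}")
print(f"approximate grid purchases: {purchased_mwh:,.0f} MWh")
```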
Buy it till you make it
In Prineville, Apple simply bought enough renewable energy to power the facility practically in its entirety in 2013. This will not be the case in the future, since the company is building a “micro-hydro” generation system there, which will use local irrigation canals to generate energy for the data center. The system is slated to come online sometime this calendar year.
Apple’s newest data center in Reno started operating in December 2012. In fiscal 2013 it consumed 3,000 megawatt-hours of electricity. The company bought locally generated geothermal energy to power the facility. It is building a 20-megawatt solar farm together with the local utility – a project it expects to finish by early 2015. Apple said it will continue buying geothermal energy until the array comes online.
The company also buys wind energy from the grid to power its Newark facility. It is able to buy wholesale renewable electricity in California through the state’s Direct Access program.
Cleaning up the colos
Being 100-percent renewable gets trickier when using colocation data centers, which Apple does not own or operate and most likely shares with other tenants. The company said the vast majority of its online services are served from its own data centers, yet it continues to work with its colocation providers to clean up its energy load in their facilities.
Apple said about 70 percent of the power it consumes in colocation facilities has been renewable since early 2013. “And we won’t stop until we get to 100 percent,” the report read.
T5 Data Centers Starts Facilities Management Services Company

Enterprise-class wholesale data center operator T5 Data Centers is launching a facilities management company called T5FM. The subsidiary will offer third-party facilities management services to data center operators across North America.
T5 operates several data centers in the U.S. and is offering its domain-specific expertise, including on-site talent and training, to other data center owners and users.
The executive management team of T5FM has more than 100 years’ combined experience managing a variety of data center types. It will assist customers in making critical operational decisions and develop policies and procedures to mitigate risk and eliminate operational stress.
The four pillars of the program are safety, training, process and procedures, and customer communications.
- T5FM has a custom electrical safety program that follows NFPA 70 guidelines. NFPA 70 acts as the benchmark for safe electrical design, installation and inspection to protect both people and property. All T5FM technicians are OSHA (Occupational Safety and Health Administration) certified, trained on the aforementioned NFPA 70 and certified in first aid, CPR and the use of automated external defibrillators (AED).
- A customized facility management training program has been developed, complete with textbook, portfolio training classes and online tests. The program trains staff across mechanical, electrical, plumbing and fire protection disciplines. The company says training is modified to suit each location.
- The process and procedures are drawn from T5’s portfolio policies for quality control. T5FM has a baseline for all site-specific methods of procedure (MOPs), standard operating procedures (SOPs) and emergency operating procedures.
- T5FM goes beyond facilities operations, training and safety to help with customer communications. Streamlined communications strategies are built to suit each customer’s needs. The program offers visibility into T5’s computerized maintenance management systems so customers can track preventative maintenance, alarms and trouble ticketing.
“Creating our own data center facilities management company was a logical next step for T5 Data Centers,” said Mike Casey, chief operating officer of T5. “Unlike other facilities managers, we don’t operate apartment buildings or office buildings or other property types — we only operate data centers. That’s our core competency, and we want our customers to benefit from our extensive experience managing T5’s own enterprise grade wholesale data centers.”
Avere Nabs $20M to Grow Hybrid Cloud Storage Solutions

To accelerate growth of its hybrid cloud storage business, Avere Systems announced it has raised $20 million in a Series D funding round led by Western Digital Capital. Avere will use the funds to continue enabling enterprise adoption of its hybrid cloud storage solutions, as well as accelerate sales and marketing. This funding round brings the total amount invested in the company to $72 million. Previous investors Lightspeed Venture Partners, Menlo Ventures, Norwest Venture Partners and Tenaya Capital also participated in the latest round.
“The reality for 99 percent of enterprises is they will operate increasingly in a hybrid IT storage environment for many years to come. This means that no single storage technology will win, and both on-premises and cloud storage will be required to achieve cost and performance goals,” said Ron Bianchini, president and CEO of Avere. “With this Series D funding round, we continue to scale the company so that we can provide the best solutions to customers embarking on their hybrid cloud path.”
Cloud storage has recently reached performance parity with traditional on-premises systems, giving companies yet another way to enable hybrid cloud storage across storage types, locations, budgets and vendors. Bianchini credits three things with enabling the new hybrid cloud storage world: enterprise NAS performance in the cloud, zero-pain migrations and effortless provisioning.
“Avere Systems is ushering in the new era of enterprise-class hybrid storage, which is a boon to organizations that struggle with the cost and performance of managing on-premises and cloud file locations, while also deciding whether to use hard disk or solid state drive technology,” said George Crump, president and founder of Storage Switzerland. “Avere’s technology enables its customers to select and provision storage options instantly, and this capability will drive new operational efficiencies and competitive advantages.”
In business for six years now, Avere claims to have large global enterprises as customers. It hosts digital downloads for the Library of Congress and is well known in the visual effects industry, where it says its products were used in video rendering for each of the top 12 blockbuster movies in 2013. Avere recently noted that Australia-based Rising Sun Pictures, a visual effects specialist whose credits include Harry Potter and the Deathly Hallows and Gravity, is using Avere FXT Series Edge filers to render its movies.
Pythian Buys Blackbird.io to Get a Piece of the DevOps Action

Consulting and managed services company Pythian has made its first acquisition since its founding in 1997: Blackbird.io, formerly PalominoDB and DriveDev, a DevOps and data managed services provider. While the undisclosed price tag reportedly isn’t a huge one by tech standards, founder and executive chairman Paul Vallee says it’s huge in terms of the magnitude of the shift he is seeing in DevOps.
The acquisition is part of Pythian’s strategy to leverage the rapid increase in demand for agile development, operations and technology delivered as a service. DevOps stresses communication between software developers and IT operations. It’s a combination of the two into a role that encompasses both.
Pythian does well with companies whose cost of operations is low relative to the revenue opportunity. Vallee has a long track record of identifying needs and capitalizing on market shifts, and he sees DevOps as the biggest game changer.
Pythian acquired Blackbird.io to embrace the DevOps capabilities customers are now demanding. “Production and operations expertise is moving into R&D organizations,” said Vallee. “Traditional outsourcing will be challenged by DevOps.” Blackbird.io had been doing DevOps as part of the DriveDev business.
Pythian is one of those great tech stories that start off with a kid in a dorm room messing around with a Unix cluster for fun. When that kid turned 29 in 1997, after some years working in the industry, he decided to start a company called Pythian. Its first target market was New York, but it has since expanded globally. Pythian is now a global provider of data management consulting and services, more than 300 people strong, employing top infrastructure talent such as Oracle ACEs and math PhDs in its data science group. The company has never grown through acquisition, according to Vallee, but Blackbird.io was too attractive a proposition.
As a result of the acquisition, Pythian will become one of the largest, if not the largest, open source database managed services companies. That also makes it one of the few independent vendors, one without a database distribution of its own, whose professional services capability sits at the top of the list.
“Traditional outsourcing models no longer apply,” said Vallee. “DevOps is going to change things very quickly. The day I announced the deal, a multi-million dollar customer called me up and said he’s investing heavily in DevOps. This demand was latent, invisible, until we actually did it. The second we made this announcement, a major sports equipment retailer reached out and asked for a briefing.”
Vallee is remarkably candid about the initial rationale for looking at Blackbird.io. “About six months ago, one of my prouder customers left our services in favor of [Blackbird.io]. I dug deeper and discovered that they were paying twice as much as we were charging. What were they doing differently? Turns out it was DevOps. We were perceived as misaligned.” Pythian doesn’t want to fight the tide with legacy-style business, but rather embrace the change that Vallee says will challenge traditional outsourced models.
Pythian has a huge presence in the retail space, serving most of the top brick-and-mortar vendors online today. Vallee named several high-profile customers beyond the publicly disclosed ones. The company publicly touts clients such as Beats Music, Linkshare, Fresh Direct, Sonos, American Apparel and more. More than 1,000 companies have turned to Pythian for support with Oracle, MySQL, Microsoft SQL Server, Big Data and enterprise infrastructures.
“Today, the average large customer has hundreds of systems and they’re kind of random,” said Vallee. “The SaaS industry started with this DevOps idea. Netflix is the same app on 500 or 5,000 or however many machines. Our verticals, retail and SaaS, are way more interested than healthcare and financial services at the moment. We need to be enablers for our customers to execute on our strategies. DevOps wasn’t strategic two years ago. Now it is.”
“The acquisition is transformative,” he continued. “DevOps is about moving deeper into Platform-as-a-Service and Infrastructure-as-a-Service, and building in top operations expertise early in the development cycle and as part of your architecture. The quantity of people is fewer, and they’re now about the automation stacks. This transforms my entire industry.”
Data Center Knowledge Announces Collaboration with Data Center World Orlando

West Chester, Ohio – Data Center World, the premier conference and expo for the data center industry, and Data Center Knowledge, the industry’s leading online source for data center news and analysis, today jointly announced a new collaborative effort.
Data Center Knowledge will assist Data Center World in developing the most up-to-date educational program available during a critical period of transformative change within the industry. One such area of change is the convergence of the traditional enterprise data center with third-party data centers and cloud providers – a topic where Data Center Knowledge has deep expertise. This trend toward the hybrid data center is one of the many strategic changes requiring today’s data center professional to learn new skills.
Under the effort, Data Center Knowledge’s editorial team is developing a Trends Track for the upcoming Data Center World Orlando conference.
“This year’s Data Center World program will be better than ever with the help of the editorial staff at Data Center Knowledge,” said AFCOM President Tom Roberts. “By adding the collective experience of Yevgeniy Sverdlik, Editor in Chief, Rich Miller, Editor At Large, and Jason Verge, Industry Analyst, to help build the program, we are able to harness the knowledge of the industry’s leading news source to ensure our attendees are presented the most important and relevant trends within the data center industry today.”
Yevgeniy, Rich and Jason will work collaboratively with Tom to develop the Trends Track for the upcoming conference using their expertise to set specific session direction, bring in new speakers to the program and help moderate and facilitate conversations at the conference about the future of the industry. Topics will include utilizing managed services, the Open Compute Project, software defined data centers, the next-generation NOC, Internet governance and globalization, Big Data and power system innovations.
With Data Center Knowledge on board, attendees to the upcoming Data Center World Orlando conference will continue to receive the industry’s best real-world education available today. The enhancement of the Trends Track adds to an already strong program that will feature more education and networking hours, the best rated sessions and speakers from the previous Data Center World Global event and an entirely new track focused on practical managerial and leadership skills for today’s data center professional.
You can learn more about the upcoming Data Center World Orlando conference by visiting www.datacenterworld.com.
PredictionIO Gets $2.5M for Open Source Machine Learning

PredictionIO has quietly raised $2.5 million in funding to make machine learning accessible to developers. Dubbing itself “the MySQL of prediction,” the company says its Open Source Machine Learning Server already powers hundreds of applications and is used by more than 4,000 developers.
The server lets programmers and developers build smarter applications with just a few lines of code. Building machine learning into products is expensive and time-consuming, and larger companies like Google, Amazon and LinkedIn all have sizeable teams of data scientists. Tools for building machine learning deployments, such as R, Hadoop, Spark, Mahout and scikit-learn, are available today, but for most companies assembling that stack in-house makes no more sense than building a database server in-house. PredictionIO says you don’t have to build it from scratch.
Its differentiators are being developer-friendly and open source, as opposed to the closed, black-box machine learning services and software on the market. Developers don’t like black-box solutions, which is partly why offerings like Hadoop and Docker have grown popular.
PredictionIO gives away a free machine learning server for developers to destroy and redefine as they learn. Developers can download and install PredictionIO CLOUD from the AWS Marketplace and experiment. App event data is streamed into PredictionIO through REST APIs or with a few lines of code using Software Development Kits (SDKs). Prediction engines can be created using the PredictionIO User Interface, with prediction results retrievable through REST API calls.
[Screenshot: PredictionIO’s User Interface]
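As an illustration of that event-in, prediction-out flow, here is a minimal Python sketch using the requests library. It is not PredictionIO’s documented API; the server address, endpoint paths, field names and app key are hypothetical placeholders standing in for the REST calls and SDK helpers described above.

```python
# Illustrative sketch of the flow described above: stream an app event in, then
# ask a prediction engine for results over REST. NOT PredictionIO's documented
# API; server address, endpoint paths, field names and app key are hypothetical.
import requests

SERVER = "http://localhost:8000"   # assumed address of a local PredictionIO server
APP_KEY = "YOUR_APP_KEY"           # hypothetical per-application credential

# 1. Record a user action, e.g. user "u1" viewed item "i42".
requests.post(
    f"{SERVER}/events",
    json={"appkey": APP_KEY, "user": "u1", "item": "i42", "action": "view"},
)

# 2. Ask a recommendation engine (created beforehand in the web UI) for the
#    top five items for that user.
resp = requests.get(
    f"{SERVER}/engines/itemrec/recommended",
    params={"appkey": APP_KEY, "user": "u1", "n": 5},
)
print(resp.json())   # e.g. a ranked list of item IDs
```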
The company was started by young serial entrepreneurs, an ex-Googler and alums of Stanford, UC Berkeley and University College London (UCL). Co-founder and CEO Simon Chan is a PhD candidate at UCL and has founded three startups in the past.
It is drawing interest from individual developers to international enterprises in verticals such as e-commerce, fashion recommendation, retail, food delivery, video portals, app stores and news media. Some examples are Le Tote, which is using PredictionIO to predict customers’ fashion preferences, and PerkHub, an enterprise Software-as-a-Service company that powers perks and group buying programs within large enterprises. PerkHub is using PredictionIO to personalize product recommendations in its weekly emails.
Investors include Azure Capital (investor in KISSmetrics, VMware and others), QuestVP, CrunchFund, the Stanford-StartX Fund, Kima Ventures, IronFire, Sood Ventures and XG Ventures. PredictionIO is part of the Mozilla WebFWD, 500 Startups and Stanford StartX founder communities.
CenturyLink’s Minnesota Data Center Gets Uptime Tier III Certification

CenturyLink received Uptime Institute Tier III Certification of Constructed Facility for its newest data center in the Minneapolis-St. Paul region. MP2 is the first data center in Minnesota to receive this award across design, construction and commissioning.
To achieve the certification, a data center requires redundant capacity components and multiple independent distribution paths serving the critical environment, among other requirements. Uptime verifies and issues certifications for design documents separately from finished data centers. Tier Certification from Uptime Institute assures businesses that all elements of the critical infrastructure of a data center are capable of enterprise-grade performance.
CenturyLink has a strategic global commitment to the Tier certification process. It achieved Tier III Certification of Constructed Facility for the second phase of its OC2 data center in Irvine, California, the first facility in the Southern California colocation market to hold Uptime Institute Tier Certification of both design and constructed facility. The company is also working toward Tier III Certification of Constructed Facility for colocation data centers in Elk Grove, Illinois, Markham, Ontario, and Hong Kong by the end of 2014. The Hong Kong data center, delivered jointly with Digital Realty Trust, would be the first data center to receive that award in its region.
The Minnesota data center was launched in May. It was delivered with technology partner and developer Compass Datacenters. Compass is a strong proponent of Tier Certification, with CEO Chris Crosby often highlighting its importance. The developer uses one design for all data centers it builds for customers, and that design has been certified as Tier III.
Located in Shakopee, Minnesota, a Minneapolis suburb, the data center is built to support up to six megawatts and 100,000 square feet of raised floor, the initial phase being 1.2 megawatts on 13,000 square feet, which is a standard capacity of a single Compass data center pod.
“Tier Certification of Design Documents is not easy, but building to that design—and certifying that build—is a key confidence factor for CenturyLink itself and its client base,” said Julian Kudritzki, chief operating officer at Uptime Institute. “When a longstanding colocation leader commits to Tier Certification, it endorses its relevance to the end-user community. The successful implementations of Minneapolis and California have us looking forward to the expanding relationship with CenturyLink, its project teams and technology partners around the world.”
There are two multi-tenant providers with Tier III design certifications in Minnesota: Stream in Chaska and TDS HMS in Eden Prairie. ViaWest is currently pursuing a Tier IV design certification in the state. Target operates two enterprise data centers that achieved Tier III for construction and operational sustainability.
Oracle Enables SQL Queries Across Hadoop, NoSQL and Oracle Database

Recognizing the needs of an evolving data management architecture in organizations, Oracle launched Big Data SQL software as a way to integrate a variety of data sources, including Hadoop, NoSQL and Oracle Database.
One option is a full-stack solution: an engineered system that combines Oracle’s Big Data Appliance with Big Data SQL, Cloudera’s Hadoop distribution and Oracle’s own NoSQL Database. At launch, Oracle Big Data SQL supports only Apache Hive and the Hadoop Distributed File System (HDFS). Other vendors have ported SQL relational databases to run on top of Hadoop.
Single, optimized SQL query for distributed data
The goal of creating this Big Data management system is to have one SQL query run across diverse data sources, enabling organizations to leverage existing skills and maintain enterprise-grade data security and governance for sensitive or regulated information. To help speed data analysis and distribution, Oracle says its unique architecture and the Smart Scan technology inherited from Oracle Exadata permit Oracle Big Data SQL to query all forms of structured and unstructured data while minimizing data movement.
This also extends Oracle Database security capabilities, including an organization’s existing security policies, to Hadoop and NoSQL data.
Oracle’s Dan McClary said the product has been in development for some time, and that it goes beyond the existing connectors Oracle offers for moving data between Hadoop, NoSQL and other platforms. He said Big Data SQL is co-resident with HDFS DataNodes and YARN NodeManagers, and that queries against the new external tables are sent to these services to ensure that reads are direct-path and data-local.
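To make the "one SQL query across diverse sources" idea concrete, here is a minimal Python sketch of what such a query can look like from an application, assuming a Hive-backed external table has already been defined in the Oracle database through Big Data SQL. The connection details and the customers/web_clicks_hdfs tables and columns are hypothetical placeholders, not taken from Oracle's documentation.

```python
# Minimal sketch: one SQL statement joins a regular Oracle table (customers) with
# data that physically lives in Hadoop, exposed through a hypothetical Big Data SQL
# external table (web_clicks_hdfs). Connection details and schema are placeholders.
import cx_Oracle

conn = cx_Oracle.connect("app_user", "app_password", "dbhost/orcl")
cur = conn.cursor()

# Per the article, scans and filters are pushed down to the Hadoop nodes so reads
# stay data-local and only matching rows move across the network.
cur.execute("""
    SELECT c.customer_id, c.name, COUNT(*) AS clicks
    FROM   customers c
    JOIN   web_clicks_hdfs w ON w.customer_id = c.customer_id
    WHERE  w.click_date >= DATE '2014-01-01'
    GROUP  BY c.customer_id, c.name
    ORDER  BY clicks DESC
""")

for customer_id, name, clicks in cur.fetchmany(10):
    print(customer_id, name, clicks)

cur.close()
conn.close()
```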
Cloudera founder, chairman and chief strategy officer Mike Olson said running Cloudera’s software suite on Oracle’s Big Data Appliance was “more cost-effective and quicker to deploy than a DIY cluster. When it comes to querying data in Hadoop, we’ve seen overwhelming demand from customers for SQL.
“This is why Cloudera has developed Impala—which Oracle includes on Oracle Big Data Appliance—to enable customers to query data with SQL natively and efficiently in Hadoop.”