Data Center Knowledge | News and analysis for the data center industry
Thursday, September 18th, 2014
12:00p
Open-IX: Netflix and Google’s Plan to Break Out of Equinix’s Gilded Cages
Before the early 2000s, when the Internet was still a Wild West, if you were an Internet service provider and wanted to peer (exchange traffic) with another service provider, you had to colocate your gear inside one of the Network Access Point facilities, or NAPs. Each of those facilities was controlled by a specific carrier, and if you wanted to colocate in one of them, you had to buy upstream connectivity services from that carrier.
Naturally, the ISPs were not very happy with the model, and companies like Equinix and CRG West (now CoreSite) saw that there was a need for carrier-neutral data centers. They started providing these data centers, establishing Internet exchanges inside them and providing exchange access as part of their service offerings.
It may have seemed like a great democratization of the Internet, but no one anticipated what the carrier-neutral data center providers did next, said Job Witteman, CEO and one of the founders of AMS-IX, the company that operates the Amsterdam Internet Exchange. “They were building kind of a fence around that area (the Internet exchanges), and they only offered that service to their own customer base and would not allow any outsiders in.”
These walled gardens grew and today are the largest Internet exchanges in the U.S. There is a handful of data center providers controlling all key traffic exchange points in the country, and of those, Equinix controls the most. The company solidified its position when it acquired Switch and Data in 2010, gaining control of more exchanges around the country.
Internet giants build alternative to walled gardens
Just as the early carrier-neutral data centers jumped at the opportunity to provide alternatives to carrier-controlled exchange points more than a decade ago, companies behind the Open-IX organization have created an alternative to today’s exchanges that are carrier-neutral but not data center provider-neutral.
Internet companies that move unimaginable volumes of traffic wanted alternatives to Equinix-controlled exchanges. Google, Netflix and Akamai, among others, founded the non-profit so they could have peering options beyond Equinix data centers, Witteman said, because if you want to peer through one of Equinix’s Internet exchanges, you have to be in one of Equinix’s data centers.
“If you’re inside Equinix and you can reach anyone who’s also inside Equinix over the exchange, sure you’re happy, but you cannot reach all the others that are in different data centers,” Witteman said. “For Netflix that means they have to pay upstream to get to customers who are not in the Equinix data center, or they have to build out in the second data center as well.”
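The economics Witteman describes can be framed as a simple break-even comparison: buy upstream transit to reach networks outside the exchange, or pay for a second deployment and peer there directly. The sketch below illustrates the trade-off; all prices, traffic volumes and function names are hypothetical placeholders, not actual Netflix, Equinix or market figures.

```python
# Hypothetical break-even sketch: pay for upstream transit to reach
# off-exchange networks, or deploy into a second data center and peer there.
# All rates are illustrative placeholders, not real market pricing.

def monthly_transit_cost(traffic_gbps, price_per_mbps=0.50):
    """Cost of buying IP transit for traffic that cannot be peered."""
    return traffic_gbps * 1000 * price_per_mbps

def monthly_second_site_cost(colo=8000.0, port=1500.0, cross_connects=600.0):
    """Recurring cost of a deployment in a second facility to peer directly."""
    return colo + port + cross_connects

traffic = 50  # Gbps destined for networks outside the first exchange
transit = monthly_transit_cost(traffic)
second_site = monthly_second_site_cost()

print(f"Transit: ${transit:,.0f}/mo, second site: ${second_site:,.0f}/mo")
print("Build out" if second_site < transit else "Buy transit")
```

At high enough traffic volumes the fixed cost of a second deployment quickly beats per-megabit transit, which is exactly the calculus that makes heavy-traffic companies want more peering options.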
First we take Manhattan, then we take … San Francisco (Berlin’s been taken)
Two European exchange operators, AMS-IX and LINX (operator of the London Internet Exchange), have set up shop this year in the U.S. as a result of Open-IX. Not quite following the battle plan in the Leonard Cohen song, they started on the east coast, in the New York and D.C. metros, and earlier this month, one of them expanded to the west coast. A third European operator, DE-CIX, entered the U.S. market as well, but separately from Open-IX.
An AMS-IX Bay Area Internet exchange went live in Digital Realty Trust’s data center at 365 Main Street in San Francisco in early September. As of last week, when we spoke with Witteman, AMS-IX Bay Area had written commitments from a handful of parties and verbal commitments from about 15 more to peer in San Francisco. In the New York metro, where AMS-IX exchange lives in three Manhattan data centers and one facility in New Jersey, there were seven customers up and running and close to 30 had signed contracts, Witteman said.
 AMS-IX Bay Area equipment inside a cage in one of the meet-me rooms at 365 Main includes two Brocade Ethernet switches and an optical cross-connect made by a Hayward, California-based company called Glimmerglass Cyber Solutions. The set-up costs about $200,000 and is the same across all AMS-IX exchanges around the world, according to AMS-IX CEO Job Witteman.
AMS-IX and its three subsidiaries (one in the U.S., one in Kenya and one in Curaçao) are all operated as non-profit organizations. AMS-IX, the parent company, is owned by exchange members but doesn’t pay dividends. It spends its profits primarily on lowering rates and on projects like international expansion.
A metro-wide exchange model
Digital’s massive colo at 365 Main is the only Open-IX-certified data center in San Francisco at the moment and one of two certified facilities in the Bay Area. The other is in Santa Clara and belongs to Vantage. An AMS-IX exchange can be in multiple data centers within a single metro. If it were to extend beyond metro borders, it would start competing with its customers, Witteman explained.
The beauty of the model is it is not confined to one data center or one data center provider in a metro. AMS-IX New York is in data centers operated by Digital Realty, Sabey, DuPont Fabros and the group of partners that owns the 325 Hudson building in Manhattan. If you’re peering on the exchange in one of the buildings, you can peer with companies in any of the other ones.
AMS-IX Bay Area is also not going to be limited to 365 Main. Within the next several months it will extend to CoreSite’s San Jose data center at 55 South Market, Witteman said. Yes, CoreSite. Between CoreSite and Equinix, the latter was “braver” in the game of chicken with Open-IX founders, he said. Two CoreSite employees are registered Open-IX members. Neither AMS-IX Bay Area nor CoreSite’s San Jose facility are Open-IX certified yet.
AMS-IX is after the same big users in the Bay Area that peer in the 12 data centers the Amsterdam Internet Exchange is spread across in the Netherlands, Witteman said, referring to them as the “usual suspects.” These are the likes of Netflix, Google, Facebook, Microsoft, Twitter, Akamai and LinkedIn, among others.
It is easy to see why Equinix would not be into the idea of Open-IX. The company’s business model relies heavily on the “ecosystems” of companies it hosts in its data centers. Access to these ecosystems is a big differentiator for Equinix and is the main thing that makes its data centers so attractive.
IXs sweeten the colo pot
Hosting an Internet exchange, especially a well-populated one, is a major selling point for data center providers, which explains the large number of Open-IX-certified data centers compared to the number of certified exchanges. There are about 20 facilities around the U.S. that have been certified, and only two exchanges: AMS-IX NY and LINX NoVa (a Northern Virginia exchange operated by a subsidiary of the company that operates the London Internet Exchange).
John Sarkis, general manager of colocation and connectivity at Digital Realty, said the company has about 250 customers in the Bay Area, each of which now has access to the AMS-IX exchange. Not that selling space at 365 Main is difficult (the facility is about 80 percent full, according to Asa Donough, director of property operations at the building), but Sarkis still nodded to Bay Area rivalry with Equinix. “This is an alternative exchange to Equinix,” he said. “I’d like to highlight that. There are options now.”
3:00p
Trends in Automation: The Lights Out Data Center
The “Lights Out” data center concept has been around for quite some time, with Data Center Knowledge reporting HP’s plans to move to “Lights Out” in 2006 and AOL’s launch of “Lights Out” facilities in 2011.
However, the practice has not filtered down to smaller data centers or colocation spaces in a widespread fashion. Are you interested in learning more about the benefits of “Lights Out”? You might want to attend the session by data center architect Jamie Fogal of CareTech Solutions at Orlando Data Center World next month.
Data Center Knowledge asked Fogal for his thoughts on the trend.
“I think that as we look at trends, many of the “lights out” goals of colo providers and data center organizations revolve around becoming more green and managing power utilization,” he said.
Speaking personally, he added, “Our goals in the “lights out” spectrum are two-fold. The first is an obvious opportunity to decrease our consumption of power. The second is the improved experience for our clients. It’s not just coincidental that the two goals can be achieved with one initiative. As a colo provider in the healthcare space, we are always facing the challenge of adding more value to our current offerings.
“As part of our data center services portfolio, we wanted to offer a colocation service that returns control to our clients in a way that provides the insight and functionality of an on-site facility, while employing our best practices in infrastructure design and environmental controls. In this way, our DCIM monitoring solution allowed us to deliver greater visibility and control than our clients had previously experienced. At the same time, we are able to bill them based only on their actual power usage rather than billing for the complete circuit.”
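Fogal’s point about billing on actual power draw rather than the full provisioned circuit can be illustrated with a toy calculation. The rates, readings and function names below are invented for illustration; they are not CareTech’s actual billing model.

```python
# Toy comparison of circuit-based vs metered colo power billing.
# Rates and readings are invented for illustration only.

HOURS_PER_MONTH = 730
RATE_PER_KWH = 0.12  # hypothetical blended $/kWh

def circuit_bill(circuit_kw):
    """Bill for the full provisioned circuit, regardless of actual draw."""
    return circuit_kw * HOURS_PER_MONTH * RATE_PER_KWH

def metered_bill(readings_kw):
    """Bill from actual metered draw, e.g. hourly DCIM power readings."""
    return sum(kw * RATE_PER_KWH for kw in readings_kw)

# A 5 kW circuit that averages only 2 kW of real load:
readings = [2.0] * HOURS_PER_MONTH
print(f"Circuit-based: ${circuit_bill(5.0):,.2f}")
print(f"Metered:       ${metered_bill(readings):,.2f}")
```

A lightly loaded circuit is the common case, which is why metered billing backed by DCIM monitoring reads as a customer-friendly differentiator.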
Like all slightly different approaches to the traditional data center set up, there are risks and opportunities to consider.
Fogal said, “I think there are unique advantages in the space of colocation providers to create a data center experience that is more of an extension to the client’s in-house operations rather than creating a separation of the control they would otherwise be accustomed to.”
However, there could be risk in complacency or over-reliance on automation, he noted. “The risks in creating a true “lights out” facility lie partially in the fact that people eventually become reliant on the autonomy of the data center. After all, most of the time everything runs very smoothly without any cause for alarm,” he said.
“This consistent reliability can result in employment of a scaled back workforce or one that has a lesser skill set in data center management. In some cases, I’ve seen the afternoon and midnight shifts of a 24/7 data center left to a security guard with no real knowledge of the facility other than the ingress and egress processes he/she is expected to enforce.”
Discuss Trends in Automation
Want to explore this topic more? Attend the session “Flip the Switch on Your Lights-Out Colo Facility with DCIM Real-Time” or dive into any of the other 20 trend-focused sessions curated by Data Center Knowledge at the event. Visit our previous post, Future Data Center Trends.
Check out the conference details and register at the Orlando Data Center World conference page.
3:30p
Exploring International Data Center Development
Callum Wallace is a Consultant in the TMT practice at Berwick Partners (an Odgers Berndtson Co.)
Scaling a business internationally is hard. Scaling a data center business which is mission critical and capital intensive is potentially overwhelming. Fortunately it is also immensely rewarding.
Creating winning strategies for international data center development is not overly challenging. I can, for example, suggest with a high degree of confidence that building a network of data centers across Nigeria’s key cities is a good idea. There is an enormous demand for secure, resilient data centers and a systemic shortage of supply. Executing the strategy, however, is somewhat more challenging.
The usual questions involved in developing any data center asset are also present when expanding internationally: target market, modular or traditional, connectivity and so on. However, the focus soon turns to less typical international questions: geopolitics, language, time difference, tax and local partnerships. The net result is a combination of options so daunting that most choose not to pursue it.
For the more determined, I wanted to share my experience in a few key areas of international data center development that deal with organization structure and the finance function.
Organization structure: matrix vs. in-country
There is regular debate around the merits of global matrix reporting structures versus in-country “absolute command.” In a mission critical environment the most successful approach I have seen is a matrix structure with a first among equals “Country Head.” The leadership for sales, operations, marketing, finance and HR would be based at headquarters.
The overarching reason for advocating matrix structures in data center environments is to encourage effective operational systems and a unified sales effort while avoiding silos.
Operational: maintaining operational accountability and consistent adherence to best practice
- In the data center environment, accountability is key. Successfully empowering in-country operational staff to make decisions is potentially challenging. Nevertheless, when done effectively it will encourage rapid decision making in line with global operational best practices. This will ensure timely and effective local operational management without muddying the waters with in-country preferences.
- An additional benefit is that global teams will encourage mind-share around best practice and can make key operational staff feel part of a larger mission critical team.
Commercial: ensuring consistency to clients
- In global data center teams, multiple country heads proposing differing technical and commercial propositions to different parts of the same global client is common but not desirable.
- When selling to multi-nationals, the benefits of standardization and a consistent commercial value proposition will pay dividends. There should be the scope to encompass regional commercial and regulatory demands but in a globalized world the client will inevitably become irritated with wildly deviating services.
- If you grow to a point where creating market vertical teams becomes worthwhile – it is much easier to organize your teams if you are in a matrix organizational structure.
Fiefdoms:
- There are many parts of an in-country operation that need to be aligned to local culture and buying practices. In my experience, what should be minor tweaks inside a wider strategy are often taken too far. These can become so extreme that, internally and externally, the local business loses touch with the core tenets, philosophy and commercial model of the wider organization. A matrix structure should maintain the global vision of the business while allowing for necessary in-country tweaks.
- Matrix management has the added benefit of avoiding the entrenchment of all-powerful country MDs who can become problematic to manage as their influence increases.
A critical component: the finance function
The finance function is a critical component of any leadership team, but during international data center expansion its importance intensifies.
The complexity of handling business across borders and in emerging economies is potentially intimidating. Developing data centers is a highly capital-intensive business, which inherently reduces the room for taking chances. The finance team cannot make mistakes and must have deep and up-to-date knowledge of the intricacies of modelling, investing, tax and recouping in-country profits.
Finding talent that has operated in all of your target markets is not critical. However, they must know who to turn to for accurate advice and put the appropriate checks and balances in place to avoid unforeseen challenges in new territories.
If there is uncertainty about your finance function’s international capability, you may want to conduct a review of other available talent in the market. Given the importance of finance and the potential risks of ‘learning on the job’, I would typically advocate hiring externally (either on a permanent or interim basis).
Effective strategy to meet demand with supply
The above is a brief synopsis of two areas of importance when building an international data center business. In terms of people and structure there are of course many more questions to answer:
- Should we export known talent or grow and develop local talent?
- If the business crosses continents should we have regional heads, if so who should they report to?
- Within the boundaries of local laws should we have standard contracts for staff?
Every situation and country will throw up different challenges but with the right people in the right place (internally and externally) you can move much closer to executing a simple and effective strategy of meeting demand with supply.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
4:00p
A Comparison: Low-voltage and Overlay Lighting Control Systems
Owners and lessors of U.S. offices typically spend over 40 percent of their total electricity bill on lighting. It is now possible to achieve up to an 80 percent reduction in those costs through advanced lighting control systems and LED lighting technology.
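Taken at face value, those two figures compound: an 80 percent cut to a line item that makes up 40 percent of the bill trims roughly a third off the total. A quick arithmetic check:

```python
# Quick arithmetic behind the headline figures: lighting is ~40% of the
# electricity bill, and controls plus LEDs can cut that share by up to 80%.
lighting_share = 0.40      # portion of total electricity spend on lighting
lighting_reduction = 0.80  # best-case reduction of the lighting portion

total_bill_savings = lighting_share * lighting_reduction
print(f"Savings on the total bill: {total_bill_savings:.0%}")  # 32%
```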
Almost every lighting control system available today is an overlay system. An overlay system uses either a wireless or wired network to control ac line voltage powered lights. These systems are called overlay systems because the control network is overlaid on top of the existing ac lighting power infrastructure.
In contrast, low-voltage lighting control systems supply an intelligent low-voltage network that both powers and controls LED light fixtures in a single system, in addition to providing building intelligence.
This whitepaper from CommScope compares low-voltage lighting platforms to overlay systems and demonstrates their higher energy savings (averaging 75 percent), lower installation cost, flexibility to offer a broader range of functionality, and their enhanced ability to scale in larger deployments.
Many trends have recently emerged that are resulting in the increased availability and adoption of lighting control systems in office spaces. Lighting control systems for offices have been available for over a decade, but it wasn’t until recent years that the number of available solutions began to rise. This marked increase is due in part to the following trends:
- Increasingly stringent building codes
- Utility incentives
- Rising electricity costs
- Growing environmental awareness
- LED lighting
Download this whitepaper today to learn how low-voltage control systems have superior benefits for commercial spaces being newly constructed or those that are undergoing renovations and desire a reliable, highly energy efficient controls solution. Low-voltage benefits include:
- Installation and commissioning costs
- Reliability
- Performance considerations
- Capabilities
- Measuring energy savings
Low-voltage and wired overlay solutions are better suited for environments that are sensitive to noise, those that have high security safeguards, or those where lighting is mission critical to the tasks being performed. On the other hand, wireless overlay may be suitable for smaller, non-mission-critical areas or for applications where accessing wiring spaces is cost prohibitive. The key is understanding the differences between low-voltage and overlay control systems.
4:10p
Amsterdam’s Switch Buys Woerden, Netherlands, Data Center
Amsterdam’s Switch Datacenters has purchased the former ABN AMRO data center in Woerden, Netherlands (about 25 miles south of Amsterdam). The facility is leased to a large unnamed cloud provider. The purchase price was €11 million.
The entire complex consists of two buildings with a total surface area of 134,550 square feet. The data center building accounts for 59,200 square feet. In total, the company’s data center footprint is about 150,000 square feet.
The location will complement its existing data center in the South East business district of Amsterdam. Switch will install a direct fiber-optic connection between Amsterdam and Woerden so that data can be easily mirrored between the two data centers. Woerden will operate as a neutral Internet interchange for the surrounding region.
Switch will upgrade cooling, power capacity and efficiency of the data center to mimic the efficiency it sees in the Amsterdam facility, where one of the techniques it uses is the unusual practice of submerging server motherboards into dielectric fluid for more efficient cooling. The company said it uses 100 percent green energy and also leverages efficient cooling from StatiqCooling, an evaporative cooling systems vendor.
Switch puts a lot of research into cooling. In 2012 it launched an incubator to examine and test the technology.
Founder and CEO Gregor Snip said the “data center is an important addition to the product supply of Switch Datacenters that is concentrated on the larger cloud environments and customers with a need for twin data centers.”
Amsterdam often acts as an ideal location for European expansion. The Netherlands is well-connected and has a skilled labor force, making it a good landing spot for overseas companies looking to establish an initial European presence.
4:44p
Cologix Integrates Modius DCIM, Canara’s Battery Monitoring and CRM Into Single Management System
Data center colocation services provider Cologix has launched Cologix Command, a measurement and monitoring solution integrated with its customer relations management (CRM) platform. It combines the CRM with a data center infrastructure management (DCIM) tool by San Francisco-based Modius and a battery monitoring solution by Canara, based in San Rafael, California.
The setup is designed to benefit both the Cologix team and customers. The team gets platform-wide data and insights needed to confidently manage each data center. The platform interfaces with customers through an integrated ticketing system, advanced maintenance notifications and event management capabilities.
A centralized dashboard shows real-time, historical and predictive data across the data center platform. Displayed is the operation of uninterruptible power supply systems, backup generators, cooling systems, batteries, environmental controls and other critical data center equipment.
Data collection from the data centers is done by Modius’ Open Data platform. Canara’s solution not only monitors data center backup power systems, but also provides some predictive analytics tools to forecast and prevent upcoming UPS battery failure.
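The article doesn’t describe Canara’s methods, which are proprietary, but predictive battery analytics of this kind is commonly sketched as trend extrapolation: fit a line to measured internal resistance over time and project when it will cross a failure threshold. A minimal illustration, with hypothetical readings and threshold:

```python
# Minimal trend-extrapolation sketch of predictive battery monitoring.
# Canara's actual analytics are proprietary; the readings and failure
# threshold below are hypothetical.

def days_until_threshold(readings, threshold):
    """Fit a least-squares line to (day, internal_resistance_mohm) samples
    and return the projected day the resistance crosses the threshold,
    or None if resistance is not trending upward."""
    n = len(readings)
    xs = [d for d, _ in readings]
    ys = [r for _, r in readings]
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in readings) / \
            sum((x - mean_x) ** 2 for x in xs)
    if slope <= 0:
        return None  # resistance flat or falling; no projected failure
    intercept = mean_y - slope * mean_x
    return (threshold - intercept) / slope

# Internal resistance creeping up ~0.1 mohm/day from a 5.0 mohm baseline:
samples = [(0, 5.0), (10, 6.0), (20, 7.0), (30, 8.0)]
print(days_until_threshold(samples, threshold=12.0))
```

An operator would replace a battery well before the projected crossing date, which is the point of forecasting failures instead of reacting to them.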
Cologix did not say which CRM solution it uses, but did say that the solution was cloud-based.
“There are thousands of measured and calculated data points within a data center which, without DCIM, are nearly impossible to effectively analyze,” said Val Milshtein, vice president of technology, Cologix. “We now have the ability to gain insight at a high-level or granular basis to the state and performance of every piece of equipment supporting a customer service with a few mouse clicks.”
Cologix now operates network-neutral data centers in Ohio, Texas, Florida and Minnesota, as well as in three major Canadian markets: Montreal, Toronto and Vancouver. It recently acquired Colo5, greatly expanding its footprint in Florida, and DataCenter.BZ, expanding into Columbus, Ohio.
5:05p
IBM Intros Freemium Watson for the Masses
IBM has introduced Watson Analytics for everyone. Now available as a freemium option, the new IBM Watson offering brings together a set of self-service enterprise data and analytics capabilities on the cloud, including access to data warehousing and data refinement services.
The natural-language-based cognitive service provides instant access to powerful predictive and visual analytics tools for business. The service is meant to use predictive analytics to surface key relevant facts and uncover unforeseen patterns and relationships.
It is hosted on the SoftLayer data center and cloud platform, and the first release includes a freemium version, designed to run on desktops and mobile devices.
This is analytics for the masses, unlike the recently released Discovery Advisor, an IBM Watson offering targeted specifically to research and sciences.
Most analytic offerings assume users have data ready for analysis, a clear idea of the type of analysis needed and the skills and time to build a model for analysis. However, many business users have none of these things. Finding and validating data takes a tremendous amount of time, and there’s further struggle with identifying what analysis would be relevant.
IBM is pitching Watson Analytics as a way to do a lot of the heavy lifting, automating steps like data preparation, predictive analysis and visual storytelling, and providing the visuals you ultimately need to convey the findings to others. Watson in general has already proven itself to be very powerful, and now the goal is to make it easy to use.
A user identifies a problem and Watson Analytics can help them source the data, cleanse and refine it, discover insights, predict outcomes, visualize results, create reports or dashboards and collaborate with others. It does this through its natural language capabilities.
It can understand natural language questions like “What are the key drivers of my product sales?” It also produces results that explain why things happened and what’s likely to happen, all in familiar business terms. Results can be interacted with and questions can be fine-tuned to delve deeper.
“Watson Analytics is designed to help all business people – from sales reps on the road to company CEOs – see patterns, pursue ideas and improve all types of decisions,” said Bob Picciano, senior vice president of IBM’s Information and Analytics Group. “We have eliminated the barrier between the answers they seek, the analytics they want and the data in the form they need.”
5:54p
Report: Twitter Signs for Another 80,000 SF in Atlanta With QTS
Twitter has signed for more data center space in Atlanta with QTS, Atlanta Business Chronicle reported, citing anonymous sources.
The company has taken 80,000 square feet of additional space for Twitter servers in the data center provider’s massive, 1 million-square-foot Atlanta-Metro data center. The San Francisco-based microblogging platform company has been in this facility since at least 2011, starting with a 50,000-square-foot deployment and adding another 100,000 square feet in 2012, according to reports.
Unlike a handful of other Internet giants, such as Facebook, Google, Apple and Microsoft, Twitter has traditionally stayed silent about its data center infrastructure. Sources have told us that in addition to the Atlanta facility, Twitter servers also occupy a lot of space on the massive RagingWire data center campus in Sacramento, California, where the company has had a presence since 2010.
While lower-profile than New York, Silicon Valley, Dallas and Chicago, Atlanta is one of the hottest data center markets in the U.S. There is plenty of network connectivity and power is cheap.
QTS has two facilities in the metro (the other one is in nearby Suwanee), totaling about 1.3 million square feet and 100 megawatts of power. Equinix has a presence in Atlanta, as do Telx, Level 3, Internap and CenturyLink.
Google and Digital Realty Trust own massive data centers in the area.
In July, Zayo-owned data center provider zColo entered the market with the acquisition of a data center called AtlantaNAP. Earlier this month, Peak 10 announced it had opened its third data center in the market.
6:22p
Football Legends Ditka, Singletary Attend Windstream’s Chicago Data Center Grand Opening
Cloud and managed hosting provider Windstream Hosted Solutions held a grand opening of its Chicago data center earlier this month. The 20,000-square-foot facility, launched late last year, has more than 15,600 square feet of raised floor and is fully equipped to support Windstream’s offerings.
The grand opening included facility tours and a meet-and-greet with National Football League legends Mike Ditka and Mike Singletary. Ditka is the idol of Bill Swerski’s Chicago Superfans.
The new data center features:
- 3,600 kVA of utility capacity, expandable to five times that amount
- 800 tons of cooling capacity to keep inside temperatures constant at an average of 74 F
- Redundant 10Gbps OC-192 circuits, connecting each center to multiple Windstream core POP sites with fully redundant peering
- On-site Network Operations Centers (NOCs), fully staffed 24 x 7 x 365, providing facilities and network monitoring, security and technical and remote hands support
Chris Nicolini, Windstream’s senior vice president of cloud and data center operations, said, “In essence, the Windstream Hosted Solutions team becomes an extension of our customers’ information technology staff, handling their maintenance tasks, providing compliance support for industry regulations, keeping critical data protected and making sure their business operations continue in the event of a disaster.”
Among the company’s offerings is a new EMC-powered storage service. The collaboration with EMC offers customers the choice to store or back up data to multiple dedicated or shared EMC platforms in cloud-enabled data centers.
Chicago continues to be a hot data center market with continued activity both in the city proper and the suburbs.
“The Chicago multi-tenant data center market has experienced a supply/demand imbalance in the western suburbs, as well as in downtown Chicago, for several years now,” said Rick ‘The Hammer’ Kurtzbein, research analyst at 451 Research. “Windstream’s new … facility offers enterprise-quality colocation services, as well as a suite of managed hosting and cloud infrastructure services, providing Fortune 1000 companies with data center services that we expect will be well-received in the active Chicago MTDC market.”
6:52p
Microsoft Considers German Data Center Amid US Attempt to Access EU-Held Data 
This article originally appeared at The WHIR
On Sunday German news outlet Tagesspiegel reported that Microsoft is considering a German data center. With recent court rulings, public concern over government access to private information after the Snowden revelations and the EU diligently working on new data protection and privacy regulations, it is more important than ever for companies to consider local data sovereignty laws.
Microsoft’s German head Christian Illek told the Tagesspiegel that keeping data within Germany would make the data subject to German and European law, circumventing the attempt of the NSA and US courts to obtain private data.
The ability of local and international governments to request or demand information by warrant will strongly affect cloud and hosting businesses going forward. According to a report by the Information Technology & Innovation Foundation (ITIF), US providers may lose billions in the wake of the Snowden revelations. The New America Foundation had similar findings.
In July, the US Department of Justice issued a warrant to Microsoft to hand over customer data stored in its Irish data center. US District Judge Loretta Preska ruled that Microsoft must provide the information despite the data being stored on a foreign server. She determined that since the data was controlled and stored by a company based in the United States, the US court has jurisdiction regardless of where it resides.
Microsoft is so serious about protecting the data that, by its own request, it is being held in contempt of court so it can move forward immediately with an appeal. “The only issue that was certain this morning was that the District Court’s decision would not represent the final step in this process,” said Brad Smith, Microsoft’s top lawyer, in a statement in July. “We will appeal promptly and continue to advocate that people’s email deserves strong privacy protection in the US and around the world.”
This ruling, and how it plays out on appeal, has obvious implications for US companies that do business globally.
Germany is one of the biggest tech economies in Europe and has some of the strongest data protection laws, refined in 2009 to reflect the changing internet climate.
A German data center would allow Microsoft to more easily prove compliance with local law. Amazon began considering a similar strategy in April, and The Register reported in July that the strategy is now coming to fruition.
This article originally appeared at: http://www.thewhir.com/web-hosting-news/microsoft-considers-german-data-center-amid-us-attempt-access-eu-held-data | 7:09p |
Amazon to Reopen AWS Training Hangout for Startups in SF
Amazon is preparing to reopen its AWS Pop-up Loft in San Francisco in October, where it will offer expert technical advice, AWS training courses and labs for startups using its cloud. Besides being a space for learning, the loft on Market Street will also be a place to network, hang out and work, the company said.
AWS opened the loft for one month in June, but later closed it to build it out. Ariel Kelman, vice president of worldwide marketing for AWS, said the space was very well attended during that temporary opening.
“During that month, thousands of people visited the AWS Loft for hundreds of hours of appointments, sessions, training, and hands-on labs to gain a deeper understanding of AWS,” he said.
Among the startups that attended and were happy that they did were folks from CircleCI, a testing and deployment platform for web application developers; Coin, a company working on a chameleon credit card device, a sort of virtualized credit card that can switch between up to eight different cards; and Twilio, a platform for building voice, VoIP and SMS apps.
The Pop-up Loft is sponsored by Intel and Chef. Intel is planning to demo some of its recent technologies and host talks. Chef will provide DevOps-oriented talks and training in the space.
Amazon didn’t say how long the loft would stay open this time around, but the event schedule on its website extends from October through early November. | 8:51p |
Larry Ellison Steps Down From Oracle CEO Position
Larry Ellison, Oracle’s flamboyant co-founder and CEO for more than 30 years, has stepped down from the chief executive role, the company announced Thursday. He will remain part of the company’s executive leadership team as CTO and executive chairman of the board.
Oracle presidents Safra Catz and Mark Hurd have both been appointed to the CEO position to replace Ellison. Sales, service and vertical industry business units will continue to report to former HP CEO Hurd, while manufacturing, finance and legal functions will continue to report to Catz.
Ellison will continue overseeing all software and hardware engineering functions. Jeff Henley, who had been the company’s chairman until Ellison’s appointment as executive chairman, will move into the role of vice chairman of the board.
The company did not immediately explain the changes, releasing only a statement from the Oracle board’s presiding director Michael Boskin saying Ellison had made it clear that he wanted to continue working full time on product engineering, technology development and strategy.
“Safra and Mark will now report to the Oracle board rather than to me,” Ellison said in a prepared statement. “All the other reporting relationships will remain unchanged.
“The three of us have been working well together for the last several years, and we plan to continue working together for the foreseeable future. Keeping this management team in place has always been a top priority of mine.”
Oracle announced the changes simultaneously with the release of its earnings report for the recently completed first quarter of fiscal 2015. The company’s revenue for the quarter was $8.6 billion, up 3 percent year over year. Its net income was $2.2 billion, unchanged compared to the first quarter of fiscal 2014.
Fiscal Q1 2015 revenue by business:
Software-as-a-Service (SaaS) and Platform-as-a-Service (PaaS) cloud: up 32% to $337 million
Infrastructure-as-a-Service (IaaS) cloud: up 26% to $138 million
Hardware systems revenue: down 8% to $1.2 billion
We’ll be updating this story as more details become available. | 11:08p |
Oracle Execs: Ellison’s Title Change Won’t Change Anything
Oracle executives spent a good chunk of the time allotted for Thursday’s quarterly earnings call – which followed the announcement that Larry Ellison was stepping down as the company’s CEO – trying to convince analysts that nothing will change at Oracle in practice, and that Oracle cloud will eventually conquer the world.
“There will actually be no changes,” Safra Catz, Oracle’s former CFO and one of its two new CEOs, said. “No changes whatsoever.”
While she will no longer be referred to as CFO, her team will continue reporting to her as they have been. Her title is now CEO and principal financial officer, and the company will not be looking for a replacement.
“We will not be hiring a CFO, and my team will continue reporting to me,” Catz said. The functions that report to her – manufacturing, finance and legal – will continue to do so.
Former HP CEO Mark Hurd will share the Oracle CEO throne with Catz. This will be the first time anyone but Ellison has sat in it.
Ellison took Hurd on as company president after Hurd’s ouster from HP in 2010 amid a scandal over allegations of sexual harassment, which were later found unsubstantiated, and misuse of company resources.
While his title has changed, Hurd’s responsibilities and direct reports will also remain the same. All sales, service and vertical industry business units will continue to report to Hurd.
Ellison: “You should be so lucky. I’m staying on the calls.”
Ellison, whose title has changed to CTO and executive chairman of the board, will still be overseeing all software and hardware engineering functions and will even continue participating in quarterly earnings calls. “You should be so lucky,” he said, replying to a comment from an analyst who said he will miss Ellison on the calls. “I’m staying on the calls.”
None of the executives gave a clear explanation for the title changes. The company’s stock fell more than 2 percent in after-hours trading following the announcement.
New Oracle cloud database service to launch at OpenWorld
As if to illustrate how little will change, Ellison made a new product announcement on the call, held in the run-up to Oracle OpenWorld in late September and early October. Announcing new products a week or so before the massive San Francisco conference has become an annual tradition of his.
This time around, he announced that the company will be rolling out a new cloud database service. The pitch is that customers will be able to move their applications from their own data centers into the Oracle cloud “with a push of a button.”
The applications will automatically become multi-tenant applications, and data will automatically be compressed and encrypted. “No reprogramming is required,” Ellison said.
“Database is our largest software business, and database will be our largest cloud service business,” he added.
The company, which grew into an enterprise software giant, has been putting a lot of effort into growing its cloud services business, recognizing that the share of business software deployed on premises is shrinking while the share delivered as cloud services is growing.
“We’re focused on becoming number one in the cloud, being bigger than Salesforce in the cloud,” Ellison said, referring to one of the world’s largest providers of business software services in the cloud.
Cloud services business grows, but it’s only 4% of revenue
Oracle’s cloud services business grew in the recently completed first quarter of fiscal 2015, but it remains a very small portion of the company’s overall business, which still relies heavily on sales of on-premises software licenses and on updates and support for those licenses.
Oracle’s Software-as-a-Service and Platform-as-a-Service revenue grew 32 percent in the first quarter, reaching $337 million. Infrastructure-as-a-Service revenue grew 26 percent, reaching $138 million. Today, the former category is responsible for 3 percent of total revenue, and the latter for 1 percent.
The company’s total revenue for the quarter was $8.6 billion, up 3 percent year over year. Its net income was $2.2 billion, unchanged compared to the first quarter of fiscal 2014.
Company misses Q1 EPS guidance
“Q1 is a seasonally smaller quarter, which can mean more volatility in our results, and that’s what we saw in this quarter,” Catz said. The company’s earnings per share for the quarter were $0.48, slightly below the lower end of its previously forecasted range of $0.49 to $0.53.
Revenue from new software licenses was down 2 percent, but revenue from software license updates and software support was up 7 percent. As more enterprise software moves to the cloud, Oracle expects this update and support revenue to shrink, Catz said.
Execs optimistic about shift to the cloud
But the company expects to benefit from the shift nevertheless, as it expects to provide the cloud services those users will turn to. Instead of making money on updates and support, Oracle leadership hopes to make money on providing the complete infrastructure stack as a service.
“We’ll also be providing the hardware, the application management and complete operations,” Catz said. These trends will not have a substantial impact on the company’s bottom line immediately but will become more pronounced over the medium to long term, she said. |