Data Center Knowledge | News and analysis for the data center industry
Tuesday, January 31st, 2017
| 5:12p |
Datto Acquires Managed Networking Firm Open Mesh
Datto is expanding its MSP-friendly product line through the acquisition, announced Tuesday, of Portland, Ore.-based networking provider Open Mesh. Terms of the deal have not been disclosed.
Datto senior product marketing manager Scott West says Open Mesh’s cloud-based networking solutions will be complementary to Datto’s offerings, which are distributed exclusively through its MSP partners.
“Out the door, our CEO [Austin McChord] has made the claim that this is probably the most MSP-centric networking that exists,” West tells Talkin’ Cloud in an interview.
The addition of Open Mesh brings more networking capabilities, including Ethernet switching and access point technologies, to Datto’s networking product portfolio, including its Datto Networking Appliance. With the Open Mesh acquisition, Datto has launched Datto Networking, a complete portfolio of networking solutions.
“[Our MSP partners] work with a number of physical servers and virtual servers, and they’re in a lot of SaaS applications and cloud servers that are out there. Most of our MSP partners also sell networking technology to combine and connect all of those servers and those systems together,” West says. “We’ve been in conversations with our partners for a while and they have been asking us to provide more capabilities and give them the opportunity to add more value to what they deliver in the marketplace. As we looked at our product line we were missing that connectivity piece, which is really networking.”
When looking at the networking market prior to the acquisition of Open Mesh, West says that Datto didn’t see many options designed specifically for MSPs. Most networking solutions are designed for the enterprise user, which can require significant investment for an MSP to get certified.
“In many cases the MSP is not a large service provider themselves,” West says. “They may have a technician, they may have 15-20 for some of the larger ones, but they’re very sensitive to how they’re able to deliver services, and any way that they can optimize products that they deliver and not have to go onsite if possible or waste time and energy and money in managing and deploying systems, that’s a good fit.”
Founded in 2007, Open Mesh currently manages more than 90,000 networks. The company’s entire team will join Datto as a subsidiary company, and will continue operating in Portland. West says that the percentage of joint partners between Datto and Open Mesh is fairly low.
“The difference is that now they’re owned by Datto and obviously they get a lot of support from us as a company, but we also have the ability to take their product line and fold it into Datto and create a whole new set of products that Datto can deliver to our partners,” West says.
In a statement, Open Mesh founder Michael Burmeister-Brown said that the company wasn’t looking to be acquired but found a perfect fit with Datto; “When we met the team at Datto, we discovered a company with shared DNA. The common vision, culture and commitment to IT service providers is uncanny. Datto is disruptive, innovative and extremely customer focused. While we weren’t looking to be acquired, we’ve found a natural home within Datto that positions our partners, our customers and Open Mesh for continued growth and success.”
The acquisition comes at a time of growth for Datto, which had a 30 percent increase in MSP partner growth in 2016, and saw partners and resellers signing on from all parts of the world.
“The company has been growing considerably, a lot of that we owe to relationships with our channel partners,” he says.
“The plan is to continue to add new capabilities to help our MSP partners to continue to add value and make money on top of networking services.” | | 5:42p |
Toshiba Asset Sales After Chips Spinoff Will Cut to the Bone
Pavel Alpeyev (Bloomberg) — Toshiba Corp. is looking to sell more assets to repair a balance sheet facing multibillion-dollar writedowns. The conglomerate has more than 600 different businesses, but raising cash from the fire sale will be far from simple.
While Toshiba’s two biggest enterprises are nuclear reactors and semiconductors, the Tokyo-based company also has its hands in a wide range of endeavors, including elevators, a general hospital, software services and light bulbs. It even had a lettuce-growing factory. Making matters worse, a round of asset sales following Toshiba’s 2015 accounting scandal eliminated many of the easy choices.
Satoshi Tsunakawa, Toshiba’s president, said the company is looking at selling shareholdings, real estate and other assets in order to come up with capital. “We will keep considering all options as needed and promptly, and take all necessary steps,” he said at a briefing in Tokyo Friday.
Toshiba is set to book a writedown of as much as 700 billion yen ($6 billion) at its nuclear unit, and needs money in a hurry to secure the support of lenders. The company has already indicated that it’s willing to part with 20 percent of its computer-chip unit. Beyond that, there aren’t a lot of businesses that can be easily carved out. The nuclear unit is too much of a mess, and there are no obvious buyers for the smaller entities. All of this raises questions about what will be left of the company after its latest rout.
“There is an unavoidable question of where growth will come from after you strip Toshiba bare,” said Masahiko Ishino, an analyst at Tokai Tokyo Securities.
While Toshiba announced the plan on Friday to separate the chip business by March 31 and raise capital, it didn’t offer any target amounts or specifics. Tsunakawa said more details would be given at an earnings briefing on Feb. 14. On Monday, Toshiba shares fell as much as 5.9 percent in early trading in Tokyo.
The asset sale will only begin to cover the losses from its nuclear operations. Deutsche Bank AG expects that sale to bring in only 200 billion yen, while many of the other assets are in closely held ventures or minority stakes that take time to attract buyers.
“The value of the chips business will be determined in the bidding process,” Yasuo Naruke, Toshiba’s senior executive vice president, said at the briefing.
The reactor maker also said it’s reconsidering plans to accept new nuclear projects, possibly limiting orders to just turbines and related equipment.
The next likely target is Toshiba Elevator & Building Systems Corp., according to Macquarie and Morningstar Investment Services. The maker of escalators and lifts, including the world’s fastest, operating in the Taipei 101 tower, has been part of Toshiba group for 50 years. The unit’s self-contained manufacturing and maintenance operations make it a relatively easier candidate for an asset sale, according to Damian Thong, an analyst at Macquarie Group Ltd.
Toshiba doesn’t disclose earnings from the elevator business, which is part of its building and facilities segment that generated 306 billion yen in revenue in the fiscal first half. It’s part of the larger infrastructure systems and solutions division that also includes sewage, locomotives and air-conditioning systems.
“Elevators and escalators are a profitable business, but there is no visibility on the assets,” Ishino said. “With unlisted entities, you have to go through a due diligence process, which will take time.”
A quicker way to generate cash would be to sell Toshiba’s stakes in publicly traded subsidiaries. Point-of-sale equipment maker Toshiba TEC Corp. is listed, and Toshiba’s 55 percent holding is worth about 100 billion yen. Stakes in chip-equipment manufacturer NuFlare Technology Inc., Toshiba Plant Systems & Services Corp., which specializes in power generation infrastructure, and industrial tool maker Toshiba Machine Co. could add as much as another 150 billion yen.
NuFlare’s expertise in electron-beam masks may be attractive to Tokyo Electron Ltd. and Canon Inc., which has been developing nanoimprint technology, Ishino said. Both companies have also expressed interest in buying a stake in Toshiba’s memory chip business.
Toshiba has also decided to sell Toshiba General Hospital, the Asahi newspaper reported. The 50-year-old facility in central Tokyo has 308 beds and testing services that include two CT and two MRI scanners. Some of the machines were made by Toshiba’s own medical equipment arm, which Canon bought last year for $5.9 billion.
“There is no real reason for Toshiba to hold on to the hospital, so it’s an easy decision to make,” said Kazunori Ito, an analyst at Morningstar. “The question is whether there is a willing buyer.”
What’s left are companies and assets that are more difficult to split off. There is Toshiba Information Systems, which offers chip design and software integration services and has been part of the group since 1968, and Toshiba Lighting & Technology Corp, a light-bulb business that’s more than a century old. TMEIC, a joint venture with Mitsubishi Electric Corp., makes inverters and motors for use in energy and industrial applications.
“The sale is easier if you have a distinct entity, which is often not the case with Toshiba,” Thong said. “Many of these companies depend on the head office for core functions like HR, procurement, accounting. That’s why some of these sales may be tricky.” | | 6:52p |
Digital Bridge-Backed DataBank Buys Cleveland, Pittsburgh Sites from 365 Data Centers
DataBank, an emerging national colocation services player, has added two more data center markets to its portfolio, acquiring facilities in Cleveland and Pittsburgh from 365 Data Centers.
Last July, Dallas-based DataBank was acquired by investor Digital Bridge Holdings, for whom the deal represented an entry into the data center market. Around the same time, Michael Foust, former CEO of Digital Realty Trust, joined as chairman. Since then, the company has been expanding via acquisition.
DataBank considers the data centers in Cleveland and Pittsburgh “key interconnection assets” and plans to leverage them as “anchors” for further expansion in the two markets.
Counting the two former 365 facilities, DataBank is now in six data center markets. Earlier this month it entered Salt Lake City by acquiring local provider C7 Data Centers.
Just last week, Reuters reported that Boca Raton-based Digital Bridge is close to announcing the acquisition of Silicon Valley-based Vantage Data Centers, which has data center campuses in Santa Clara, California, and Quincy, Washington.
Further reading: 2016 Was a Record Year for Data Center Acquisitions, but 2017 Will Be off the Charts | | 7:30p |
IPv4 in a World of IoT
Mike Hollander is Co-Founder and CEO of MOD Mission Critical.
If you’ve ever seen the movie “Minority Report,” Steven Spielberg’s sci-fi detective thriller set in a dystopian Washington, D.C., in 2054, you may remember a scene in which Chief of PreCrime-turned fugitive, John Anderton, ducks into a crowded shopping mall to escape the long arm of the law.
“John Anderton!” a personalized advertisement calls out as he passes by. “You can use a Guinness, right about now!” Entering a GAP, he sees and hears other simulated barkers clamoring for his attention, spurred by smart sensors.
Elements of this IoT-enabled advertising universe are no longer the stuff of science-fiction. Personalized advertising is a very real and growing technology. And where there is IoT, there is IPv4.
By 2020, 33 Billion Served
Gartner has predicted that by the year 2020, the IoT could reach 26 billion devices. That is apart from the predicted 7.3 billion smartphones, mobile devices, and PCs expected to be in use by that time. Meanwhile, a McKinsey report predicts business-to-business IoT applications will create even more value than pure consumer applications.
Most of the devices we use are connected to a network via Internet Protocol (IP), which requires an IP address. Internet Protocol version 4, or IPv4, describes what the numbers used by every device that connects to the internet must look like, for example, 193.168.0.254. Hence, every computer, laptop, tablet, smartphone and any IoT-enabled device that links to the internet — over 33 billion connections — relies on an IPv4 address. The primary purpose of an IP address is to allow these devices to interact with one another. The IP address of your personal computer is not visible to any other devices but is used to connect to a router, which then uses its own specific IP address to connect to the internet.
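The shape of an IPv4 address can be illustrated with Python's standard `ipaddress` module. This is a minimal sketch, not from the article: it shows that the dotted-decimal form is just a rendering of one 32-bit integer, which is why the whole IPv4 space tops out at 2**32 (about 4.3 billion) addresses, and why home devices typically sit behind a router on a private, non-routable address.

```python
import ipaddress

# The article's example address: four octets rendering one 32-bit number.
addr = ipaddress.IPv4Address("193.168.0.254")
print(int(addr))  # the underlying 32-bit integer

# The highest possible IPv4 address marks the edge of the space.
print(ipaddress.IPv4Address(2**32 - 1))  # 255.255.255.255

# A typical home machine uses a private address (e.g. 192.168.x.x) that is
# not routable on the public internet; the router's public address
# represents it externally.
print(ipaddress.IPv4Address("192.168.0.254").is_private)  # True
print(addr.is_private)                                    # False
```

The `is_private` check is how software distinguishes addresses that only have meaning behind a router from those visible on the public internet.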
If you live alone and use a personal, password-protected wireless router to connect your desktop to the internet, your IP address is linked to just you. At other times, however, your IPv4 address is shared by many and is openly displayed, as when you stop for a caffè mocha at Starbucks, pull out your iPad and connect to the free Wi-Fi to check Facebook. The same situation occurs when you work alongside colleagues on your laptop, connecting to the wireless internet in the conference room of most offices.
Over the last two decades, network, cloud, mobile and IoT technologies have rapidly reduced the number of available IP addresses as each new technology adds devices to the global internet. Moreover, because networks can span buildings, city blocks or even entire metro areas, as in the case of public Wi-Fi, it’s often a difficult process to determine the exact geolocation of an IP address. And that’s where trouble can arise.
We’re Not in Kansas Anymore
Recently, more than 600 million IPv4 addresses were inadvertently mapped to the front yard of a residential location in Potwin, Kansas. To put that number into perspective, that’s the equivalent of the combined populations of Central and South America. Many of the users with whom these IPv4 addresses were associated were involved in nefarious or otherwise desperate activities. As a result, FBI agents, federal marshals, IRS collectors and even an ambulance in response to a suicide call, began frequently showing up at the unknowing, and innocent, homeowner’s front door.
Currently, geolocation information is only available from two sources: MaxMind and HostIP.info. The location database provided by MaxMind is 98 percent accurate on a country level and 70 percent accurate on a city level within a 25-mile radius in the United States. Accuracy information is not available for HostIP.info, although its database is open for user input; hence it is continuously becoming more precise. Because these databases have been tested and used primarily in the U.S., they may be less accurate when dealing with international networks.
Keeping Business From the Bottom of the Lake
In the instance described above, very often MaxMind could only obtain information linking an IPv4 address to the country. When that was the case, it chose the Potwin residence as its default location, a latitude and longitude that was in the center of the U.S. That unfortunate solution, combined with readily available, free programs that can mask IP addresses, meant that 600 million IPv4 addresses became associated with a front lawn.
It’s interesting to note that MaxMind recently shifted its default U.S. location to the center of a lake, west of Wichita.
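The failure mode behind the Potwin story can be sketched in a few lines. This is a hypothetical illustration (the function, record format, and coordinates are invented, not MaxMind's implementation): when a lookup resolves only to country level, a naive database returns one default coordinate for the whole country, so millions of unrelated addresses collapse onto a single point on the map.

```python
# Illustrative country-level fallback coordinates: a point near the
# geographic center of the contiguous U.S., the kind of default that
# landed on the Potwin front yard.
COUNTRY_DEFAULTS = {
    "US": (39.83, -98.58),
}

def locate(record):
    """Return (lat, lon) for a lookup record, falling back to the
    country-level default when no city-level fix is available."""
    if record.get("city_latlon"):
        return record["city_latlon"]
    return COUNTRY_DEFAULTS[record["country"]]

# A city-level hit returns a real fix...
print(locate({"country": "US", "city_latlon": (45.52, -122.68)}))
# ...but every country-only record maps to the same single point.
print(locate({"country": "US", "city_latlon": None}))
```

Every record lacking a city fix lands on the identical coordinate, which is how hundreds of millions of addresses ended up "at" one residence.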
Given the explosive growth anticipated in mobile and IoT-enabled business applications, it’s critical that enterprise and small-to-medium-sized companies confirm that the geolocation records for their IPv4 addresses are correct, since all connected devices depend on a specific geolocation. This is especially significant given the increasing digitalization and globalization of business, particularly across the advertising, e-retail and security sectors, and the increased IPv4 network address requirements of these types of companies.
While IPv6 has been a hot topic for internet providers for the past five years, IPv4, the original address system, isn’t changing anytime soon. And although new allocations through ARIN are no longer available, there are plenty of addresses to be sourced on the grey market or in previously owned space.
The business goal of an IP address plan is to provide IPv4 addresses to end nodes so they can communicate with other nodes across the organization and with internet or partner nodes, over any supported media. Additionally, an IP address plan facilitates network and security management.
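The planning step described above can be sketched with the same standard `ipaddress` module. The block and site names here are assumptions for illustration, not from the article: the idea is simply that carving an organization's private block into per-site subnets gives every end node an address while keeping the layout manageable.

```python
import ipaddress

# An organization's private IPv4 block (illustrative choice).
org_block = ipaddress.ip_network("10.20.0.0/16")

# Split the /16 into /24 subnets (256 of them, 254 usable hosts each)
# and assign them to sites in order.
subnets = list(org_block.subnets(new_prefix=24))
plan = {site: net for site, net in zip(["hq", "branch", "lab"], subnets)}

for site, net in plan.items():
    # num_addresses includes the network and broadcast addresses,
    # so usable hosts are two fewer.
    print(site, net, net.num_addresses - 2)
```

A documented plan like this is also what makes the security-management side tractable: firewall rules and monitoring can reference whole site subnets rather than individual machines.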
The alternative, to have no plan, risks putting the geophysical location of one’s IPv4 address, and one’s business, in the lake.
Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Penton.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library. | | 7:31p |
Radware Buys Seculert, Augments Data Center Security Chops with Machine Learning
Cybersecurity company Radware has acquired Seculert, augmenting its data center security capabilities with Seculert’s machine learning technology, which analyzes systems for threats in advance of an attack rather than analyzing attacks once they’ve already happened.
Seculert’s cloud platform uses large-scale processing and machine learning techniques to conduct behavioral analysis aimed at detecting stealth attack campaigns.
“The Seculert acquisition allows Radware to leverage machine learning technology and its data analytics platform in order to expand our core expertise beyond attack analysis to threat analysis, which provides a panoramic view of the data center’s posture,” David Aviv, Radware CTO, said in a statement. “These capabilities expand Radware’s attack mitigation from real-time and near-time to include detection of stealth attack campaigns.”
Terms of the deal were not disclosed. | | 8:08p |
Sponsored: Data Center Power Best Practices – Designing for Resiliency & Redundancy
The data center industry has seen significant growth over the past few years as more organizations are now working with data center providers to make their businesses more agile. This translates to greater requirements around uptime, resiliency, and cost efficiency.
According to the latest AFCOM State of the Data Center report, 70% of respondents indicated that power density (per rack) has increased over the past three years. Twenty-six percent indicated that this increase was significant.
Because of the critical dependence on data center services, redundancy and uptime are big concerns. There are fairly steady trends around redundant power levels spanning today and the next three years. For example, the report shows that at least 55% already have, and will continue to have, N+1 redundancy levels. Similarly, no more than 5% of respondents either currently have, or will have, 2(N+1) redundant power systems. For the most part, data center managers are using at least one level of redundancy for power.
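The redundancy labels above translate directly into unit counts. The sketch below (load and module sizes are invented for illustration) shows the standard arithmetic: for a load needing N power units, N+1 adds one spare, 2N duplicates the whole feed, and 2(N+1) duplicates the feed plus its spare.

```python
import math

def units_required(load_kw, unit_kw, scheme):
    """Number of power units (e.g. UPS modules) needed under a
    given redundancy scheme."""
    n = math.ceil(load_kw / unit_kw)  # minimum units to carry the load
    return {"N": n, "N+1": n + 1, "2N": 2 * n, "2(N+1)": 2 * (n + 1)}[scheme]

# Example: a 450 kW critical load on 200 kW modules needs N = 3 units.
for scheme in ("N", "N+1", "2N", "2(N+1)"):
    print(scheme, units_required(450, 200, scheme))
```

The jump from 4 units at N+1 to 8 at 2(N+1) is why so few respondents run the latter: the capital cost roughly doubles for the extra fault tolerance.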
However, not everyone is as prepared for an outage as they probably should be. Consider this – only 27% of companies received a passing grade for disaster readiness, according to a 2014 survey by the Disaster Recovery Preparedness Council. At the same time, increased dependence on the data center and providers means that outages and downtime are growing costlier over time. Ponemon Institute has released the results of its latest Cost of Data Center Outages study. Previously published in 2010 and 2013, this third edition continues to analyze the cost impact of unplanned data center outages. According to the new study, the average cost of a data center outage has steadily increased from $505,502 in 2010 to $740,357 today per incident.
Throughout their research of 63 data center environments, the study found that:
- The cost of downtime per incident has increased 38 percent since the first study in 2010.
- Downtime costs for the most data center-dependent businesses are rising faster than average.
- Maximum downtime costs increased 32 percent since 2013 and 81 percent since 2010.
- Maximum downtime costs for 2016 are $2,409,991 per incident.
When it comes to your own data center environment – what are you doing to design your IT ecosystem to ensure resiliency and uptime? Most of all, how do you make sure that you’re not hit with a massive outage price tag?
The good news is that there are powerful data center systems which take uptime directly into consideration, and this technology has already been deployed. With tens of thousands of High-Density Outlet Technology (HDOT) PDUs already installed, Server Technology has now completed its popular and innovative product line with the addition of the HDOT Switched POPS (Per Outlet Power Sensing) PDU. This technology includes device-level power monitoring, and the company bills it as the most valuable rack PDU on the market for tackling challenges around density, capacity planning, uptime, and remote power management in the modern data center.
Click here to see the Rack PDU Feature Options that Server Tech offers.
The HDOT Alt-Phase PRO2 rack PDU expands on Server Technology’s modular PDU design allowing custom user configuration. With thousands of configurations possible, the customer is sure to find exactly the right product for their application.
“Integrating power measurement to our HDOT Alt Phase modular product family was a significant engineering challenge which I’m proud to say we accomplished without sacrificing quality or manufacturability,” said Travis Irons, Director of Engineering at Server Technology.
When it comes to uptime – the new HDOT PDU takes this directly into consideration:
- PRO2 is a flexible and feature-rich hardware and firmware platform, with higher onboard compute power, all modern security protocols, redundant features, and advanced customization built into the product.
- The new PRO2 architecture is ideal in any situation where reliability and uptime are important, particularly in high temperature and high-security applications. With PRO2, customers can maintain uptime with access to current data and future trends.
Remember, greater levels of uptime must revolve around technologies which help make the data center easier to manage. To simplify load balancing and cable management, Server Technology offers PDUs with Alternating Phase outlets, which distribute phases on a per-receptacle basis (rather than in discrete separate banks), providing tangible benefits in the form of shorter cable runs, better airflow, easier load balancing, and greater efficiencies. Prior to the advent of HDOT, Alternating Phase products were impractical to build due to the low outlet density inherent in discrete, commercially available outlets.
Power management within the data center is absolutely critical. New PDU solutions from Server Technology expand upon their already innovative power products on the market. These new power management systems can help resolve major data center challenges revolving around density, capacity planning and uptime. Ultimately, easier to manage data centers with greater levels of uptime help keep both your IT environment and your business a lot more resilient. Most of all, with downtime costing more every year, Server Technology PDUs aim to tackle uptime concerns by integrating redundancy and better management; all helping cut down costs and outages.
This article was sponsored by Server Technology.