Data Center Knowledge | News and analysis for the data center industry
Thursday, December 25th, 2014
1:00p
Living On the Edge With Fog Computing
I know, it’s another buzz term. It just feels like we haven’t had one in a while. That being said, something interesting is happening in the cloud world. We’re seeing more users access an ever-larger set of information. Services around streaming, content delivery, and even caching are becoming very popular. But how do you deliver such large workloads efficiently to users located all over the world?
Cisco recently joined forces with Akamai to create a truly powerful distributed computing platform. In fact, Akamai’s network is one of the world’s largest distributed-computing platforms, responsible for serving between 15 and 20 percent of all web traffic. But let’s look beyond this partnership and examine the current user, cloud, and delivery model. The modern data center is becoming the home of everything: we now talk about “Everything-as-a-Service,” and an enormous amount of information is being transferred via the cloud. Add to that the idea of the Internet of Everything, where anything you require can be delivered through the cloud, and the delivery challenge only grows. This is where Fog Computing comes in.
- Bringing the “edge” closer to the user. Content delivery is huge. With so much more emphasis on cloud computing, you know that there is more data being pushed down to the end user. Take a look at this report from Cisco:

[Figure source: Cisco Global Cloud Index, 2014]
We’re already in the zettabyte era: global cloud traffic crossed the zettabyte threshold in 2013, and by 2018 cloud traffic will represent 76 percent, more than three-quarters, of total data center traffic. With that in mind, edge computing strives to reduce the bandwidth we consume and cut the latency of our content. By placing information on the servers closest to the user, we’re able to deliver rich content quickly (a minimal sketch of this idea appears after this list).
- Creating geographical distribution. So much data, so many data points. Information and data analytics are becoming crucial for organizations to understand both their business and the consumer. With edge/Fog computing, big data and analytics work can be done faster and with better results. Complex data engines no longer have to drag large data sets across the WAN; instead, they can query these edge (Fog) systems so that real-time data analytics becomes a reality on a truly distributed scale (see the aggregation sketch after this list).
- Support for mobility and “Everything-as-a-Service.” With so many devices connecting to the cloud and the modern data center, administrators are tasked with creating true efficiency. In building a Fog computing platform, you’re able to improve user performance and better address security and privacy concerns. By controlling data at the edge, Fog computing integrates core cloud services with those of a truly distributed data center platform.
- Adoption is already happening. I bet you can think of a few examples already. Big data platforms love the concept of Fog computing because it accelerates their ability to process data. On the same note, consumer services love the edge as well. Folks like Netflix openly adopt this model, and they’re not the only ones. Companies like Facebook, Twitter, AMD, Adobe, ESPN, Blizzard, and Trend Micro all use Fog computing and edge services to deliver rich content to their users. As more users connect to the cloud and request ever larger, content-heavy data, utilizing the edge for fast delivery will make complete sense.
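To make “bringing the edge closer to the user” concrete, here is a minimal sketch (not any particular CDN’s API) of how an edge delivery layer might behave: pick the lowest-latency edge node for a given user, then serve content cache-aside so only the first request for an object travels back to the origin. The node names, latencies, and content path are invented for illustration.

```python
# Hypothetical edge locations and an origin store; names, latencies, and
# content are illustrative only.
EDGE_NODES = {
    "us-east": {"cache": {}},
    "eu-west": {"cache": {}},
    "ap-south": {"cache": {}},
}

ORIGIN_STORE = {"/video/intro.mp4": b"...video bytes..."}


def pick_edge(measured_latency_ms: dict) -> str:
    """Pick the edge node with the lowest measured latency to this user."""
    return min(measured_latency_ms, key=measured_latency_ms.get)


def fetch(path: str, edge_name: str) -> bytes:
    """Cache-aside lookup: serve from the edge if possible; otherwise pull
    the object from the origin once and keep a copy at the edge."""
    cache = EDGE_NODES[edge_name]["cache"]
    if path in cache:
        return cache[path]              # edge hit: no trip across the WAN
    content = ORIGIN_STORE[path]        # edge miss: one trip to the origin
    cache[path] = content               # later nearby users get the edge copy
    return content


if __name__ == "__main__":
    # Simulated per-user latency probes (ms) to each edge location.
    user_probe = {"us-east": 14, "eu-west": 95, "ap-south": 180}
    nearest = pick_edge(user_probe)
    fetch("/video/intro.mp4", nearest)  # first request fills the edge cache
    fetch("/video/intro.mp4", nearest)  # second request is served at the edge
    print(f"served from edge node: {nearest}")
```

The second request in the example never leaves the edge, which is exactly the bandwidth and latency saving the model promises.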
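In the same spirit, the geographical-distribution point can be sketched as edge-side aggregation: each Fog site reduces its raw readings to a compact summary, and only those summaries cross the WAN to the central analytics engine. The sites and numbers below are made up; the pattern, not the data, is the point.

```python
from dataclasses import dataclass


@dataclass
class Summary:
    """Compact per-site summary that replaces shipping raw records."""
    count: int
    total: float
    minimum: float
    maximum: float


def summarize_locally(readings: list[float]) -> Summary:
    """Runs at the edge: reduce many raw records to a few numbers."""
    return Summary(len(readings), sum(readings), min(readings), max(readings))


def merge_at_core(summaries: list[Summary]) -> dict:
    """Runs centrally: combine per-site summaries instead of raw data."""
    count = sum(s.count for s in summaries)
    total = sum(s.total for s in summaries)
    return {
        "count": count,
        "mean": total / count,
        "min": min(s.minimum for s in summaries),
        "max": max(s.maximum for s in summaries),
    }


if __name__ == "__main__":
    site_a = summarize_locally([21.0, 22.5, 19.8, 23.1])  # edge site A
    site_b = summarize_locally([30.2, 28.7, 31.5])        # edge site B
    print(merge_at_core([site_a, site_b]))
```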
The truth of the matter is that cloud services and user device numbers are only going to increase. As the modern data center becomes even more distributed, organizations will have to find ways to deliver large amounts of data very quickly. Using edge networking and Fog computing can help bring that data much closer to both the organization and the end user.
4:30p
Does Your Data Center Pass the App Test?
Lara Greden is a senior principal, strategy, at CA Technologies.
In today’s application economy, the app is the public face of a successful brand, and behind every great (or not so great) app is a data center. Apps need data. They need APIs. They need patches. They need authentication.
The data center is fundamental to the application economy: it supports apps throughout their lifecycle and provides the compute resources needed for software development.
As organizations ramp up their development efforts and app portfolios, the pressure on the data center will only increase. With many facilities already struggling to meet business demands, IT departments need to ensure their data center is fit for the application economy.
Keeping Your Data Center in Check
Data center infrastructure management (DCIM) enables organizations to give their data centers a health check and keep them in optimum condition. It does so by bringing greater control and visibility to three key areas:
- Availability: Downtime can lose revenue. It can damage reputations. It can derail application programs. Data center downtime has far-reaching consequences, which is why 55 percent of organizations now measure the costs associated with it. IT and facilities teams need to spot the potential for a data center outage before it happens and take faster remedial action when one does occur. In the application economy, this is essential for safeguarding the time-to-market of new offerings and the user experience of existing apps.
- Capacity: With a steady stream of new apps and users, data center capacity can quickly disappear. This invariably results in capital expenditure on new resources, which can undermine the overall ROI of the application economy. That investment, however, is not always justified: untapped capacity is concealed in every data center. Using 3D maps of existing data center assets and ‘what if’ scenarios, organizations can right-size rather than over-size their data center roll-outs, reducing both cost and complexity (a simple what-if check is sketched after this list).
- Efficiency: Although data centers need to do more to support the application economy, they need to do it for less. As density and workloads increase, data center energy consumption can spiral out of control. IT and facilities departments can stop this spiral with tools that provide granular information and metrics about when and where energy is being consumed; one such metric is sketched after this list.
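As a rough illustration of the capacity point, here is a minimal ‘what if’ check against invented rack figures: before approving new capital spend, test whether a planned deployment already fits existing power and space headroom. A real DCIM tool would work from live asset data and 3D models; this sketch only shows the arithmetic.

```python
# Invented rack inventory: power capacity/usage in kW, space in rack units (U).
RACKS = [
    {"name": "A01", "kw_cap": 10.0, "kw_used": 6.5, "u_cap": 42, "u_used": 30},
    {"name": "A02", "kw_cap": 10.0, "kw_used": 3.2, "u_cap": 42, "u_used": 12},
    {"name": "B01", "kw_cap": 8.0,  "kw_used": 7.6, "u_cap": 42, "u_used": 40},
]


def what_if(new_kw: float, new_u: int) -> list[str]:
    """Return the racks whose existing headroom can absorb the planned gear."""
    fits = []
    for rack in RACKS:
        kw_headroom = rack["kw_cap"] - rack["kw_used"]
        u_headroom = rack["u_cap"] - rack["u_used"]
        if kw_headroom >= new_kw and u_headroom >= new_u:
            fits.append(rack["name"])
    return fits


if __name__ == "__main__":
    candidates = what_if(new_kw=3.0, new_u=8)
    if candidates:
        print("existing headroom is enough; candidate racks:", candidates)
    else:
        print("no rack fits; new capacity may actually be justified")
```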
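For the efficiency point, one widely used granular metric is Power Usage Effectiveness (PUE): total facility energy divided by IT equipment energy, where 1.0 is the theoretical ideal. The hourly readings below are invented; the takeaway is that per-interval PUE shows when overhead spikes rather than hiding it in an average.

```python
# Invented hourly energy readings (kWh) for a small facility.
facility_kwh = [120.0, 118.5, 125.2, 130.0]  # total facility energy per hour
it_kwh = [75.0, 74.0, 76.5, 78.0]            # IT equipment energy per hour


def pue(total_facility: float, it_load: float) -> float:
    """PUE = total facility energy / IT equipment energy."""
    return total_facility / it_load


if __name__ == "__main__":
    # Per-interval PUE shows *when* cooling and power overhead spikes.
    for hour, (f, i) in enumerate(zip(facility_kwh, it_kwh)):
        print(f"hour {hour}: PUE = {pue(f, i):.2f}")
    print(f"period PUE = {pue(sum(facility_kwh), sum(it_kwh)):.2f}")
```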
Increased Adoption of DevOps
DCIM can also help organizations make the move to DevOps, which accelerates the delivery of proven, high-quality applications. Almost half of application economy leaders have already adopted DevOps.
The demand to launch new applications at an increased cadence is placing additional pressure on IT and facilities to work together to ensure that the data center infrastructure delivers efficiency, availability and agility. From real-time data collection for timely insight to analytics-backed capacity planning and integrated workflow capabilities, DevOps leaders are finding that DCIM solutions are an essential technology component for enabling the people and process changes needed to facilitate a DevOps culture.
With DevOps and DCIM underpinning development activities and ongoing management, organizations will be better placed to take full advantage of the application economy. And it could prove to be one big advantage.
Application economy leaders are achieving more than double the revenue growth of laggards, 68 percent higher profit growth, and 50 percent more business from new products and services. These numbers speak for themselves.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.