Data Center Knowledge | News and analysis for the data center industry
 

Friday, December 6th, 2013

    1:12p
    The December Cloud Job Update: Big Data, Applications, and Security
    As the cloud landscape evolves, new skills are required to navigate it. (Illustration: Dreamstime, Jeff Gunderman)

    Cloud computing continues to evolve at a near-blistering pace. As more compute and bandwidth resources become available, engineers and cloud architects are able to do more with new, robust platforms.

    Originally, network, storage and compute engineers worked together to build massive cloud components. These IT professionals created the building blocks for today's modern cloud architects and engineers.

    Although we still have dedicated data center professionals overseeing physical infrastructure, the field of cloud disciplines continues to grow. Dedicated cloud experts now focus on specific areas of cloud computing and information delivery. As the cloud continues to evolve, so will the skilled workers who support it. Here’s a look at the trends driving this evolution of cloud jobs:

    • Big Data and Intelligence. There is a growing need for architects and engineers who understand not just big data, but the correlation and quantitative processes behind it. When the buzz around big data first started, many were confused by the concept and dismissed the technology as a marketing term for lots of data traversing the WAN. The reality is very different. With more devices, more connections coming in via the cloud, and more applications delivered through the Internet, there is far more valuable data to analyze. Combined with business intelligence, big data becomes a powerful tool. Big data engines like Hadoop and MongoDB continue to grow in popularity, while solutions like SAS, Oracle Hyperion, and Microsoft BI create direct intelligence around the massive amounts of data a corporation may produce. Moving forward, big data isn’t just an idea about a lot of data. Future cloud engineers and architects who focus their time on controlling this information must understand how to create data logic. That means using solid big data engines to gather information and processing it all via business intelligence tools.
    • The Application Continues to Advance. The application is becoming even more cloud-centric. New delivery models mean new ways to code and optimize applications. Like the applications, the end-user environment is changing as well. We are no longer as worried about the hardware as about how quickly and efficiently we can deliver applications and data. Through this evolution, applications are lighter, easier to deploy, and able to connect on an entirely new level. APIs are eliminating hops and layers that consume precious compute and networking cycles. New technologies are tying applications directly to the resources they require. Architects and engineers who focus on the cloud must understand exactly where the application layer is heading. For example, new types of services allow applications to seek resources well outside their own data center. Using SDKs and various APIs, backend-as-a-service (BaaS) offerings can directly integrate cloud services with both web and mobile applications. Already, open BaaS platforms aim to support every major platform, including iOS, Android, Windows, and BlackBerry. Furthermore, BaaS platforms aim to further enhance the mobile computing experience by integrating with cloud hosting platforms like Azure, Rackspace and EC2. Essentially, these services provide the back-end while you create the front-end app.
    • Cloud, Data, and Workload Security. There needs to be a sanity check here. Cloud security is no joke, and it continues to evolve and move forward. The way applications connect with both data center and cloud resources is changing as well. Common standards now include AES 256-bit encryption and SSL/TLS during transfer and storage of cloud information (a minimal encryption sketch follows this list). Furthermore, enhanced levels of segregation allow even greater control over secure resources. More verticals are looking at data and file sharing options, and more organizations are examining how they can remain compliant while moving workloads into the cloud. For example, in the healthcare world, Citrix’s ShareFile Cloud for Healthcare service provides a few slight modifications over the existing platform that make it suitable for sending, storing, and sharing protected health information (PHI). By signing a business associate agreement (BAA) and thereby becoming a business associate (BA), Citrix takes on extra liability during the transfer of data over its network, which allows ShareFile Cloud for Healthcare to be HIPAA compliant. Another example is how far the hybrid cloud platform has come. Originally, placing regulatory or compliance-driven workloads in a public or even hybrid cloud was seen as very difficult. Now, public cloud providers are creating powerful, secure platforms capable of remaining compliant. Cloud providers like Rackspace are building cloud environments capable of PCI DSS compliance. These platforms create extra flexibility and open the doors for more organizations to move to a cloud infrastructure. Cloud security experts must continuously understand how data operates within the cloud and how it can be secured. Modern cloud security requirements span applications, users, devices, big data, file shares and much more. Of all the fields in the cloud category, security evolves the fastest.
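
    As a concrete illustration of the encryption standards named above, here is a minimal sketch of AES-256-GCM encryption using Node's built-in crypto module (TypeScript). The key handling and function names are illustrative assumptions, not any vendor's actual implementation:

        import { createCipheriv, createDecipheriv, randomBytes } from "crypto";

        // AES-256-GCM: 32-byte key, 12-byte nonce, 16-byte authentication tag.
        // In production the key would come from a managed key store, not randomBytes.
        const key = randomBytes(32);

        function encrypt(plaintext: string) {
          const iv = randomBytes(12); // never reuse a nonce under the same key
          const cipher = createCipheriv("aes-256-gcm", key, iv);
          const ciphertext = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
          return { iv, ciphertext, tag: cipher.getAuthTag() };
        }

        function decrypt(blob: { iv: Buffer; ciphertext: Buffer; tag: Buffer }): string {
          const decipher = createDecipheriv("aes-256-gcm", key, blob.iv);
          decipher.setAuthTag(blob.tag); // decryption throws if the data was tampered with
          return Buffer.concat([decipher.update(blob.ciphertext), decipher.final()]).toString("utf8");
        }

        console.log(decrypt(encrypt("protected health information"))); // round-trips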

    The IT field is a great place to be right now, especially if you’re always trying to learn something new. Cloud computing and the rapid evolution of the infrastructure that supports it have created a technology world where stagnation and complacency have absolutely no ground. IT professionals looking to move ahead must not only understand key cloud and infrastructure technologies, they must also apply their knowledge to the business world. One of the most valued assets in a technical person is the ability to understand business challenges and solve them with intelligent IT solutions. So, as you continue learning all there is about the cloud, never forget the business drivers where cloud and next-generation technologies can help.

    1:30p
    2014 Will Be The Year That ARM Breaks The x86 Monoculture

    Graeme Caldwell works as an inbound marketer for InterWorx, a revolutionary web hosting control panel for hosts who need scalability and reliability.

    GRAEME CALDWELL
    InterWorx

    If you’d asked anyone with even a modicum of tech knowledge at any point during the last four decades to describe a server, they’d have given you the same basic response. They would have described a box containing discrete components, including a processor module, memory, various controllers, and the buses that connect them. That’s been the model on which servers have been built for decades, and it’s a model that has shaped the way data centers are built.

    It’s not a model that is infinitely scalable. We live in a data-centric world. The quantities of data the worldwide data infrastructure has to process, store and transmit are growing rapidly. Faster processing and higher bandwidth have given the world a glimpse of the potential that “big data” has to change our lives. Everything from social media and search engines to disaster planning and the nascent “Internet of Things” will continue to push at the limitations of our available infrastructure, creating an outward pressure that incentivizes the building of ever more and ever larger data centers.

    However, because of the inherent limitations engendered by the x86 architecture, the servers built around that architecture, and the data centers constructed to house and support those servers, there is also an inward pressure that incentivizes a radical change to the way we think about building servers and architecting data centers. Data is currently too expensive to manage, both in terms of infrastructure investment and power consumption. In addition to expanding the number of data centers, we also need to focus on making those data centers as efficient as possible.

    ARM Comes to Market

    Over the last few years, ARM System-on-Chip components have been hailed as part of the solution. ARM SoCs, which take advantage of the power efficiencies originally developed for mobile platforms, have the potential to revolutionize how we think about architecting the data center and designing servers. While the power of each SoC pales in comparison to a full-blown server processor, multiple SoC units can be efficiently clustered into servers that blow away traditional architectures in processing power per watt and compute cores per square meter.

    Many companies, including Calxeda and HP, have released or are currently working on ARM SoC-based server products. Intel, recognizing that future trends are likely to seriously impact its x86 lines, has also been developing its Atom-based Avoton low-power SoCs, which have something of an edge in that they are currently available in 64-bit variants, while the ARM SoCs presently on the market are limited to 32 bits. All that will change next year, though, with the introduction of 64-bit components from Calxeda, HP, and Applied Micro.

    Processor Market is Expanding

    For the first time in a long time, we’ll have a competitive market, with multiple manufacturers designing both general-purpose SoC-based servers and SoCs custom-built to maximize efficiency on particular tasks, like running memcached instances. We can expect the pace of change to accelerate as competing vendors strive to differentiate themselves with ever more efficient and powerful SoC-based products.

    Forward-thinking providers of web hosting services and software should be aware of the coming shift toward low-power ARM-based SoC architecture and begin to make their products available on that platform. InterWorx, the company behind the InterWorx Web Control Panel, in partnership with Calxeda, has already demonstrated the viability of its advanced clustering technology on ARM architecture. As a proof of concept, the company also runs a site on a pair of clustered Raspberry Pis, the ARM-based credit card-sized computer.

    In coming years, the web hosting industry and the wider data center industry are going to experience huge changes as low-powered servers revolutionize the way we think about data center architecture and efficiency.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    2:00p
    Friday Funny: Natural Cooling

    It’s Friday and time for a bit of humor before the work week ends. Diane Alber, the Arizona artist who created Kip and Gary, has a new cartoon for Data Center Knowledge’s cartoon caption contest. We challenge our readers to submit a humorous and clever caption that fits the comedic situation. Please add your entry in the comments below. Then, next week, our readers will vote for the best submission.

    Here are Diane’s thoughts on this week’s cartoon: “I thought Kip and Gary might take ‘natural cooling’ to a new level . . . ”

    Hearty congratulations to Colton Brown for submitting, “Catching the Fail Whale was so much easier!” for our Gobble, Gobble cartoon.

    New to the caption contest? Here’s how it works: We provide the cartoon and you, our readers, submit the captions. We then choose finalists, and readers vote for the funniest suggestion. The winner receives a hard copy print, with his or her caption included in the cartoon!


    For the previous cartoons on DCK, see our Humor Channel.

    2:00p
    Need for Speed: How Groupon Migrated to Node.js


    SAN FRANCISCO - Because it’s still a relatively young company, Groupon might seem like a place where application development happens fast. But engineers weren’t able to move quickly enough, prompting the company to make a major architectural move over the past year.

    “To change one color throughout the entire Groupon.com webpage, that was estimated to take three months to do,” Groupon engineer Sean McCullough said at the Node Summit on Tuesday. “We weren’t able to iterate at the speed that we needed to.”

    Engineers looked at a slew of alternatives to the Ruby on Rails web application framework that Groupon had used since its inception in 2008. They checked out PHP and the Twisted framework written in Python, among other options. They selected the popular Node.js server-side JavaScript runtime. It can scale fairly easily, and lots of developers are already familiar with JavaScript, McCullough said.
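
    Node’s appeal for a high-traffic site comes from its non-blocking, event-driven model: a single process serves many concurrent requests while each waits on backend I/O. The following is a minimal sketch of that model, not Groupon’s actual code; fetchDeals is a hypothetical stand-in for a backend call:

        import http from "http";

        // Hypothetical non-blocking backend call; it resolves via the event
        // loop's timer queue instead of blocking the process.
        function fetchDeals(): Promise<string[]> {
          return new Promise((resolve) => setTimeout(() => resolve(["deal-1", "deal-2"]), 10));
        }

        // One single-threaded event loop serves every request; while one
        // response awaits fetchDeals, the server keeps handling others.
        const server = http.createServer(async (_req, res) => {
          const deals = await fetchDeals();
          res.writeHead(200, { "Content-Type": "application/json" });
          res.end(JSON.stringify(deals));
        });

        server.listen(8080, () => console.log("listening on :8080"));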

    But getting Node.js running in production wasn’t a straight shot.

    After developers hacked together their first application in Node – an email form – load tests showed high memory utilization. And during tests, the site went down for two hours. After the testing stopped, the outage ended, too, McCullough said. Groupon traced the problem to a load balancer that was handling test and development systems as well as production systems. That configuration should never have happened in the first place, but the Node testing highlighted it and sparked an outage.

    The incident proved that issues certainly can come up during such tinkering. “We never had to worry about the outgoing connection pool in Ruby,” McCullough said. “That was a new problem for us.”
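
    The outgoing connection pool McCullough mentions corresponds to Node’s http.Agent, which pools outbound sockets per host. Here is a hedged sketch of tuning one; the limits shown are illustrative, not Groupon’s settings:

        import http from "http";

        // Node pools outbound sockets per Agent. Left unbounded or shared
        // carelessly, the pool becomes exactly the kind of new operational
        // concern described above.
        const agent = new http.Agent({
          keepAlive: true, // reuse sockets instead of reconnecting per request
          maxSockets: 50,  // cap concurrent outbound connections per host (illustrative)
        });

        http.get({ host: "example.com", path: "/", agent }, (res) => {
          res.resume(); // drain the response so the socket returns to the pool
        });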

    Once the team had overcome the obstacles with its first Node application and put it into production, the time came to create another one. This time, it was a more complex page that contained current deals and a user authentication feature. And it got about 10 times as much traffic as the first Node-based page. The plan was to launch the new version in about three months, but the move took twice as long.

    Despite the early hiccups, “at some point we were like, ‘I think this Node thing is going to work,’” McCullough said. The company directed developers to stop building on its longstanding Ruby on Rails code base and port features over to Node. Now almost all web traffic hits the new platform.

    “We’re able to serve much higher traffic,” McCullough said. Before the change to Node, a Starbucks deal was so popular that it brought the site down. “The next time, that didn’t happen,” McCullough said. On top of that, he said, pages now take less time to load for end users.

    And indeed, it’s become easier for developers to tack on new features now, according to a Groupon engineering blog post on the move.

    Other companies have run across benefits of using Node.js, and now that Joyent has announced commercial support for it, more companies could feel compelled to follow Groupon’s lead.

    2:30p
    Intel Launches Communications Platform for Network Transformation

    Highland Forest is Intel’s new platform for network communications.

    Intel (INTC) this week launched Highland Forest, a communications platform that will drive the chipmaker’s ambitions for network transformation in software defined networking and network functions virtualization.

    After launching the Crystal Forest platform in 2012, Intel sees tremendous growth opportunities in the networking silicon market, including silicon for CPUs, ASICs, and FPGAs. Innovation in accelerator and chipset technologies has resulted in a 2-6x performance improvement between generations.

    The Highland Forest platform will boost network performance to up to 255 million packets per second, according to Intel, which is powering the platform with its Xeon E5-2600 v2 processors and the next generation of the Intel Communications Chipset Series 89xx (Coleto Creek).

    Highlighting the performance gains, Intel GM and VP of the Data Center group Rose Schooler says the new platform scales up to 255 Mpps of L3 forwarding (64-byte packets), 110 Gbps of IPsec throughput, 200 Gbps of OpenSSL throughput, and 140 Gbps of deep packet inspection (DPI) throughput. Highland Forest will also allow telecom customers to consolidate workloads.
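
    To put the headline figure in context, here is a back-of-the-envelope conversion of 255 Mpps at minimum frame size into wire-level bandwidth, assuming standard Ethernet per-frame overhead (8-byte preamble plus 12-byte inter-frame gap); the calculation is ours, not Intel’s:

        // What 255 Mpps of 64-byte L3 forwarding means on the wire.
        const pps = 255e6;   // packets per second (Intel's quoted figure)
        const frame = 64;    // bytes in a minimum-size Ethernet frame
        const overhead = 20; // preamble (8) + inter-frame gap (12) bytes per frame
        const gbps = (pps * (frame + overhead) * 8) / 1e9;
        console.log(`${gbps.toFixed(1)} Gbps line rate`); // ≈ 171.4 Gbps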

    Intel’s special-purpose hardware accelerators are integrated in a family of pin-compatible server chipsets, the Intel Communications Chipset 89xx Series, and in the Intel Atom processor C2758, which is ideal for entry-level network equipment. A simple API invokes the hardware-based compression and cryptography acceleration supported by Intel QuickAssist Technology, using Intel-developed or open source framework patches.

    The new communications platform enables organizations to scale up and down using a common architecture and programming model under Highland Forest. To foster collaboration within its Network Builders ecosystem, Intel seeks to increase ecosystem alignment, leading to the design of world-class solutions from partners like Akamai, Arista, Brocade, Dell, F5, HP, Quanta, VMware, and many others.

    3:30p
    Box Raises $100 Million for Expansion

    In big data news this week, Box gets $100 million to aid global expansion, the University of Florida selects DDN for converged infrastructure, HP helps Norfolk County harness big data, and SAP announces a new rapid-deployment solution for near-line storage.

    Box continues expansion with $100 million investment. Box announced new strategic partnerships, including a $100 million investment, for international expansion. The new relationships include commercial agreements and strategic investments from Japanese partners Itochu Technology Ventures, Macnica, and Mitsui USA and MKI. The enterprise file sync and share platform enjoyed many success stories throughout 2013, with customers such as Schneider Electric, Toyota Motor Sales USA, Rosetta Stone, and eBay. Since first opening a London office in June 2012, Box has also opened offices in Munich and Paris, while adding employees throughout Europe to address growing demand for partnerships and sales in the Nordics, Benelux, Spain and Italy. “The combination of cloud and mobile technologies creates an entirely new way of working that will fundamentally reshape the IT industry,” said Aaron Levie, co-founder and CEO, Box. “Our new partners will help us connect and work with businesses in key global markets as they manage this transition.”

    University of Florida selects DDN. DataDirect Networks (DDN) announced that it has been selected by the University of Florida’s Interdisciplinary Center for Biotechnology Research (ICBR) as the foundation for a converged infrastructure designed to accommodate ever-increasing life sciences and bioinformatics workloads. To tame its big data growth within the constraints of limited data center space, limited funding and a lean administrative team, ICBR turned to DDN’s appliance-based approach to converged infrastructure, which allowed it to reduce its storage and application server footprint by a factor of 3.5. “We now have the opportunity to build an authoritative, immutable data warehouse that provides a safe harbor in the middle of the ‘Wild Wild West’ of scientific research,” said Aaron Gardner, Cyberinfrastructure Section Director at ICBR. “Based on that, as well as the ability to increase performance with a smaller footprint, lower hardware costs, and lower management overhead and latency, I am confident we are headed in the right direction.”

    HP selected by Norfolk County Council to harness big data. HP (HPQ) Enterprise Services announced it is working with Norfolk County Council (NCC) on a groundbreaking new initiative to boost the local economy, solve social problems and safeguard vulnerable people while saving costs. The two organizations will create a cloud-based information hub to transform the delivery of integrated public services in Norfolk, driving efficiencies through smart use of technology and multi-agency collaboration. Based on HP Autonomy IDOL, HP RM, HP Vertica Analytics Platform, Visionware and Microsoft Windows 8.1 and Office 365 software, NCC’s new information hub will be integrated through HP Enterprise Services Information Management and Analytics Advisory services. “Our vision is to deliver world-class integrated public services that stimulate and support a sustainable knowledge economy in Norfolk,” said Tom Baker, chief information officer, Norfolk County Council. “HP will contribute to the economic, social and environmental sustainability of the County by enabling multiple agencies to effectively participate in joint service delivery. The creation of a platform for more joined-up collaboration between NCC and other partners will make it possible to get a single view. In addition, HP will enable us to make cost savings of about 20 percent.”

    SAP NetWeaver helps reduce the strain of big data. SAP announced a new rapid-deployment solution that enables transparent access to historical data for reporting and querying. The SAP NetWeaver BW Near-Line Storage rapid-deployment solution enables seamless data transfer between the business warehouse (BW) and the near-line storage that holds important historical data. Powered by the SAP HANA platform, the solution can be up and running in as little as 12 weeks. “With the unprecedented growth and sheer volume of data companies receive today, executives must decide where to house it all in order to keep costs down while keeping it close for reporting purposes,” said Dr. Bernd Welz, executive vice president and global head, Solution & Knowledge Packaging, SAP. “By implementing the SAP NetWeaver BW Near-Line Storage rapid-deployment solution, customers can experience faster online querying through a reduced amount of data in the business warehouse and its proximity to SAP Sybase IQ. Businesses can now work with archived data on an as-needed basis with the same agility they have come to expect with live data.”

