AnandTech's Journal
 

Friday, January 22nd, 2016

    8:10a
    The Apple iPad Pro Review

    At this point it probably isn’t a secret that tablet sales have leveled off, and in some cases they have declined. Pretty much anywhere you care to look you’ll see evidence that the tablet market just isn’t as strong as it once was. It’s undeniable that touch-only tablets have utility, but it seems that the broader market has been rather lukewarm about them. I suspect at least part of the problem here is that the rise of the phablet has supplanted small tablets. Large tablets are nice to have, but almost feel like a luxury good when they’re about as portable as an ultrabook. While a compact laptop can’t easily be used while standing, or in any number of other situations where a tablet is going to be better, it can do pretty much anything a touch-only tablet can, and it is going to be clearly superior in a significant number of cases, such as typing or precise pointing.

    This brings us to the iPad Pro. This is probably the first time Apple has seriously deviated from traditional iPad launches, putting together a tablet built for (limited) productivity and content creation rather than just simple content consumption, creating what's arguably the iPad answer to the Surface Pro. To accomplish this, Apple has increased the display size to something closer to that of a laptop, and we see the addition of a stylus and a keyboard cover for additional precision inputs. Of course, under the hood there have been a lot of changes as well. Read on for the full review of the Apple iPad Pro.

    9:01a
    Interview with Ian Livingstone CBE: Gaming in VR and Development in the UK

    This week I decided at the last minute to attend PG Connects, a trade show conference on mobile gaming, attended by developers and businesses looking to promote or sell their games and services. As part of the conference, several presentation tracks relating to mobile gaming, such as promotion, media interaction and ‘tales of the industry’, were included to help educate the (mostly young) developers present. There were also a few of the old guard of the UK games industry presenting, and I jumped at the opportunity to speak to Ian Livingstone for a quick fifteen minutes.

    Ian Livingstone is a well-known figure, particularly in the UK, for the many roles he has played in developing the sector, from text and table-top based imagination gaming right the way through to full-on graphical immersion.

    - Ian started in 1975 by co-founding Games Workshop, the miniature wargaming company that quickly spread as a haven for Dungeons & Dragons and Warhammer enthusiasts to buy supplies to build battlefields, paint figurines, or teach newcomers. As part of this, Games Workshop brought the official original D&D to the UK.

    - Ian is also the co-founder and co-writer of the Fighting Fantasy series of RPG novels, part of the Choose Your Own Adventure style of story-telling. These were the ‘to turn left, go to page 72’ sort of dungeon crawlers that would advance the narrative but still leave the important decisions to the reader. I have fond memories of these books.

    - On the videogame side, Ian is the former Life President of Eidos Interactive, originally investing and doing design work for publisher Domark before it was acquired by Eidos. Part of Ian’s role involved securing the popular Eidos franchises and IP, such as Tomb Raider and Hitman, as the industry evolved. Eidos was acquired by Square Enix in 2009, and since then Ian has been a champion of the UK games industry. In 2011, he was tasked by the UK government to produce a report reviewing the UK video games industry, described as ‘a complete bottom up review of the whole education system relating to games’. Ian’s current interests, aside from promoting the strength of UK gaming, involve investing in talent for the gaming industry and the future.

    - In recognition of his work, Ian was appointed an OBE and later a CBE for services to the gaming industry, won the BAFTA Interactive Special Award and Fellowship as well as a British Inspiration Award, and holds an Honorary Doctorate of Technology from the University of Abertay, Dundee.

    Virtual Reality

    Ian Cutress: What are your thoughts on VR (Virtual Reality)?

    Ian Livingstone: Technology evolves in the gaming industry like no other entertainment industry. There’s always a new platform that comes along that gets people very excited when it comes to leveraging their content to new areas, new technologies and new audiences. Of course VR is causing that excitement right now. We have seen in previous years, and not too long ago, places like Facebook become a great platform for commercial games, and mobile became an amazing platform for people who didn’t even think of themselves as gamers. It became a mass market entertainment industry because Apple came along with swipe technology and then everyone was able to play a game. People were no longer intimidated by sixteen-button controllers, which were the realm of console gamers. So video games become a mass market when they are intuitive - if people don’t have to learn any particular rules or even learn how to play. Therefore I would hope that VR, at the starting point, is a mass market entertainment device in allowing people to play intuitively.

    Now clearly Mark (Zuckerberg) didn’t buy Oculus merely as a games platform – he sees it as an immersive social platform that will include games but is going to be much wider in scope. But from a games point of view it is a fantastic opportunity yet again, allowing people to have experiences they couldn’t have without it. My worry is that there is going to be too much content on a device that is going to be too expensive at launch.

    IC: So your thoughts on $600 for Oculus?

    IL: It’s a lot. In many ways it is a peripheral, and peripherals have never been hugely successful unless they became the technology of the day. A peripheral-based idea like Guitar Hero was hugely successful, and people were prepared to pay a lot of money for a single-trick device. Clearly VR gives you the scope to play many games on the device, but in the short term developers are more likely to get revenue from the hardware manufacturers than from consumers. It is a strange launch point: people are wary of VR, they are not used to having a device around their head for more than five minutes when playing games, and any sort of acceleration makes some people feel a bit queasy from motion sickness. I think there’s a huge amount of excitement, a huge amount of opportunity, but it’s not going to be a slam dunk. I think there’s going to be a lot of people who don’t succeed but there’s going to be some fantastic success stories.

    IC: When you say succeed, are you speaking more about hardware or software?

    IL: On the software side. I mean everyone seems to be creating some sort of VR opportunity today and the consumers can’t possibly digest it all. I’m just caveating the excitement behind VR with a little bit of realism! This is quite a change in games.

    IC: What price would a headset have to be to become more widely accepted?

    IL: One of the issues is that you can buy a console for less!

    IC: So does a VR headset have to be an integrated gaming system on its own, or does it have to reduce down?

    IL: I would think it has to reduce down to that $150 mark. At $600 it can’t be a mass market proposition today. But as we know, technology always starts off expensive – the early adopters are going to buy it no matter what the price, and over time the market will sort out what price it should be in order for it to be successful. But in many ways, hardware is a tough business to be in. I mean Sega pulled out of hardware, Nintendo has had its highs and its lows in hardware. It’s a tough business, and by comparison software is a lot easier.

    IC: How many of the headsets have you tried personally? Any favourites?

    IL: I’ve tried three, but I don’t feel qualified to comment on any in particular! I’ve enjoyed the experience if there’s no acceleration involved because I do feel a little bit queasy. Apart from games I have toured the Serengeti and climbed a couple of mountains, and that has been fantastic. I’ve sat in a cockpit of a plane too.

    IC: Today in your talk you mentioned that the App Store and Google Play were essentially the world’s largest shops with the smallest shop windows, referring to the top lists where everyone is trying to game the system. Is there anything that could be done to improve it? Is this even a problem?

    IL: I think everyone is tired of seeing the same top ten! Users want to know more, so the App Store has to give a way for greater discoverability for great games that aren’t being seen. That is easier said than done, and there isn’t a single answer. But I know it would be welcomed by consumers and creators alike.

    The UK and Gaming Education

    IC: What makes the UK a good place to make games? We’ve seen other regional industries dissolve but the UK is still strong.

    IL: We have a rich heritage of making games, and got off to a flying start in the 1980s when kids were coding in schools – plus we are a naturally creative nation with our film, our fashion, our music, architecture, design, our publishing and now of course our games industry. We have that ability to create entertainment that resonates with global audiences and most of our content is admired around the world. We have that ability to create unique entertainment – it’s a magic fairy dust that makes you come back time and time again and we punch way above our weight in content creation. So combine creativity with the early adoption of technology and hey presto: video games!

    IC: Are there any video games made in the UK that you feel don’t get that ‘made in UK’ recognition?

    IL: There are many cases of games that people would not know originated in the UK. Grand Theft Auto V, developed in Scotland by Rockstar North, is the biggest entertainment franchise in any medium, yet it is not always known that it was developed in the UK. There’s the success of companies like Jagex with Runescape, or the fact that Tomb Raider was originally developed in the UK. Games like Football Manager probably have been mostly acknowledged as being from the UK! But there are companies like Creative Assembly with their Total War series, or Moshi Monsters, CSR Racing. There’s a huge list of content and new successes – Batman from Rocksteady for example. The list is seemingly endless, but most people assume that video games are developed in the United States or Japan, so they don’t get recognized as being from the UK, plus we’re not very good at blowing our own trumpet! We don’t shout about our successes. That’s why I always try to get the message out to media, to parents and to investors that we are very good at making games, it’s a great British success story, it’s a proper job and it’s a real investment opportunity – so go for it.


    Ian Livingstone's TEDxZurich talk on 'The Power of Play'

    IC: You’ve been working with the UK Government on a number of projects for the gaming industry. Can you talk about what you’ve done in this field in recent years?

    IL: I’m delighted with the way the UK Government is now very supportive of the video games industry here. I’ve worked a lot with Ed Vaizey, the Culture Minister, on a number of projects. I was chair of the Computer Games Skills Council for Creative Skillset for seven years and we mapped out every university course with the word ‘games’ in its title. Out of the 144 courses, we only felt able to accredit ten of them as being fit for purpose to earn the Creative Skillset Kite at the time.

    As an industry we’re struggling to find enough computer programmers of a high enough quality for some of the games in development. It was crazy that in the early days we had so many young people unemployed while we, a nation so good at making games and programming, had to outsource production overseas. There is also the fact that a lot of our (UK) companies had to be bought out because they couldn’t access finance; the investment community didn’t understand the value of digital intellectual property or the ability to scale great games very profitably and globally.

    So the government tasked Alex Hope (the Managing Director of Double Negative, a major UK video effects studio) and me to write a review called Next Gen, which was published by Nesta, and we made twenty recommendations about education and additional education (for the skills related to the gaming industry). We found that the IT taught in schools was largely a strange hybrid of office skills. Kids were being bored to death with Word, PowerPoint and Excel. Against all odds we were actually putting them off technology while they ran their lives through social media, using a phone as almost a part of their brain. Effectively ICT was teaching kids how to read but not how to write. They could use an application but not make an application. They could play a game but not make a game. What we wanted to do was turn them from consumers into creators of technology, so our number one recommendation in Next Gen was to put Computer Science as an essential discipline on the national curriculum. Next Gen came out in 2011, and the Department for Education at first said they weren’t interested in our recommendations and that ICT was perfectly fine. It might have been fine for what it was, but it was outdated, outmoded and absolutely no good for the 21st century skills required.

    So we started the Next Gen Skills Coalition, backed by UKIE, the trade body for UK Interactive Entertainment, campaigning and giving talks – being mad campaigners for about four years – until we finally got to meet Michael Gove’s (the Education Minister at the time) special advisors. Eric Schmidt (current Executive Chairman of Alphabet, formerly Google) also referenced Next Gen in his MacTaggart Lecture in 2011. We finally got to meet Michael Gove himself, and to his credit, while he isn’t always Mr. Popular when it comes to further education, he did take on board our recommendations and said he would change the curriculum. 2014 saw the new curriculum come to English schools, so now every child can have the opportunity to learn how to code and, more importantly, how to think computationally and problem solve – gaining better skills for the 21st century and for jobs that don’t yet exist rather than training for jobs that will no longer exist. So we’re moving from the passenger seat to the driver’s seat in technology, and hopefully the UK might be able to create the next Google, Facebook or Twitter, as well as its games.

    There are a lot more university courses now accredited aside from those initial ten, but the important thing was changing the curriculum in schools, moving away from entry level digital literacy to a much higher set of skills. Not everyone is going to become a coder or a programmer but they should understand how code works to be a true digital citizen. You have to understand its place, so I think digital literacy is as important as literacy and numeracy for the 21st century and you could argue that computer science is the new Latin because it underpins the digital world in the way that Latin underpins the analogue world. So we have to think about digital creativity and to make things interesting – get kids to build an app, make a game, build a website, do some robotics and to learn by doing in order to create.

    I think games are also misunderstood as a medium. You can park your prejudice against one or two titles and think about what is happening when you play the game – you problem solve, you learn intuitively, you’re in a fair and safe environment, you’re almost incentivised to try again, you’re not punished for your mistakes and it enables creativity. Like Minecraft where you are building these wonderful 3D architectural worlds like digital Lego and sharing them with your friends. For me games are a wonderful learning tool, and why can’t learning be fun and playful – there’s no reason not to be.

    The second thing with Ed Vaizey is that he did understand the need for access to finance and helped bring about the introduction of tax credits, because film and TV had already had that access while the games industry never had any help. There’s no BFI (British Film Institute) or Film Council equivalent, and there were certainly no tax incentives. So now we’ve got production tax credits and we can build games that would not ordinarily have been built, whether in a cultural sense or an economic sense.

    IC: When you see somebody that has a good idea for a game or for content, what is the barrier to production (talent, financial, etc.)?

    IL: All of the above!

    IC: Are there any current bottlenecks?

    IL: The best thing to do is to make a game, learn from your mistakes, and then make another game. Fail fast. There’s no point in saying you had an idea for a game – having the idea is very easy and we can all say that. You have to find out if you’re up to doing it. But don’t be put off by failure – failure is just success in progress. Angry Birds was Rovio’s 51st game, not their first game. So you have to have some real passion and follow your heart. Hopefully one day you will find an audience and find a way.

    IC: What are your current projects?

    IL: I’m currently applying to open a free school (a non-profit, state-funded school that is not run by the state, similar to an academy and subject to the same rules as state schools). Its aim is to be the flagship school for all the things I’ve been campaigning for: more creativity in the classroom, more computer science, more computational thinking, more project-based work and more learning by doing, using games as a cross-disciplinary approach to problem solving rather than rote learning of siloed subjects. It will have greater engagement and greater traction with kids because Generation Z is different. They naturally collaborate, they naturally share, and collaboration shouldn’t be seen as cheating because it’s what we do in the workplace. So let’s work with that and bring the workplace closer to the classroom and vice versa.

    Many thanks to Ian for his time at PG Connects, and best of luck in his future endeavours. Hopefully in a few years we can loop back and get his opinion again on how the industry is changing.

    Relevant Links

    Ian Livingstone's Twitter: https://twitter.com/ian_livingstone
    Next Gen Report: http://www.nesta.org.uk/publications/next-gen

    The Power of Play, Ian Livingstone's TEDxZurich talk: https://www.youtube.com/watch?v=58P8JU5p_Z4
    How British Video Games Became a Billion Pound Industry (BBC): http://www.bbc.co.uk/timelines/zt23gk7
    Eric Schmidt’s MacTaggart Lecture 2011: https://www.youtube.com/watch?v=hSzEFsfc9Ao
    Creative Skillset: http://creativeskillset.org/
    Free School Application: http://www.bbc.co.uk/news/technology-29550486

    12:00p
    Logitech Formally Exits OEM Mouse Market

    In a bit of news that is a sign of the times, this week Logitech announced that it had completed its exit from the OEM mouse business. The company no longer sells OEM mice, which for a long time accounted for a large portion of Logitech’s revenue. Instead the company will continue to focus on new categories of premium products for retail markets.

    Logitech was among the first companies to mass-produce computer mice back in the eighties. For decades, its mice were supplied with PCs made by various manufacturers, and for a long time Logitech’s brand was synonymous with pointing devices. In fact, Logitech’s U96 is among the world’s most famous optical mice, since it was bundled with millions of PCs. However, a lot has changed for Logitech in recent years. As sales of desktop PCs began to stagnate in the mid-2000s and the competition intensified, OEM margins dropped sharply. At some point, the OEM business ceased to make sense for Logitech: there was no growth and profitability was minimal.

    Last March the company announced plans to stop selling OEM devices, and in December Logitech made its final shipments, entirely depleting its inventory. Sales of OEM hardware accounted for about 4.45% of the company’s revenue in Q3 FY2016, which ended on December 31, 2015. Due to razor-thin margins, Logitech’s OEM business was not exactly something that could be sold for a lot, according to the company. Moreover, it did not make a lot of sense for Logitech to sell it and license the brand to a third party.

    Logitech has been expanding its product portfolio for many years now, and while mice, trackballs and keyboards remain three key types of products for the company, they no longer account for the lion’s share of Logitech’s revenue. The manufacturer recognizes gaming gear (which includes mice, keyboards, speakers, headsets, controllers and other devices), mobile speakers, video collaboration, as well as tablet and other accessories, as its key growth categories. Net sales of Logitech's growth category products totaled $224.87 million in Q3 FY2016, net sales of traditional devices totaled $368.87 million, whereas the OEM business brought in only $26.512 million in revenue. The lack of OEM mice in Logitech's portfolio will be offset by growing sales of other products.

    Ultimately, even though Logitech stopped selling cheap mice to PC makers, it remains one of the world’s largest suppliers of pointing devices and keyboards, and many premium personal computers still come equipped with the company’s advanced keyboards and mice designed for gamers. These days the company has also taken on a more well-rounded portfolio, with significant presences in speakers, PC headsets, webcams, remotes and other devices.

    3:00p
    GDDR5X Standard Finalized by JEDEC: New Graphics Memory up to 14 Gbps

    In Q4 2015, JEDEC (a major semiconductor engineering trade organization that sets standards for dynamic random access memory, or DRAM) finalized the GDDR5X specification, with accompanying white papers. This is the memory specification expected to be used for next-generation graphics cards and other devices. The new technology is designed to improve the bandwidth available to high-performance graphics processing units without fundamentally changing the memory architecture of graphics cards or the memory technology itself, similar to previous generations of GDDR, although the new specification is arguably pushing the physical limits of the technology and hardware in its current form.

    The GDDR5X SGRAM (synchronous graphics random access memory) standard is based on the GDDR5 technology introduced in 2007 and first used in 2008. The GDDR5X standard brings three key improvements to the well-established GDDR5: it increases data rates by up to a factor of two, it improves the energy efficiency of high-end memory, and it defines new capacities of memory chips to enable denser memory configurations on add-in graphics boards and other devices. What is very important for chip developers and makers of graphics cards is that GDDR5X should not require drastic changes to graphics card designs, and the general feature set of GDDR5 remains unchanged (hence why it is not being called GDDR6).

    Performance Improvements

    Nowadays, highly binned GDDR5 memory chips can operate at 7 Gbps to 8 Gbps data rates. While it is possible to increase the performance of the GDDR5 interface for command, address and data in general, according to Micron Technology, one of the key designers of GDDR5X, there are limitations when it comes to array speed and command/address protocols. In a bid to improve the performance of GDDR5 memory, engineers had to change the internal architecture of the memory chips significantly.

    The key improvement of the GDDR5X standard over its predecessor is its all-new 16n prefetch architecture, which enables up to 512 bits (64 bytes) per array read or write access. By contrast, GDDR5 features an 8n prefetch architecture and can read or write up to 256 bits (32 bytes) of data per access. The doubled prefetch and increased data transfer rates are expected to double the effective memory bandwidth of GDDR5X sub-systems. However, the actual performance of graphics cards will depend not just on DRAM architecture and frequencies, but also on memory controllers and applications, so we will need to test actual hardware to find out the real-world benefits of the new memory.
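    As a back-of-the-envelope sketch of why the prefetch depth matters (the 750 MHz array rate below is our illustrative assumption, not a figure from the standard), the per-pin data rate is the product of the prefetch depth and the internal array access rate, so doubling the prefetch from 8n to 16n doubles the pin rate without requiring faster memory arrays:

```python
# Per-pin data rate = prefetch depth x internal array access rate.
# Doubling the prefetch (8n -> 16n) doubles the pin rate at the same array speed.

def pin_rate_gbps(prefetch: int, array_mhz: float) -> float:
    """Per-pin data rate in Gbps for a given prefetch depth and array rate."""
    return prefetch * array_mhz / 1000.0  # Mbit/s -> Gbit/s

def bytes_per_access(prefetch: int, io_width_bits: int = 32) -> int:
    """Bytes moved per array access through a 32-bit chip interface."""
    return prefetch * io_width_bits // 8

# Same hypothetical 750 MHz array rate for both generations:
print(pin_rate_gbps(8, 750))    # GDDR5, 8n prefetch  -> 6.0 Gbps
print(pin_rate_gbps(16, 750))   # GDDR5X, 16n prefetch -> 12.0 Gbps
print(bytes_per_access(16))     # 64 bytes per access, as in the text
```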

    Just like its predecessor, GDDR5X functions with two different clock types - a differential command clock (CK) to which address and command inputs are referenced, and a forwarded differential write clock (WCK) to which read and write data are referenced. WCK runs at a frequency two times higher than CK. Data can be transmitted at double data rate (DDR) or quad data rate (QDR) relative to the differential write clock (WCK), depending on whether the 8n prefetch or 16n prefetch architecture and protocols are used. Accordingly, if chip makers manage to increase the CK clock to 1.5 GHz, the data rate in QDR/16n mode will rise to 12 Gbps.
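    The clock relationships above reduce to simple arithmetic; the sketch below is only an illustration of the article's numbers, not code from any specification:

```python
# WCK runs at twice CK; data toggles at 2x WCK (DDR, 8n prefetch)
# or at 4x WCK (QDR, 16n prefetch).

def data_rate_gbps(ck_ghz: float, qdr: bool) -> float:
    """Per-pin data rate in Gbps derived from the command clock (CK)."""
    wck_ghz = 2.0 * ck_ghz                 # forwarded differential write clock
    bits_per_wck_cycle = 4 if qdr else 2   # QDR vs DDR signalling
    return wck_ghz * bits_per_wck_cycle

print(data_rate_gbps(1.5, qdr=True))    # 12.0 Gbps, the example from the text
print(data_rate_gbps(1.75, qdr=False))  # 7.0 Gbps, typical high-end GDDR5
```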

    Since the GDDR5X protocol and interface training sequence are similar to those of GDDR5, it should be relatively easy for chip developers to adapt their memory controllers to the new type of memory. However, since the QDR mode (called Ultra High Speed mode in Micron’s materials) mandates the use of PLLs/DLLs (phase-locked loops, delay-locked loops), there will be certain changes to the design of high-end memory chips.

    JEDEC’s GDDR5X SGRAM announcement discusses data rates from 10 to 14 Gbps, but Micron believes that eventually they could be increased to 16 Gbps. It is hard to say whether commercial chips will actually hit such data rates, keeping in mind that new types of memory are incoming. However, even a 256-bit GDDR5X memory sub-system running at 14 Gbps could provide up to 448 GB/s of memory bandwidth, just 12.5% lower than that of AMD’s Radeon R9 Fury X (which uses first-gen HBM).
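    The bandwidth figures quoted here follow directly from bus width and per-pin rate; as a quick sanity check (simple arithmetic, not vendor data):

```python
# Total bandwidth (GB/s) = bus width (bits) x per-pin rate (Gbps) / 8 bits-per-byte.

def total_bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak memory bandwidth of a sub-system in GB/s."""
    return bus_width_bits * pin_rate_gbps / 8.0

print(total_bandwidth_gbs(256, 14))   # 448.0 GB/s: theoretical 256-bit GDDR5X
print(total_bandwidth_gbs(4096, 1))   # 512.0 GB/s: R9 Fury X, first-gen HBM
print(total_bandwidth_gbs(128, 14))   # 224.0 GB/s: 128-bit GDDR5X sub-system
```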

    GPU Memory Math

                          AMD        NVIDIA       NVIDIA    AMD         Samsung        Theoretical      Theoretical
                          Radeon     GeForce      GeForce   Radeon      4-Stack HBM2   GDDR5X 256-bit   GDDR5X 128-bit
                          R9 290X    GTX 980 Ti   GTX 960   R9 Fury X   (8 Gb DRAM)    sub-system       sub-system
    Total Capacity        4 GB       6 GB         2 GB      4 GB        16 GB          8 GB             4 GB
    B/W Per Pin           5 Gb/s     7 Gb/s       7 Gb/s    1 Gb/s      2 Gb/s         14 Gb/s          14 Gb/s
    Chip/Stack Capacity   2 Gb       4 Gb         4 Gb      1 GB        4 GB           1 GB (8 Gb)      1 GB (8 Gb)
    No. of Chips/Stacks   16         12           4         4           4              8                4
    B/W Per Chip/Stack    20 GB/s    28 GB/s      28 GB/s   128 GB/s    256 GB/s       56 GB/s          56 GB/s
    Bus Width             512-bit    384-bit      128-bit   4096-bit    4096-bit       256-bit          128-bit
    Total B/W             320 GB/s   336 GB/s     112 GB/s  512 GB/s    1 TB/s         448 GB/s         224 GB/s
    Estimated DRAM
    Power Consumption     30 W       31.5 W       10 W      14.6 W      n/a            20 W             10 W

    Capacity Improvements

    Performance was not the only thing the developers of GDDR5X had to address. Many applications require not only high-performance memory, but a lot of high-performance memory. Increased capacities of GDDR5X chips will enable their adoption by a broader set of devices beyond graphics/compute cards, game consoles and network equipment. Initially, one would expect the high-density configurations to be slightly conservative on frequency.

    The GDDR5 standard covered memory chips with 512 Mb, 1 Gb, 2 Gb, 4 Gb and 8 Gb capacities. The GDDR5X standard defines devices with 4 Gb, 6 Gb, 8 Gb, 12 Gb and 16 Gb capacities. Typically, the mainstream DRAM industry doubles chip capacities from one generation to the next for economic and technological reasons. With GDDR5X, however, the industry decided to ratify SGRAM configurations with rather unusual capacities: 6 Gb and 12 Gb.

    The mobile industry already uses LPDDR devices with 3 Gb, 6 Gb and 12 Gb capacities in a bid to maximize the flexibility of memory configurations for portable electronics. It appears that the companies developing standards for graphics DRAM also wanted to capitalize on that flexibility. A GDDR5X chip with 16 Gb capacity made using 20 nm or 16/18 nm process technology would have a rather large die size and thus a high cost, whereas the size and cost of a 12 Gb DRAM IC should be considerably lower, so such a chip could arguably address broader market segments purely on cost.

    Just as with GDDR5, the GDDR5X standard fully supports clamshell mode, which allows two 32-bit memory chips to be driven by one 32-bit memory controller by sharing the address and command bus while reducing each DRAM IC’s I/Os to 16. Such operation has no impact on system bandwidth, but allows doubling the number of memory components per channel. For example, it should be theoretically possible to build a graphics card with 64 GB of GDDR5X using one GPU with a 512-bit memory bus and 32 of the 16 Gb GDDR5X memory chips.
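    The 64 GB example above works out as follows (a sketch of the capacity arithmetic; the function and its name are ours, not from the standard):

```python
# Clamshell mode: two chips share one 32-bit channel (each contributing 16 I/Os),
# doubling the chip count per channel without changing bandwidth.

def max_capacity_gb(bus_width_bits: int, chip_gbit: int, clamshell: bool) -> int:
    """Maximum memory capacity in GB for a given bus width and chip density."""
    channels = bus_width_bits // 32           # one 32-bit controller per channel
    chips = channels * (2 if clamshell else 1)
    return chips * chip_gbit // 8             # Gb per chip -> GB total

print(max_capacity_gb(512, 16, clamshell=True))   # 64 GB from 32 x 16 Gb chips
print(max_capacity_gb(512, 16, clamshell=False))  # 32 GB without clamshell
```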

    Unusual capacities will help GDDR5X better address all market segments, including graphics cards, HPC (high-performance computing), game consoles, network equipment and so on. However, it should be noted that GDDR5X has an extremely potent rival in second-generation HBM, which offers a number of advantages, especially in the high-end segment of the graphics and HPC markets.

    Energy Efficiency

    Power consumption and heat dissipation are two major limiting factors of compute performance nowadays. When developing the GDDR5X standard, the industry implemented a number of ways to keep power consumption of the new graphics DRAM in check.

    The supply and I/O voltages of GDDR5X were decreased from 1.5 V on today’s high-end GDDR5 memory devices to 1.35 V. The reduction of Vdd and Vddq should help cut power consumption of the new memory by up to 10%, which is important for high-performance and mobile devices, where the memory can take a sizable chunk of the available power budget.

    The reduction of the supply and I/O voltages is not the only measure taken to cut the power consumption of the new memory. The GDDR5X standard makes a temperature-sensor-controlled refresh rate a compulsory feature of the technology, something that could help optimize power consumption in certain scenarios. Moreover, there are a number of other features and commands, such as per-bank self refresh, hibernate self refresh, partial array self refresh and others, that were designed to shrink the energy consumption of the new SGRAM.

    Due to the lower voltages and the set of new features, the power consumption of a GDDR5X chip should be lower than that of a GDDR5 chip at the same clock rates. However, at the target data rates of GDDR5X, power consumption of the new memory should be similar to or slightly higher than that of GDDR5, according to Micron. The company says that GDDR5X’s power consumption is 2-2.5 W per DRAM component and 10-30 W per board. Even with similar or slightly higher power consumption than GDDR5, GDDR5X is listed as considerably more energy efficient due to its improved theoretical performance.

    We do not know specifications of next-generation graphics adapters (for desktops and laptops) from AMD and NVIDIA, but if developers of GPUs and DRAMs can actually hit 14 Gb/s data-rates with GDDR5X memory, they will double the bandwidth available to graphics processors vs GDDR5 without significantly increasing power consumption. Eventually, more efficient data-rates and unusual capacities of the GDDR5X could help to actually decrease power consumption of certain memory sub-systems.

    Implementation

    While internally a GDDR5X chip differs from a GDDR5 one, the industry’s transition to GDDR5X is a less radical step than the upcoming transition to HBM (high-bandwidth memory) DRAM. Moreover, even the transition from GDDR3/GDDR4 to GDDR5 years ago was considerably harder than the transition to GDDR5X is going to be in the coming years.

    The GDDR5X-compliant memory chips will come in 190-ball grid array packaging (as compared to the 170-ball packaging used for current GDDR5); thus, they will not be pin-to-pin compatible with existing GDDR5 ICs or with the PCBs of modern graphics cards. But while GDDR5X will require the development of new PCBs and upgrades to memory controllers, everything else works exactly as with GDDR5: the interface signal training features and sequences are the same, error detection is similar, the protocols have a lot of resemblances, and even the existing GDDR5 low- and high-speed modes are supported to enable mainstream and low-power applications. BGA packages are inexpensive, and they need neither the silicon interposers nor the die-stacking techniques that HBM requires.

    Implementation of GDDR5X should not be too expensive from both R&D and production perspectives; at least, this is what Micron implied several months ago when it revealed the first details about the technology.

    Industry Support

    GDDR5X is a JEDEC standard supported by its members. The JEDEC document covering the technology contains vendor IDs for three major DRAM manufacturers: Micron, Samsung and SK Hynix. Identification of the memory producers is needed for controllers to differentiate between various vendors and devices, and listing the memory makers demonstrates that they participated in development, considered features and balloted on them at JEDEC’s meetings, which may indicate their interest in supporting the technology. Unfortunately, the exact plans of each company regarding GDDR5X production are unknown, though we would expect GDDR5X parts to fit between the current GDDR5 high end and anything implementing HBM, or to enable higher memory capacities on lower-end GPUs. Micron plans to start mass production of its GDDR5X memory chips in mid-2016, so we might see actual GDDR5X-based memory sub-systems in less than six months from now.

    NVIDIA, currently the world’s largest supplier of discrete graphics processors, said that as a member of JEDEC it participates in the development of industry standards like GDDR5X. AMD is also a member of JEDEC and usually plays a key role in the development of memory standards. Both companies also employ compression algorithms to alleviate the stress of texture transfers between the GPU and memory, and thus an increase in bandwidth (as shown by Fiji) plus an increase in density can bring benefits in texture-rich or memory-bound compute scenarios.

    While the specific plans of various companies regarding GDDR5X are unclear, the technology has great potential if the numbers are accurate (and as a ratified standard, they should be) and every chance of being adopted by the industry. The main rival of GDDR5X, second-generation HBM, can offer higher bandwidth, lower power consumption and smaller form factors, but at the cost of design and manufacturing complexity. In fact, what remains to be seen is whether HBM and GDDR5X will actually compete directly against each other or become two complementary types of memory. Different applications have different requirements, and an HBM memory sub-system with 1 TB/s of bandwidth makes perfect sense for a high-end graphics adapter. Mainstream video cards, however, should work perfectly with GDDR5X, and chances are we will see both in play at different market focal points.

