AnandTech's Journal
 

Friday, May 22nd, 2020

    8:00a
    Avantek's Arm Workstation: Ampere eMAG 8180 32-core Arm64 Review

    Arm desktop systems are quite a rarity. In fact, the scarcity of suitable hardware is a real problem for the wider Arm software ecosystem: developers need appropriate machines before they can start working in earnest on better-optimised Arm software.

    To date, the solution to this has mostly been using cloud instances of various Arm server hardware – this can be a legitimate option, and powerful new cloud instances such as Amazon’s Graviton2 certainly offer the flexibility and performance needed to get things rolling.

    However, if you actually wanted a private, local, physical system, you were mostly relegated to small, low-performing single-board computers, which more often than not had patchy software support. It’s only in the last year or two that Arm-based laptops with Qualcomm Snapdragon chips have become a viable developer platform, thanks to WSL on Windows.

    For somebody who wants a bit more power – and in particular is looking to make use of peripherals, such as large amounts of storage or PCIe connectivity – there are options such as Avantek’s eMAG workstation system.

    10:30a
    Gaming AIs: NVIDIA Teaches A Neural Network to Recreate Pac-Man

    Following last week’s virtual GTC keynote and the announcement of their Ampere architecture, this week NVIDIA has been holding the back half of their conference schedule. As with the real event, the company has been posting numerous sessions on everything NVIDIA, from Ampere to CUDA to remote desktop. But perhaps the most interesting talk – and certainly the most amusing – comes from NVIDIA’s research group.

    Tasked with developing future technologies and finding new uses for current technologies, today the group is announcing that they have taught a neural network Pac-Man.

    And no, I don’t mean how to play Pac-Man. I mean how to be the game of Pac-Man.

    The reveal, timed to coincide with the 40th anniversary of the ghost-munching game, comes out of NVIDIA’s research into Generative Adversarial Networks (GANs). At a very high level, a GAN is a setup in which two neural networks are trained against each other – typically one learning how to do a task and the other learning how to spot the first doing that task – with the competition forcing both networks to improve in order to win. In terms of practical applications, GANs have most famously been used in research projects to create programs that can generate realistic-looking images of real-world items, upscale existing images, and perform other image synthesis/manipulation tasks.
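
    The adversarial recipe described above is simple enough to sketch in miniature. The toy below is emphatically not GameGAN – just an illustrative, hand-rolled 1-D GAN (all names and numbers here are this sketch's own assumptions) in which a linear generator learns to mimic samples from a Gaussian by fooling a logistic discriminator:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D GAN: "real" data is N(3.0, 0.5); the generator is a linear map of
# noise, g(z) = w_g*z + b_g; the discriminator is a logistic classifier,
# D(x) = sigmoid(w_d*x + b_d). Gradients are written out by hand.
w_g, b_g = 1.0, 0.0   # generator starts out producing N(0, 1)
w_d, b_d = 0.0, 0.0   # discriminator starts out undecided
lr, batch = 0.05, 64

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for step in range(4000):
    # --- discriminator update: push D(real) up, D(fake) down ---
    real = rng.normal(3.0, 0.5, batch)
    z = rng.normal(0.0, 1.0, batch)
    fake = w_g * z + b_g
    d_real = sigmoid(w_d * real + b_d)
    d_fake = sigmoid(w_d * fake + b_d)
    # gradient of -[log D(real) + log(1 - D(fake))] w.r.t. (w_d, b_d)
    gw = -np.mean((1 - d_real) * real) + np.mean(d_fake * fake)
    gb = -np.mean(1 - d_real) + np.mean(d_fake)
    w_d -= lr * gw
    b_d -= lr * gb

    # --- generator update: non-saturating loss, -log D(fake) ---
    z = rng.normal(0.0, 1.0, batch)
    fake = w_g * z + b_g
    d_fake = sigmoid(w_d * fake + b_d)
    dx = -(1 - d_fake) * w_d          # d(-log D(fake)) / d fake
    w_g -= lr * np.mean(dx * z)
    b_g -= lr * np.mean(dx)

fake_mean = float(np.mean(w_g * rng.normal(0.0, 1.0, 10000) + b_g))
print(f"generator mean after training: {fake_mean:.2f} (target 3.0)")
```

    After a few thousand alternating updates, the generator’s output mean drifts toward the real data’s mean of 3.0 – and, much like GameGAN learning to spare Pac-Man, the toy generator happily collapses onto whatever the discriminator rewards.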

    For Pac-Man, however, the researchers behind the fittingly named GameGAN project took things one step further, focusing on creating a GAN that can be taught how to emulate/generate a video game. This includes not only recreating the look of a game, but perhaps most importantly, the rules of a game as well. In essence, GameGAN is intended to learn how a game works by watching it, not unlike a human would.

    For their first project, the GameGAN researchers settled on Pac-Man, which is as good a starting point as any. The 1980 game has relatively simple rules and graphics, and crucially for the training process, a complete game can be played in a short amount of time. So, over the course of 50,000 “episodes” of training, the researchers taught a GAN how to be Pac-Man solely by having the neural network watch the game being played.

    And most impressive of all, the crazy thing actually works.

    In a video released by NVIDIA, the company is briefly showing off the Pac-Man-trained GameGAN in action. While the resulting game isn’t a pixel-perfect recreation of Pac-Man – notably, GameGAN’s simulated resolution is lower – the game nonetheless looks and functions like the arcade version of Pac-Man. And it’s not just for looks, either: the GameGAN version of Pac-Man accepts player input, just like the real game. In fact, while it’s not ready for public consumption quite yet, NVIDIA has already said that they want to release a publicly playable version this summer, so that everyone can see it in action.

    Fittingly for a gaming-related research project, the training and development of GameGAN was equally silly at times. Because the network needed to consume thousands upon thousands of gameplay sessions – and NVIDIA presumably doesn’t want to pay its staff to play Pac-Man all day – the researchers relied on a Pac-Man-playing bot to automatically play the game. As a result, the AI that is GameGAN has essentially been trained in Pac-Man by another AI. And this is not without repercussions – in their presentation, the researchers noted that because the Pac-Man bot was so good at the game, GameGAN has developed a tendency to avoid killing Pac-Man, as if it were part of the rules. Which, if nothing else, is a lot more comforting than finding out that our soon-to-be AI overlords are playing favorites.

    All told, training the GameGAN for Pac-Man took a quad GV100 setup four days, over which time it monitored 50,000 gameplay sessions. To put that amount of hardware in perspective, four GV100 GPUs add up to 84.4 billion transistors – almost 10 million times as many transistors as are found in the original arcade game’s Z80 CPU. So while teaching a GAN how to be Pac-Man is incredibly impressive, it is, perhaps, not an especially efficient way to execute the game.
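
    The transistor comparison is easy to check: NVIDIA quotes GV100 at 21.1 billion transistors per die, and the Z80 is commonly cited at roughly 8,500 transistors (exact figures for the Z80 vary slightly by source):

```python
gv100_transistors = 21.1e9       # NVIDIA's quoted figure for one GV100 die
setup = 4 * gv100_transistors    # the quad-GPU training rig
z80_transistors = 8_500          # commonly cited count for the Zilog Z80

ratio = setup / z80_transistors
print(f"{setup / 1e9:.1f}B transistors, {ratio / 1e6:.1f} million x the Z80")
```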

    Meanwhile, figuring out how to teach a neural network to be Pac-Man does have some practical goals as well. According to the research group, one big focus right now is using this concept to more quickly train simulators, which traditionally have to be carefully constructed by humans in order to capture all of the possible interactions. If a neural network can instead learn how something behaves by watching what’s happening and what inputs are being made, this could conceivably make creating simulators far faster and easier. Interestingly, the entire concept leads to something of a self-feedback loop, as the idea is to use those simulators to then train other neural networks how to perform a task, such as NVIDIA’s favorite goal of self-driving cars.

    Ultimately, whether it leads to real-world payoffs or not, there’s something amusingly human about a neural network learning a game by observing – even (and especially) if it doesn’t always learn the desired lesson.

    12:00p
    NVIDIA Reports Q1 FY2021 Earnings: Let The Good Times Roll

    This week NVIDIA announced their earnings for the first quarter of their 2021 fiscal year. The current fiscal year is an especially important one for NVIDIA on both a business level and a product level, as the company closes its acquisition of Mellanox while opening up shipments of its new datacenter-class A100 accelerators. Especially coming off of last year’s crypto-hangover, NVIDIA has started their new fiscal year with the good times rolling on.

    NVIDIA Q1 FY2021 Financial Results (GAAP)
      Q1'FY2021 Q4'FY2020 Q1'FY2020 Q/Q Y/Y
    Revenue $3080M $3105M $2220M -1% +39%
    Gross Margin 65.1% 64.9% 58.4% +0.2% +6.8%
    Operating Income $1028M $990M $358M -1% +116%
    Net Income $917M $950M $394M -4% +106%
    EPS $1.47 $1.53 $0.64 -5% +105%
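
    The growth columns follow directly from the dollar figures; as a quick sanity check of the revenue row (figures taken from the table above):

```python
def growth(current, prior):
    """Percentage change, rounded to the whole percent used in the table."""
    return round((current / prior - 1) * 100)

revenue = {"Q1'FY2021": 3080, "Q4'FY2020": 3105, "Q1'FY2020": 2220}  # $M

qq = growth(revenue["Q1'FY2021"], revenue["Q4'FY2020"])  # quarter-over-quarter
yy = growth(revenue["Q1'FY2021"], revenue["Q1'FY2020"])  # year-over-year
print(f"Revenue Q/Q: {qq:+d}%  Y/Y: {yy:+d}%")
```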

    For Q1’FY21, NVIDIA booked $3.08B in revenue. Compared to the year-ago quarter, this is a jump in revenue of 39%, making for a very strong first quarter that was only a hair under Q4, which is commonly a very strong quarter for NVIDIA. Those sizable revenues, in turn, are reflected in NVIDIA’s profits: the company booked $917M in net income for the quarter, more than double Q1’FY20. In fact it’s the second-best Q1 ever for the company; only Q1’FY19 was better, which was in the middle of the crypto boom.

    What was a record, however, was NVIDIA’s gross margin. For the quarter NVIDIA booked a GAAP gross margin of 65.1%, edging out the previous quarter and beating even Q1’FY19. As NVIDIA’s revenues have shifted increasingly towards higher-margin products like accelerators, it’s helped the already profitable NVIDIA to extend that profitability even further.

    NVIDIA Quarterly Revenue Comparison (GAAP, $ in millions)
      Q1'FY2021 Q4'FY2020 Q1'FY2020 Q/Q Y/Y
    Gaming $1339 $1491 $1055 -10% +27%
    Professional Visualization $307 $331 $266 -7% +15%
    Datacenter $1141 $968 $634 +18% +80%
    Automotive $155 $163 $166 -5% -7%
    OEM & IP $138 $152 $99 -9% +39%

    Breaking down NVIDIA’s revenue by platform, while there are no great surprises per se, the company has reached some milestones that are strong indicators of where things are going. Starting with NVIDIA’s datacenter revenue, that segment of the business has set a revenue record for the second consecutive quarter, with $1.141B in revenue. This marks the first time NVIDIA’s datacenter business has booked more than $1B in revenue in a single quarter, and NVIDIA doesn’t expect it to be the last.

    While the picture will get muddled a bit next quarter as Mellanox revenue is folded into the mix, the big picture is that datacenter accelerator sales are strong and set to grow. NVIDIA’s Ampere-based A100 accelerators began shipping for revenue in Q1, helping to boost the numbers there, while Q2 will be the first full quarter of sales. According to NVIDIA, they’re already seeing broad demand for datacenter products, with the major hyperscalers quickly picking up A100s. Overall, NVIDIA’s Volta-generation accelerators were extremely successful for the company, almost – but not quite – growing the datacenter business to one billion dollars per quarter, and the company is eager to repeat and extend that success with Ampere.

    Meanwhile, NVIDIA’s largest business, gaming, was also strong for the quarter, with the company booking $1.339B in revenue. While down seasonally as usual, NVIDIA reports that, much like other chipmakers, they have weathered the current pandemic reasonably well, with soft sales in some areas being counterbalanced by greater demand for chips for home computers as employees shift to working from home.

    Interestingly, there’s a very real chance that this could be one of the last quarters where gaming is NVIDIA’s biggest revenue generator. Along with folding Mellanox into the company – and into the datacenter segment – NVIDIA’s datacenter business as a whole has been growing at a much greater clip than gaming. NVIDIA has made it very clear that they’re pushing for a more diversified revenue stream than their traditional gaming roots, and if the datacenter business grows too much more they may just get there this year. Though it will be interesting to see what the eventual launch of Ampere-based gaming products does for gaming revenue, as NVIDIA’s revenue also reflects the fact that they’re nearing the end of the Turing generation of products.

    Bringing up third place was NVIDIA’s professional visualization platform, which saw $307M in revenue. As with gaming sales, the company is seeing a boost in sales due to work from home equipment purchases. This comes on top of the day-to-day demand for workstation laptops, which NVIDIA has been increasingly invested in.

    Meanwhile NVIDIA’s automotive business ended up being something of a laggard for Q1’FY21. The segment booked $155M in revenue, which is down 7% from the year-ago quarter. NVIDIA’s automotive business moves at a much different pace than its GPU businesses – in part because it’s not set to really take off until self-driving cars become a retail reality – so the business tends to ebb and flow.

    Finally, NVIDIA booked $138M in OEM & IP revenue for Q1’FY21. While this platform is small potatoes compared to gaming and datacenter, on a percentage basis it’s actually another big jump for NVIDIA; the segment grew 39% over the year-ago quarter. According to NVIDIA, the main driving factor here was increased entry-level GPU sales for OEM systems.

    Wrapping things up, looking ahead to Q2 of FY2021, NVIDIA’s current predictions call for another strong quarter. Having closed the Mellanox deal, Mellanox’s earnings will be folded into NVIDIA’s numbers starting in Q2, helping to push the company to what should be record revenue. Meanwhile on the product side of matters, Q2 will be the first whole quarter for A100 accelerator shipments, which should help NVIDIA further grow their datacenter business.

    2:00p
    ViewSonic Announces Elite XG270QC Monitor: 1440p@165 Hz, Curved For Gaming

    The latest monitor in ViewSonic's large and varied portfolio comes via the XG270QC, which is a part of its gaming-focused Elite series. Available in the US now, the 27-inch ViewSonic Elite XG270QC features a 1500R curved screen with a refresh rate of 165 Hz, and is certified for VESA DisplayHDR 400.

    Designed with gaming in mind, the ViewSonic Elite XG270QC comes with many of the features you'd expect of a contemporary gaming display, including a 27-inch 2560x1440 VA panel with a fast 165 Hz refresh rate, variable refresh support with AMD's FreeSync Premium Pro certification, and VESA DisplayHDR 400 certification. Although officially it has a 3 ms response time, ViewSonic is also quoting a 1 ms MPRT response time, made possible by its PureXP motion blur reduction technology. The curve of the panel is rated at 1500R, which ViewSonic claims provides a more immersive gaming experience.

    Looking at the dimensions, it's 24.1 inches wide with a 4-inch depth. It has an adjustable height of between 18.97 and 23.59 inches, and a net weight of 7.5 kg with the stand installed. For users looking to attach it to a monitor arm or wall mount, it has VESA 100 x 100 mm mounting holes on the rear and weighs 4.9 kg without the stand. The XG270QC has a black glossy finish and includes a single DisplayPort 1.4 input, two HDMI 2.0 inputs, a 3.5 mm audio output, and, for security, a Kensington lock slot. Provided with the Elite XG270QC is ViewSonic's Elite Display Controller software, which connects to the monitor via a supplied USB Type-A cable and allows users to adjust the integrated RGB LED lighting. It is certified to work with Thermaltake's RGB Plus and Razer's popular Chroma RGB ecosystems.

    Touching on some of the finer details of the 27-inch panel, it has a 178-degree viewing angle and offers VESA Adaptive-Sync support. It features AMD FreeSync Premium Pro certification, which is AMD's own classification system for grading monitors, ensuring among other things a wide enough refresh-rate range for Low Framerate Compensation support, as well as low-latency HDR support. In terms of color reproduction, ViewSonic is claiming 16.7 million colours, with a 3,000:1 static contrast ratio and a 120 million:1 dynamic contrast ratio. For power, ViewSonic states the monitor is optimized for 45 W in Eco mode, with a 55 W typical consumption rate and a maximum of up to 59 W.
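
    One figure the spec sheet doesn’t quote, but which follows directly from the panel size and resolution, is pixel density – 2560x1440 spread across a 27-inch diagonal works out to roughly 109 pixels per inch:

```python
import math

h_px, v_px = 2560, 1440   # panel resolution
diagonal_in = 27          # panel diagonal, inches

# Pixel density: pixels along the diagonal divided by the diagonal length.
diagonal_px = math.hypot(h_px, v_px)
ppi = diagonal_px / diagonal_in
print(f"{ppi:.1f} pixels per inch")   # ≈ 108.8
```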

    ViewSonic has said that the Elite XG270QC is available to purchase in the US now, priced around the $460 mark. Users in the EU, AU, and other regions around the world will, however, need to wait until June.


