Slashdot: Hardware's Journal
 

Wednesday, September 22nd, 2021

    2:02a
    China Vows End To Building Coal-Fired Power Plants Abroad
    Chinese President Xi Jinping told the United Nations General Assembly Tuesday that his country "will not build new coal-fired power projects abroad" and plans to boost support for clean energy in developing nations. Axios reports: The pledge, if maintained, would mark a breakthrough in efforts to transition global power away from the most carbon-emitting fuel. Nations, including the U.S., have been urging China -- historically a key source of coal-plant finance -- to make such a commitment. Xi's pledge on coal financing comes just weeks before a critical United Nations climate summit. However, his remarks did not provide any details on the commitment or its implementation timeline. China is by far the world's largest coal producer and consumer, and is still building new coal-fired power generation domestically. Xi reiterated China's pledge to have its greenhouse gas emissions peak before 2030 and to achieve carbon neutrality by 2060, but did not offer strengthened domestic commitments.

    Read more of this story at Slashdot.

    3:30a
    Scientists Develop the Next Generation of Reservoir Computing
    An anonymous reader quotes a report from Phys.Org: A relatively new type of computing that mimics the way the human brain works was already transforming how scientists could tackle some of the most difficult information processing problems. Now, researchers have found a way to make what is called reservoir computing work between 33 and a million times faster, with significantly fewer computing resources and less data input needed. In fact, in one test of this next-generation reservoir computing, researchers solved a complex computing problem in less than a second on a desktop computer. Using today's state-of-the-art technology, the same problem requires a supercomputer and still takes much longer to solve, said Daniel Gauthier, lead author of the study and professor of physics at The Ohio State University. The study was published today in the journal Nature Communications.

    Reservoir computing is a machine learning algorithm developed in the early 2000s and used to solve the "hardest of the hard" computing problems, such as forecasting the evolution of dynamical systems that change over time, Gauthier said. Previous research has shown that reservoir computing is well suited to learning dynamical systems and can provide accurate forecasts of how they will behave in the future, Gauthier said. It does that through the use of an artificial neural network, somewhat like a human brain. Scientists feed data on a dynamical system into a "reservoir" of randomly connected artificial neurons in a network. The network produces useful output that the scientists can interpret and feed back into the network, building a more and more accurate forecast of how the system will evolve. The larger and more complex the system, and the more accurate the scientists want the forecast to be, the bigger the network of artificial neurons has to be and the more computing resources and time are needed to complete the task.
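    The loop described above -- a fixed, randomly connected reservoir driven by input, with only a linear readout trained -- can be sketched in a few lines of NumPy. This is an illustrative echo-state-network toy, not code from the study; the sine signal, reservoir size, and scaling constants are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

n_res = 100                                       # reservoir size (illustrative)
W_in = rng.uniform(-0.5, 0.5, (n_res, 1))         # input weights: random, never trained
W = rng.uniform(-0.5, 0.5, (n_res, n_res))        # recurrent weights: random, never trained
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # keep spectral radius below 1 for stability

def run_reservoir(u):
    """Drive the reservoir with input sequence u; return the state history."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))
        states.append(x.copy())
    return np.array(states)

# Train only the linear readout: predict the next sample of a sine wave.
t = np.linspace(0, 8 * np.pi, 800)
signal = np.sin(t)
X = run_reservoir(signal[:-1])        # reservoir states from inputs u_0..u_{T-1}
Y = signal[1:]                        # targets: the next sample
warmup = 100                          # discard transient states (the "warmup" the article mentions)
X, Y = X[warmup:], Y[warmup:]
W_out = np.linalg.lstsq(X, Y, rcond=None)[0]

pred = X @ W_out                      # one-step-ahead fit on the training signal
```

    Note that only W_out is learned; that cheap linear solve is what makes the method attractive, while the random reservoir size and the warmup data are exactly the costs the next-generation variant attacks.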
In this study, Gauthier and his colleagues [...] found that the whole reservoir computing system could be greatly simplified, dramatically reducing the need for computing resources and saving significant time. They tested their concept on a forecasting task involving a weather system developed by Edward Lorenz, whose work led to our understanding of the butterfly effect. Their next-generation reservoir computing was a clear winner over today's state-of-the-art on this Lorenz forecasting task. In one relatively simple simulation done on a desktop computer, the new system was 33 to 163 times faster than the current model. But when the aim was for great accuracy in the forecast, the next-generation reservoir computing was about 1 million times faster. And the new-generation computing achieved the same accuracy with the equivalent of just 28 neurons, compared to the 4,000 needed by the current-generation model, Gauthier said. An important reason for the speed-up is that the "brain" behind this next generation of reservoir computing needs a lot less warmup and training compared to the current generation to produce the same results. Warmup is training data that needs to be added as input into the reservoir computer to prepare it for its actual task.
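    The published next-generation approach recasts the reservoir as a nonlinear vector autoregression: a few time-delayed copies of the observed state, plus simple polynomial features, feed a linear readout directly. Below is a toy sketch of that flavor on the Lorenz system (the article's forecasting task); the Euler integrator, step size, number of lags, and quadratic features are illustrative assumptions, not the paper's exact setup:

```python
import numpy as np

def lorenz_step(s, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One Euler step of the Lorenz system, the forecasting benchmark in the story."""
    x, y, z = s
    return s + dt * np.array([sigma * (y - x), x * (rho - z), x * y - beta * z])

# Generate a training trajectory.
traj = [np.array([1.0, 1.0, 1.0])]
for _ in range(3000):
    traj.append(lorenz_step(traj[-1]))
traj = np.array(traj)

k = 2  # number of time-delay taps: the only "memory" -- no big random reservoir

def features(i):
    """Constant + k delayed states + their pairwise products (quadratic terms)."""
    lin = np.concatenate([traj[i - j] for j in range(k)])      # 3*k linear terms
    quad = np.outer(lin, lin)[np.triu_indices(len(lin))]       # quadratic terms
    return np.concatenate([[1.0], lin, quad])

idx = np.arange(k, len(traj) - 1)
X = np.array([features(i) for i in idx])       # only 28 features per step here
Y = traj[idx + 1] - traj[idx]                  # learn the increment between steps
W_out = np.linalg.lstsq(X, Y, rcond=None)[0]   # again, a single linear solve

pred = traj[idx] + X @ W_out                   # one-step-ahead fit on the trajectory
```

    The feature vector here happens to have just 28 entries, echoing the article's point: a tiny, deterministic feature set with a linear readout can stand in for thousands of random neurons, needing only k past samples of warmup instead of long priming sequences.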

    Read more of this story at Slashdot.


