ISC Interview with the Program Team 2016
April 06, 2016

This interview was conducted with ISC 2016 Program Chair Prof. Dr. Satoshi Matsuoka and the ISC Group Program Consultants: Dr. Horst Gietl, Prof. Dr. Gerhard Wellein, and Prof. Dr. Georg Hager (the latter two answered collectively). ISC 2016 will take place June 19 – 23 in Frankfurt, Germany.

1. How is the ISC 2016 program different from past years?

Wellein and Hager: The ISC 2016 organizing committee has put a lot of effort into increasing the program’s credibility in a number of ways. For example, the scientific reviewing process has been completely reworked. Furthermore, we have added a new component, the PhD Forum, which will give many young HPC researchers an opportunity to present their ongoing work.

Matsuoka: ISC 2016 is not so much “different” as improved, with new elements and focus areas. The peer review of technical papers has been greatly enhanced, with a new review system, comprehensive review criteria, and a face-to-face paper selection meeting held in Frankfurt under Technical Papers Chair Prof. Jack Dongarra. The PhD Forum is a competitive and interactive program that strengthens the involvement of the next generation of HPC researchers and engineers. New focus areas such as deep learning have been established, anchored by the conference keynote from Andrew Ng of Baidu and further sessions during the conference.

Gietl: If different means new focus elements, then you are right. In the application area you will find the traditional applications, but also some new elements such as disaster mitigation and extreme-scale algorithms. The classical hardware and software topics are well represented in our program, and some even point the way to the HPC future, for example the exascale trajectory. We have also introduced a new field, deep learning, since we believe that AI will heavily influence the HPC world and vice versa.

2. This year, topics like machine learning, the internet of things, and robotics are prominently featured in the conference program. Do you think this indicates HPC is undergoing an organic evolution or a disruption? 

Wellein and Hager: While it is true that these application areas have not been part of traditional HPC, we believe that many of them will gradually adopt HPC techniques and technologies in the future. Whether this will drive an evolution of HPC or cause a disruptive change is not yet clear. We believe the 2016 conference will provide the first indications.

Matsuoka: Both disruption and evolution are happening at the same time. HPC is broadening into areas such as big data and machine learning, which bring new sets of parameters for machine design as well as new software stacks and applications. At the same time, despite the launch of exascale programs, we are gradually observing the slowdown of Moore’s Law, motivating HPC researchers to look for disruptive methodologies to sustain performance growth.

3. What is the significance of the emerging machine learning applications to the HPC community?

Gietl: Here I would like to draw your attention to Andrew Ng’s keynote:

“The significance of Machine Learning for the HPC community is the fact that AI is transforming the entire world of technology. Much of this progress is due to the ability of learning algorithms to spot patterns in larger and larger amounts of data. Today this is powering everything from web search to self-driving cars. This insatiable hunger for processing data has caused the bleeding edge of machine learning to shift from CPU computing, to cloud, to GPU, to HPC. These latest trends in AI will show in greater detail how HPC researchers, now and in the future, will drive significant progress in AI.”

We are introducing this topic this year to demonstrate to our audience how machine learning and HPC are converging.

Matsuoka: It is quite significant that machine learning is applicable to many aspects of our society, not just narrow sets of high-end applications. In particular, compute-hungry algorithms such as deep learning will significantly increase the demand for HPC. It will also change the way hardware and software are designed, and it provides an alternative way to characterize natural phenomena extrinsically through data analytics, complementing first-principles simulations. The good news is that both require HPC!

4. Industrie 4.0, the theme of this year’s Industry Track, is a rather German-specific topic. Can you briefly outline what it is and how this topic is relevant to the wider audience at ISC?

Gietl: You are right that the term “Industrie 4.0” originates from a high-tech project, part of a strategy devised by the German government to promote the computerization of manufacturing. But the main idea behind it is the digital revolution: the use of electronics and IT to further automate production, which will influence manufacturing companies worldwide. It is by no means limited to Germany.

The Industry Track will cover Industrie 4.0 with a special presentation about the virtual factory, a session on applications, and another on tools for the digital factory.

Besides these presentations, the track will cover traditional HPC applications in commercial settings. We have also invited ISVs to present their experiences in parallelizing software products for commercial deployment. The European HPC experience in industry will be presented in a special session, and last but not least, we have introduced a panel titled “What Cloud can do for Industry.”

5. Can you talk about some of the more traditional HPC topics in this year’s program?

Matsuoka: A look at the program will reveal many traditional HPC topics spanning applications, software, and hardware. We have very robust presentations on exascale programs in the major regions, including Europe, the US, Japan, and China. The technical papers cover many broad aspects of HPC, and there are also application talks on life sciences, weather and climate, ISV offerings, and more.

Gietl: The traditional topics covering hardware and software for HPC systems include memory technologies, non-von Neumann processors, the exascale trajectory, and performance modeling. On the application side, besides the ones Professor Matsuoka mentioned, we have topics like extreme engineering, disaster prediction, human brain research, extreme-scale algorithms, and HPC benchmarking.

6. If you look at all the focus areas in this year’s conference, the program is very application-oriented; there’s less of a spotlight on hardware topics than in years past. What’s behind that shift?

Matsuoka: I do not agree that the focus is on applications this year. There are many sessions that address new architectures, performance modeling, and parallel programming. ISC strives to cover and cater to all aspects of the global HPC community in a balanced fashion, and this year is no exception.

Wellein and Hager: We see increasing interaction between application fields and hardware designs. There is a strong trend to tailor applications to the specific architectural characteristics of new hardware, while new HPC application areas, such as deep learning, may in turn ignite developments on the hardware side. To foster this process, we want to bring application scientists and hardware architects together.

Gietl: If you look closely at the conference program, you will see that we address many HPC topics and strive for a balance between application-, hardware-, and software-oriented sessions.

For example, we have eight sessions or panels on HPC hardware and software, including the four I mentioned previously on memory, non-von Neumann architectures, the exascale trajectory, and performance modeling. There are two panels on HPC software systems, as well as two distinguished speaker sessions with an emphasis on exascale systems and other supercomputing developments such as Watson and quantum computers. There are three sessions on deep learning, in addition to two on big data/HPC convergence and on autonomous driving and its relationship with big data. There are also seven sessions on applications, one of which is reserved for HPC benchmarking.

It is my conviction that this program is really well-balanced and covers most of the significant HPC topics of interest to the community.
