Conveners
HPC: Technology 1
- Chair: Dr Scott Klasky (Oak Ridge National Laboratory)
HPC: Technology 2
- Chair: Dr Hakizumwami Birali Runesha (University of Chicago)
HPC: Technology 3
- Chair: Prof Clement Onime (International Centre for Theoretical Physics (ICTP))
HPC: Technology 3
- Chair: Dr Paul Calleja (University of Cambridge)
HPC: Technology 4
- Chair: Dr Oz Parchment (University of Cambridge)
HPC: Education BoF
- Convener: Mr Bryan Johnston (CHPC)
HPC: Ecosystems BoF
- Convener: Mr Bryan Johnston (CHPC)
HPC: Visualisation BoF
- Convener: Dr Samuel Mabakane (Centre for High Performance Computing)
HPC: Storage & IO 1
- Chair: Dr Jay Lofstead (Sandia National Laboratories)
HPC: Storage & IO 2
- Chair: Dr Jay Lofstead (Sandia National Laboratories)
The advancement of scientific knowledge is driven by the ability to reproduce research findings. While there is broad agreement about the importance of reproducibility, the core challenge is that reproducible artifacts are usually assembled only after the research has been completed, compounded by a lack of standards and of motivation to carry out the task once the research has been...
According to studies in the UK and US, research software underpins ninety-five percent of research, and thirty-three percent of international research produces new code. However, research software has yet to be recognised as a first-class research output, and the researchers who develop it often find themselves in dead-end career paths.
In 2012, research software engineers in the United...
This talk will highlight the ongoing efforts of the ICTP (International Centre for Theoretical Physics, Trieste, Italy) and partner institutions in developing graduate-level academic programmes on HPC and related fields.
The HPC field is growing far faster than we can adequately train qualified candidates. To better address these workforce needs, mentoring people who have potential but do not believe they can succeed in, or even belong in, HPC can help close the workforce gap while enhancing and expanding representation. This talk focuses on developing students and even early career people...
The Square Kilometre Array (SKA) global mega-science project stands to have a continued, significant positive impact on the South African science, research, technology and innovation space. A computing infrastructure perspective on the past, present and future of those impacts will be presented. As the South African Radio Astronomy Observatory pivots to become the host of the SKA1 Mid frequency...
From the sensor to the laptop, from the telescope to the supercomputer, from the microscope to the database, scientific discovery is part of a connected digital continuum that is dynamic and fast. In this new digital continuum, artificial intelligence (AI) is providing tremendous breakthroughs, making data analysis and automated responses possible across the continuum. SAGE is a...
This session will cover the latest Intel hardware (Xeon and GPU) in more technical detail, with a focus on HPC use cases. It will help the audience understand which products are coming soon and what capabilities those products will offer.
This talk will provide an update on how HPE is working with customers to provide technology and solutions that address the most challenging problems in high performance computing. It will cover the need to make use of heterogeneous computing elements, and how AI can be combined with HPC to make end users more productive.
Seeking the balance between power consumption and performance: is sacrificing a little performance while considerably reducing power consumption acceptable for research? This talk presents a short study of work done at the University of Cambridge with our labs on reducing power consumption while maintaining acceptable performance, resulting in an initial listing in the top 10 in...
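As a rough illustration of the trade-off described above, energy-to-solution can fall even when runtime rises, because energy is the product of runtime and average power. The sketch below uses made-up figures for demonstration only and is not taken from the Cambridge study.

```python
# Hypothetical illustration: trading a little runtime for a large power saving.
# All numbers below are invented for demonstration purposes.

def energy_to_solution(runtime_s: float, avg_power_w: float) -> float:
    """Energy (joules) consumed to complete a job: runtime x average power."""
    return runtime_s * avg_power_w

# Baseline run at full clock speed.
baseline_runtime = 1000.0   # seconds
baseline_power = 500.0      # watts per node
baseline_energy = energy_to_solution(baseline_runtime, baseline_power)

# Power-capped run: ~10% slower, but drawing ~30% less power.
capped_runtime = baseline_runtime * 1.10
capped_power = baseline_power * 0.70
capped_energy = energy_to_solution(capped_runtime, capped_power)

print(f"baseline: {baseline_energy / 1e3:.0f} kJ")
print(f"capped:   {capped_energy / 1e3:.0f} kJ "
      f"({1 - capped_energy / baseline_energy:.0%} less energy to solution)")
```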
Rick Koopman, Tech Lead and Director for Lenovo’s High-Performance Computing and Artificial Intelligence business segment for Emerging Markets in EMEA, will talk about the challenges we will run into in the coming period as we adopt new technologies while walking the path towards Zero Emissions Computing.
Lenovo is an established leader in HPC, with ~180 of the world’s top 500...
This session will bring members of the HPC Education community together to discuss the establishment of a sustainable HPC Education Community of Practice for the African region. The session will identify the needs of members in the community through an interactive discussion, where members can share questions, ideas, and suggestions. The session will conclude with practical next steps for the...
The HPC Ecosystems Project (and the SADC CyberInfrastructure Initiative) is responsible for the repurposing and distribution of decommissioned tier-1 HPC systems. A significant part of the project's scope is the training of an HPC System Administrator workforce. This session will bring members of the partner sites together to discuss the general progress of the project, as well as identifying...
Proposed design of the visualisation system for CHPC
Visualisation systems are used to analyse scientific data in various disciplines such as materials science, computational physics, chemistry and climatology. Users of the Centre for High Performance Computing (CHPC) currently visualise their scientific data on their own laptops and desktops, which sometimes do not have...
The rapid growth in technology is providing unprecedented opportunities for scientific inquiry. However, dealing with the data produced has resulted in a crisis: computer speeds are increasing much faster than storage technology capacities and I/O rates. This ratio is also getting worse for experimental and observational facilities, where, for example, the Legacy Survey of Space and Time...
Replication has been successfully employed and practiced to ensure high data availability in large-scale distributed storage systems. However, with the relentless growth of generated and collected data, replication has become expensive not only in terms of storage cost but also in terms of network cost and hardware cost. Traditionally, erasure coding (EC) is employed as a cost-efficient...
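As context for the replication-versus-erasure-coding comparison above, the following is a minimal sketch (not any specific production EC library; the k and m values are illustrative) contrasting the raw-storage overhead of the two schemes, with a toy single-parity reconstruction to show the basic idea behind erasure coding.

```python
# Storage overhead: n-way replication vs. k data + m parity erasure coding.

def replication_overhead(copies: int) -> float:
    """Raw bytes stored per byte of user data under n-way replication."""
    return float(copies)

def erasure_coding_overhead(k: int, m: int) -> float:
    """Raw bytes stored per byte of user data with k data + m parity chunks."""
    return (k + m) / k

print(replication_overhead(3))         # 3.0x raw storage, tolerates 2 lost copies
print(erasure_coding_overhead(10, 4))  # 1.4x raw storage, tolerates 4 lost chunks

# Toy single-parity code (RAID-5 style): the XOR of the data chunks can
# reconstruct any one missing chunk. Real systems use Reed-Solomon or
# similar codes to tolerate multiple failures.
chunks = [b"\x01\x02", b"\x0a\x0b", b"\xf0\x0f"]
parity = bytes(a ^ b ^ c for a, b, c in zip(*chunks))

# Reconstruct chunk 1 from the two surviving chunks and the parity.
recovered = bytes(p ^ a ^ c for p, a, c in zip(parity, chunks[0], chunks[2]))
assert recovered == chunks[1]
```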
Parallel file systems are at the core of HPC I/O infrastructures. These systems minimize the I/O time of applications by splitting files into fixed-size chunks and distributing them across multiple storage targets. The I/O performance experienced with a PFS is therefore directly linked to the ability to retrieve these chunks in parallel. In this work, we conduct an in-depth evaluation of...
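To illustrate the striping described above, here is a minimal sketch, not tied to any particular PFS, of how a file offset maps to a storage target under round-robin, fixed-size striping; the stripe size and target count below are illustrative (e.g. Lustre-style 1 MiB stripes).

```python
# Round-robin, fixed-size striping of a file across storage targets.

STRIPE_SIZE = 1 << 20   # 1 MiB chunks (illustrative value)
STRIPE_COUNT = 4        # number of storage targets holding this file

def locate(offset: int, stripe_size: int = STRIPE_SIZE,
           stripe_count: int = STRIPE_COUNT) -> tuple[int, int]:
    """Return (target index, offset within that target) for a file offset."""
    chunk_index = offset // stripe_size
    target = chunk_index % stripe_count      # round-robin placement
    # Each target stores every stripe_count-th chunk contiguously.
    offset_on_target = ((chunk_index // stripe_count) * stripe_size
                        + offset % stripe_size)
    return target, offset_on_target

# A 6 MiB contiguous read touches all four targets, so its chunks can be
# fetched in parallel.
for off in range(0, 6 * (1 << 20), 1 << 20):
    print(off >> 20, locate(off))
```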
Open source plays an increasingly important role in society. Almost 90 companies (many of them Fortune 100) now have dedicated open source program offices (OSPOs) responsible for implementing the company’s open source strategy. Industry places a high value on organizations that know how to leverage open source, as IBM’s acquisition of Red Hat for $34 billion indicates....