BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CERN//INDICO//EN
BEGIN:VEVENT
SUMMARY:KEYNOTE 7: The Future of Computing Will Be non-von Neumann
DTSTART;VALUE=DATE-TIME:20191204T160000Z
DTEND;VALUE=DATE-TIME:20191204T164500Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1091@events.chpc.ac.za
DESCRIPTION:Speakers: Thomas Sterling (Indiana University)\nSeven decad
 es of HPC have been empowered by the abstraction of “the von Neumann A
 rchitecture” and its many derivatives\, driven to a significant degre
 e by Moore’s Law and the exponential growth of device density and conc
 omitant clock rates\, yielding a performance gain over that period of mo
 re than ten trillion for floating-point computation. But a perfect stor
 m of recent technology trends has terminated this unprecedented expansi
 on and challenges future opportunities beyond exascale computing. Even a
 s conventional processing performance flat-lines\, a new world of non-v
 on Neumann execution models and architectures is emerging\, igniting a r
 evolution for the next generations of computing systems with orders of m
 agnitude greater performance than is currently achieved. Even more impor
 tant is the reality of HPC users: approximately 90% of the Top-500 machi
 nes measured with the HPL benchmark demonstrate only about 1% of the per
 formance of the fastest machines. Thus we are much further from exascal
 e than is generally assumed\, and much greater gains are required to tru
 ly bring the major base of HPC users into the exascale era. New non-von N
 eumann architectures and models\, such as Quantum Computing\, Neuromorph
 ic Computing\, and Continuum Computing (this last presented at CHPC18)\, o
 ffer important possibilities for the future. Earlier non-von Neumann tec
 hniques explored in past decades\, such as static and dynamic dataflow\, c
 ellular automata\, systolic arrays\, and special-purpose designs\, may al
 so serve as starting points for new classes of useful computing methods e
 ven as Moore’s Law recedes. Finally\, advanced technologies beyond conv
 entional semiconductors\, such as cryogenic single flux quantum logic\, p
 rovide yet another dimension of potential post-exascale strategies. This K
 eynote Address will convey a fast-paced odyssey through the near-future o
 pportunities of HPC with non-von Neumann based computers. Questions from p
 articipants will be encouraged throughout the presentation as well as duri
 ng the Q&A session at its conclusion.\n\nhttps://events.chpc.ac.za/event/
 47/contributions/1091/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1091/
END:VEVENT
BEGIN:VEVENT
SUMMARY:BRICS programme on “Cybersecurity: Software and Data Security”
DTSTART;VALUE=DATE-TIME:20191203T092500Z
DTEND;VALUE=DATE-TIME:20191203T095000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1084@events.chpc.ac.za
DESCRIPTION:Speakers: Joey Jansen van Vuuren (Tshwane University of Techn
 ology)\nThe BRICS Network University (BRICS NU) is a network of 60 univer
 sities\, 12 each from the five BRICS countries. The BRICS NU is aimed at d
 eveloping partnerships and exchange programmes in six thematic areas (ITG
 s) determined by the BRICS Ministries responsible for education. This pro
 ject forms part of the University Capacity Development Programme (UCDP)\, a
 n umbrella initiative developed and implemented by the South African Depa
 rtment of Higher Education and Training to build capacity in South Africa
 n universities in three key areas: student success\, staff development an
 d programme/curriculum development. Representatives of the BRICS NU ITG o
 n Computer Science and Information Security (CSIS) from the BRICS countri
 es agreed that the CSIS ITG will focus on developing a BRICS Masters prog
 ramme on “Cybersecurity: Software and Data Security”. The presentatio
 n will include the proposed content\, development and implementation plan o
 f the BRICS Masters Programme.\n\nhttps://events.chpc.ac.za/event/47/cont
 ributions/1084/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1084/
END:VEVENT
BEGIN:VEVENT
SUMMARY:SA NREN Talk 3
DTSTART;VALUE=DATE-TIME:20191203T095000Z
DTEND;VALUE=DATE-TIME:20191203T101500Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1085@events.chpc.ac.za
DESCRIPTION:Speakers: Attlee Munyaradzi Gamundani ()\nhttps://events.chpc.
 ac.za/event/47/contributions/1085/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1085/
END:VEVENT
BEGIN:VEVENT
SUMMARY:SANReN Data Transfer Pilot
DTSTART;VALUE=DATE-TIME:20191203T101500Z
DTEND;VALUE=DATE-TIME:20191203T102500Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1083@events.chpc.ac.za
DESCRIPTION:Speakers: Kasandra Pillay (SANReN)\nMoving masses of data is a
  challenge. It can be slow and frustrating to transfer vast quantities of 
 data from the many places it can be stored or generated over general-purpo
 se computer networks.\n\nWhen scientists attempt to run data intensive app
 lications over campus networks\, it often results in slow transfers - in m
 any cases poor enough that the science mission is significantly impacted. 
 In the worst case\, this means either not getting the data\, getting it to
 o late or resorting to alternative inefficient measures such as shipping d
 isks around.\n\nSANReN would like to provide an update on the Data Transfe
 r Pilot service that is available for the South African Research and Educa
 tion community\, to assist them in moving their data locally and internation
 ally. This session will also provide Data Transfer updates within the Nati
 onal Integrated Cyber Infrastructure System (NICIS).\n\nhttps://events.chp
 c.ac.za/event/47/contributions/1083/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1083/
END:VEVENT
BEGIN:VEVENT
SUMMARY:BIOPERIANT12\, a regional high-resolution model configuration towar
 ds developing the South African VrESM Earth System Model
DTSTART;VALUE=DATE-TIME:20191203T121000Z
DTEND;VALUE=DATE-TIME:20191203T123000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1081@events.chpc.ac.za
DESCRIPTION:Speakers: Nicolette Chang (Ocean Systems and Climate\, CSIR)\nI
 n understanding and predicting a changing global climate system\, the repr
 esentation of ocean-biogeochemistry processes in the Southern Ocean is par
 ticularly important because of the key role it plays in global carbon-clim
 ate feedbacks. To date\, Earth System Models (ESMs) do not adequately reso
 lve important ocean dynamics (e.g.\, mesoscale processes)\, features that a
 re critical in Southern Ocean heat and CO2 fluxes and storage. Therefore\, h
 igh-resolution ocean biogeochemical models provide essential constraints f
 or the medium-resolution (100 km) global ESMs.\n\nThe South African ESM\, V
 rESM\, comprises globally coupled atmosphere\, ocean\, ice\, land-surface\, a
 tmospheric-chemistry\, and ocean-biogeochemistry models. Building and runn
 ing the ESM is therefore a huge task\, both scientifically and computation
 ally: several numerical models\, each discretized on a global grid\, need t
 o be integrated in space and time while passing information to each other. A
 s part of a multi-institution and multi-year goal of building South Africa
 ’s first Earth System Model\, which will be run at the CHPC\, we have bee
 n developing the ocean-biogeochemistry component of the VrESM (PISCES-SOCC
 O). BIOPERIANT12 is a critical platform in this development.\n\nWe present t
 he NEMO v3.4 regional model configuration BIOPERIANT12\, our most computat
 ionally challenging model to date\, run on CHPC’s Lengau cluster. BIOPER
 IANT12 simulates the ocean\, ice and biogeochemistry of the circumpolar So
 uthern Ocean (south of 30°S) from 1989 to 2009\, prescribed by ERA-Interi
 m atmospheric forcing. BIOPERIANT12 is high resolution at a mesoscale-reso
 lving 8 km in the horizontal\; in the vertical\, resolution ranges from 6 m a
 t the surface to 250 m at the ocean bottom over 46 levels.\n\nIn addition t
 o the technical aspect of developing the PISCES-SOCCO source code for VrES
 M\, we have to configure VrESM for an improved representation of the South
 ern Ocean. BIOPERIANT12 thus serves in multiple ways: (1) as a comparison f
 or ocean biogeochemistry in the ESM\, (2) as a large test case for ocean-b
 iogeochemical evaluation metrics for the ESM\, and (3) as an experimental p
 latform for understanding processes which influence atmosphere-ocean carbo
 n exchange in the Southern Ocean\, which additionally helps improve the ES
 M. We discuss PISCES-SOCCO development progress as well as the building an
 d evaluation of BIOPERIANT12.\n\nhttps://events.chpc.ac.za/event/47/contri
 butions/1081/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1081/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Interaction between S. mansoni Universal stress G4LZI3 protein and
  selected polyphenols: a bioinformatics investigation.
DTSTART;VALUE=DATE-TIME:20191202T123000Z
DTEND;VALUE=DATE-TIME:20191202T125000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1080@events.chpc.ac.za
DESCRIPTION:Speakers: Abidemi Paul Kappo (University of Zululand\, KwaDlan
 gezwa 3886\, South Africa)\nFor decades\, Praziquantel has been the undisp
 uted drug of choice against schistosomiasis\, a disease that affects more t
 han 200 million people in over 78 countries and is responsible for over 28
 0 000 lives lost per annum\, predominantly in sub-Saharan Africa. Concerns h
 ave been raised due to the unknown mechanism of action of the drug and rec
 urring reports of the emergence of drug-resistant strains. Moreover\, curr
 ent apprehension has been reinforced by the total dependence on a single d
 rug for treatment. Therefore\, the search for novel and effective anti-sch
 istosomal drugs becomes imperative. This study made use of bioinformatics t
 ools to determine the binding properties of a selective range of polypheno
 ls docked onto the Universal stress G4LZI3 protein\, a recently identifie
 d ‘lead’ molecule in the design of alternative treatment drugs agains
 t schistosomiasis. Schistosomes have\, over several years\, evolved mechan
 isms that include the presence of USPs to counter biotic and abiotic stres
 s. Up-regulation of the G4LZI3 protein throughout the multifaceted develop
 mental cycle of the schistosome worm sparks interest in this protein\, who
 se function is currently unknown. Ten polyphenols were docked onto the G4L
 ZI3 protein\; the best five complexes were selected for post-molecular dyn
 amics analyses and binding free energy calculations. The strongest bindin
 g interactions were observed between the G4LZI3 protein and curcumin and c
 atechin\, respectively. The major interacting residues conserved in all th
 e complexes provide a basis for further structure-based drug design of new c
 ompounds\, with enhanced inhibitory potency and toxicity against G4LZI3. T
 his study suggests an alternative approach for the development of anti-sch
 istosomal drugs using natural compounds.\n\nhttps://events.chpc.ac.za/even
 t/47/contributions/1080/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1080/
END:VEVENT
BEGIN:VEVENT
SUMMARY:High Performance Computing for Medical Interventional Planning App
 lications
DTSTART;VALUE=DATE-TIME:20191202T090000Z
DTEND;VALUE=DATE-TIME:20191202T092000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1078@events.chpc.ac.za
DESCRIPTION:Speakers: Malebogo Ngoepe (UCT)\nPersonalised solutions for he
 althcare are increasingly recognised as an important approach for treating
  a variety of conditions that have different outcomes\, based on the patie
 nt. In the field of computational mechanics\, different virtual pipelines 
 have been developed in an effort to improve interventional planning and lo
 ng-term patient outcomes. One of the major challenges to realising patient
 -specific treatment tailoring is a mismatch of timeframes. Clinical diagno
 ses and treatments need to be carried out in as short a timeframe as possi
 ble\, while traditional CFD codes tend to run over longer time periods.\n\
 nThe use of high performance computing platforms has been beneficial in th
 e development of interventional planning pipelines for cerebral aneurysm t
 hrombosis and congenital heart disease. For aneurysms\, it is important t
 o determine what type of clot will form in the aneurysm sac\, based on the
  treatment modality selected. In the case of congenital heart disease\, tr
 eatments which are selected need to be optimised to ensure that solutions 
 will remain suitable as the child grows to adulthood. This talk will explo
 re the challenges encountered in developing these two pipelines for clinic
 al use.\n\nhttps://events.chpc.ac.za/event/47/contributions/1078/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1078/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Predicting Thermodynamic Properties of Ionic Liquids—from Molecu
 lar Simulation to Machine Learning
DTSTART;VALUE=DATE-TIME:20191202T121000Z
DTEND;VALUE=DATE-TIME:20191202T123000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1049@events.chpc.ac.za
DESCRIPTION:Speakers: Gerhard Venter (University of Cape Town)\nRoom Tempe
 rature Ionic Liquids (RTILs)\, which are broadly defined as salts that are
  liquid below 100 °C\, offer an alternative to typical organic solvents. 
 Favourable properties\, such as high thermal stability\, large melting ran
 ge\, low vapour pressure (and consequently low flammability) and miscibili
 ty with both polar and nonpolar compounds\, provide the motivation for the
 se systems to become next-generation solvents for synthesis\, catalysis an
 d separation technologies. Since all ILs consist of both a cation and anio
 n\, interchange between existing and novel ions can deliver a vast set of 
 new solvent systems\, much quicker than derivatisation of regular\, molecu
 lar systems. Consequently\, the millions of potential systems for explorat
 ion make property prediction of novel ILs an exciting research opportunity
 . Coupled with the global demand to reduce waste and develop processes wit
 hin the principles of sustainable and green chemistry\, there is increasin
 g interest in moving away from conventional solvent systems.\nTo speed up 
 rational design methodologies and avoid costly synthetic routes\, cost-eff
 ective alternatives to experimental screening of potential new ILs for fav
 ourable physical and mechanical properties are needed. The estimation of t
 hermodynamic properties is an essential component of this task and the app
 lication of high-performance computing is central to achieving this. Estim
 ation methods can come in various forms but can be divided into three broa
 d categories: (1) molecular/atomistic simulation and computation\, (2) emp
 irical modelling (e.g. equations of state) or (3) quantitative structure p
 roperty relationships\, driven by machine learning approaches. The first r
 equires a classical or quantum mechanical description of molecules and a t
 heoretical framework (such as statistical mechanics) to derive macroscopic
  properties from these\; the second is immersed in classical thermodynamic
 s and exploits the relationships between state variables\; the last finds 
 patterns in data and can build linear or nonlinear functions of user-defin
 ed features to express physical properties without the need for direct\, p
 rior knowledge of these. Each comes with its own strengths and weaknesses 
 in terms of resources needed (empirical information and computational cost
 )\, domain of applicability\, accuracy and ease-of-use.\nIn this presentat
 ion\, the prediction power of molecular simulation and machine learning me
 thods as applied to ILs is put to the test. The constant pressure heat cap
 acity is chosen as the target—this property expresses the response of a 
 system’s energy to a temperature change and can be both quantified theor
 etically and measured experimentally with reasonable ease. A test set of f
 ive structurally diverse ionic liquids has been picked and their temperat
 ure dependent heat capacities calculated using classical molecular dynamic
 s (MD) simulations with various force field implementations as well as a s
 election of machine learning algorithms. In addition to discussing the acc
 uracy\, the strengths and weaknesses of the different approaches are also
  compared.\nMD simulations\, of systems consisting of 10k to 20k atoms\, w
 ere run using the AMBER code\, which has been implemented on both GPU and 
 CPU architectures. The former is highly efficient and speedups in excess o
 f 20x can be obtained (> 500 ns/day). The CPU code runs in parallel using 
 an MPI implementation that scales well up to 64 processors (system size de
 pendent). The Scikit-learn Python framework was used for machine learning 
 model development with the Keras API and TensorFlow as backend for the art
 ificial neural network models. In addition to built-in support for multit
 hreading\, where applicable\, embarrassingly parallel steps in model train
 ing and validation were optimized using the mpi4py module.\n\nhttps://even
 ts.chpc.ac.za/event/47/contributions/1049/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1049/
END:VEVENT
BEGIN:VEVENT
SUMMARY:ML approaches for HPC
DTSTART;VALUE=DATE-TIME:20191203T093000Z
DTEND;VALUE=DATE-TIME:20191203T100000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1043@events.chpc.ac.za
DESCRIPTION:Speakers: Peter Braam (University of Oxford)\nMachine Learning
  methodologies and tools have delivered new approaches to scientific compu
 ting ranging from new approximation methods to solve differential equation
 s to leveraging advantages of ML hardware over traditional HPC hardware. I
 t is not unlikely that such approaches will be helpful to computational pr
 oblems that have seen little progress for decades. We will discuss a few e
 xamples and highlight key themes in carrying this forward.\n\nhttps://eve
 nts.chpc.ac.za/event/47/contributions/1043/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1043/
END:VEVENT
BEGIN:VEVENT
SUMMARY:KEYNOTE 6:  Eliminating Weapons of Math Destruction: Next-Generati
 on Arithmetic
DTSTART;VALUE=DATE-TIME:20191204T074500Z
DTEND;VALUE=DATE-TIME:20191204T083000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1069@events.chpc.ac.za
DESCRIPTION:Speakers: John Gustafson (National University of Singapore)\nU
 sers of floating-point arithmetic (floats) have long experienced the disco
 nnect between mathematically correct answers and what a computer provide
 s. Choices made in the 1985 IEEE 754 Standard for floats lead to irreproduci
 ble results that destroy the confidence we experience\, say\, when working
  with integers. After 33 years\, language support for mandated internal fl
 ags (rounding\, overflow\, etc.) remains nil\, so float hazards are almost
  invisible. The Standard does not require correct or consistent rounding o
 f transcendental functions\, so bitwise portability of float-based program
 s is nonexistent.\n \nThe emerging posit standard is a fresh approach to c
 omputing with real numbers that is fast\, bitwise-reproducible\, and capab
 le of preserving mathematical properties like the associative and distribu
 tive laws of algebra without sacrificing performance. Complete hardware-so
 ftware stacks supporting this new kind of arithmetic are beginning to appe
 ar\, so we now have the hope of eliminating IEEE 754 "weapons of math dest
 ruction" with something much closer to the logical behavior we expect from
  computers.\n\nhttps://events.chpc.ac.za/event/47/contributions/1069/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1069/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Benchmarking off-the-shelf radio astronomy software on Lengau
DTSTART;VALUE=DATE-TIME:20191203T115000Z
DTEND;VALUE=DATE-TIME:20191203T121000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1057@events.chpc.ac.za
DESCRIPTION:Speakers: Sean February (SARAO)\nMeerKAT\, one of the world’
 s most powerful radio telescopes in operation today\, is producing science
 -ready data into the now public-facing archive at a steady rate. In this t
 alk\, we will assess the performance of off-the-shelf tools for post-proce
 ssing and imaging MeerKAT data on Lengau.\n\nhttps://events.chpc.ac.za/eve
 nt/47/contributions/1057/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1057/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Opening Session
DTSTART;VALUE=DATE-TIME:20191202T070000Z
DTEND;VALUE=DATE-TIME:20191202T074500Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1071@events.chpc.ac.za
DESCRIPTION:https://events.chpc.ac.za/event/47/contributions/1071/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1071/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Prize Giving Ceremony
DTSTART;VALUE=DATE-TIME:20191204T164500Z
DTEND;VALUE=DATE-TIME:20191204T173000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1075@events.chpc.ac.za
DESCRIPTION:https://events.chpc.ac.za/event/47/contributions/1075/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1075/
END:VEVENT
BEGIN:VEVENT
SUMMARY:The adoption of Deep Learning in Weather Forecast
DTSTART;VALUE=DATE-TIME:20191203T123000Z
DTEND;VALUE=DATE-TIME:20191203T125000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1058@events.chpc.ac.za
DESCRIPTION:Speakers: Yania Molina Souto (LNCC)\nTechnological development
 and the internet of things (IoT) have increased the data assimilation so
 urces in meteorology through satellites\, sensors\, weather stations\, solar pan
 els\, cell phones\, traffic lights\, to name a few whose number grows dail
 y. This begins to build an ideal scenario for Artificial Intelligence wher
 e the demand for data is high. This combined information generates spatio-
 temporal maps of temperature\, rainfall\, air movement\, etc.\, with high 
 precision in regions with larger data sources. Applying AI techniques in c
 onjunction with a physical understanding of the environment can substantia
 lly improve prediction skill for multiple types of high-impact weather eve
 nts.\n\nIn 2017\, the American Meteorological Society (AMS) published a pa
 per [1] with a broad summary indicating how modern AI techniques are helpi
 ng to improve insights and make decisions in weather prediction. Among the
  most used techniques are Support Vector Machines (SVM)\, regression trees
 \, k-means for radar image segmentation and traditional neural networks (A
 NN). These techniques lack temporal and spatio-temporal analysis\, typical
  of meteorological phenomena. Today\, most of the temporal analysis done o
 n meteorological data is through statistical algorithms\, such as autoregr
 essive methods. However\, the AMS recognizes that the novel Deep Learning 
 techniques could soon be the cause of new improvements and says\, “In th
 e future\, convolutional neural networks operating in a deep learning fram
 ework may reduce the need for feature engineering even further” .\n\nCom
 plementarily\, recurrent neural networks\, designed for analysis of natura
 l language processing\, are known by their results in numerical problems l
 ike temperature prediction\, among others. In order to improve temporal f
 orecasts of spatio-temporal phenomena\, such as rain and temperature\, hyb
 rid architectures have emerged that build the temporal forecast by encodin
 g the spatial pattern of the neighborhood.\n\nIn [7]\, Shi et al. formulat
 e precipitation nowcasting as a spatio-temporal sequence forecasting probl
 em in which both the input and the prediction target are spatio-temporal s
 equences. They extend the fully connected LSTM (FC-LSTM) to have convoluti
 onal structures in both the input-to-state and state-to-state transitions\
 , they propose the convolutional LSTM (ConvLSTM) and use it to build an en
 d-to-end trainable model for the precipitation nowcasting problem. Experim
 ents show that the ConvLSTM network captures spatio-temporal correlations 
 better and consistently outperforms FC-LSTM and the state-of-the-art opera
 tional ROVER algorithm for precipitation nowcasting.\n\nIn [8]\, Souto et 
 al. use a ConvLSTM architecture as a spatio-temporal ensemble approach. Th
 e channels in the convolution operator are used to input different physica
 l weather models. In this way the ConvLSTM encodes the spatial information
  which are subsequently learned by the recurrent structures of the network
 . The results show that ConvLSTM achieves superior improvements when compa
 red to traditionally used ensemble techniques such as BMA [9].\n\nAs a ma
 tter of fact\, there are a plethora of opportunities to be investigated ex
 tending the initial results we have achieved in adopting Deep Neural netwo
 rks to weather prediction. Linear and causal convolution operators (the la
 tter also known as temporal convolution)\, for instance\, have resulted i
 n deep network architectures that use convolutions to encode and decode ti
 me and space with greater precision. Raissi and colleagues [10] investigat
 e the integration of physical laws described as a set of partial different
 ial equations to the training process. By means of such integration\, the 
 training process is bound to obey the physical laws\, an approach that ha
 s been dubbed model teaching. Another area of interest is multimodal ma
 chine learning (MML) [11]. In MML\, data from different representations ar
 e complementarily used in building models\, including images\, textual da
 ta and quantitative data. This can be extremely interesting in weather fo
 recasting\, as more data is captured\, from satellite images and sensor data t
 o weather bulletins and predictive physical models.\n\nREFERENCES\n1. Amy 
 McGovern\, Kimberly L. Elmore\, David John Gagne II\, Sue Ellen Haupt\, Ch
 ristopher D. Karstens\, Ryan Lagerquist\, Travis Smith\, and John K. Willi
 ams. Using Artificial Intelligence to Improve Real-Time Decision-Making fo
 r High-Impact Weather. AMS. 2017\n2. Qi Fu\, Dan Niu\, Zengliang Zang\, Ju
 nhao Huang\, and Li Diao. Multi-Stations’ Weather Prediction Based on Hy
 brid Model Using 1D CNN and Bi-LSTM. Chinese Control Conference (CCC)\, 37
 71-3775. 2019\n3. Thomas Bolton and Laure Zanna. Applications of Deep Le
 arning to Ocean Data Inference and Subgrid Parameterization. Journal of A
 dvances in Modeling Earth Systems 11:1\, 376-399. 2019\n4. Anthony Wimmer
 s\, Christopher Velden\, and Joshua H. Cossuth. Using Deep Learning to Es
 timate Tropical Cyclone Intensity from Satellite Passive Microwave Imager
 y. Monthly Weather Review 147:6\, 2261-2282.
  2019\n5. S. Scher. Toward Data-Driven Weather and Climate Forecasting: Ap
 proximating a Simple General Circulation Model With Deep Learning. Geophys
 ical Research Letters 45:22\, 12\,616-12\,622. 2018\n6. Brian S. Freeman\,
  Graham Taylor\, Bahram Gharabaghi and Jesse Thé. Forecasting air quality
  time series using deep learning. Journal of the Air & Waste Management As
 sociation\, Volume 08\, Issue 8 \, Pages 866-886\, 2018\n7. Xingjian Shi\,
  Zhourong Chen\, and Hao Wang. Convolutional LSTM Network : A Machine Lear
 ning Approach for Precipitation Nowcasting. arXiv\, pages 1–11\, 2015.\n
 8. Yania Molina Souto\, Fábio Porto \, Ana Maria de Carvalho Moura \, Edu
 ardo Bezerra : A Spatiotemporal Ensemble Approach to Rainfall Forecasting.
  IJCNN \, pages 1-8. 2018\n9. Raftery\, A. E.\, Gneiting\, T.\, Balabdaoui
 \, F.\, and Polakowski\, M.: Using Bayesian model averaging to calibrate f
 orecast ensembles\, Mon. Weather Rev.\, 133\, 1155–1174\, 2005.\n10. Ra
 issi\, M.\, Perdikaris\, P.\, Karniadakis\, G.E.\, Physics-informed neural n
 etwork: A deep learning framework for solving forward and inverse problems
  involving nonlinear partial differential equations. Journal of Computatio
 nal Physics\, 378\, pp.686-707\, 2019.\n11.Tadas Baltrusaitis\, Chaitanya 
 Ahuja\, and Louis-Philippe Morency\, MultiModal Machine Learning: A Survey
  and Taxonomy\, https://arxiv.org/pdf/1705.09406.pdf\n\nhttps://events.chp
 c.ac.za/event/47/contributions/1058/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1058/
END:VEVENT
BEGIN:VEVENT
SUMMARY:The research benefits of improving equity\, diversity and inclusio
 n
DTSTART;VALUE=DATE-TIME:20191203T113000Z
DTEND;VALUE=DATE-TIME:20191203T115000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1056@events.chpc.ac.za
DESCRIPTION:Speakers: Toni Collis (WHPC & Collis-Holmes Innovations)\nEqui
 ty\, diversity and inclusion seem to be the magic words of the 21st Centur
 y. We’ve all been told they are important\, but for many\, it is nice-to
 -have but difficult-to-implement. As there is an increasing shortage of hi
 ghly skilled technology specialists and an increasingly large array of app
 lications\, it has never been more important to implement workable equity\
 , diversity and inclusion practices to ensure that we can attract and reta
 in talent.\n\nThis session will discuss the benefits of diversity and wha
 t we all need to be doing to ensure our community benefits. Discussions wi
 ll include the work being carried out by Women in High Performance Computi
 ng to diversify the international supercomputing community and what the Af
 rican HPC community can learn from this.\n\nhttps://events.chpc.ac.za/even
 t/47/contributions/1056/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1056/
END:VEVENT
BEGIN:VEVENT
SUMMARY:MeerKAT archive and data access
DTSTART;VALUE=DATE-TIME:20191202T093000Z
DTEND;VALUE=DATE-TIME:20191202T100000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-959@events.chpc.ac.za
DESCRIPTION:Speakers: Martin Slabber (SARAO)\nThe MeerKAT archive was mad
 e accessible from the internet earlier this year.\nThis allows researcher
 s from across the world to pull data from the MeerKAT archive.\nIn this t
 alk I'll describe the MeerKAT storage system\, which consists of several p
 etabytes of observations backed by a Ceph distributed storage system. The u
 se of Ceph at SARAO and the infrastructure around the archive will be desc
 ribed. Thereafter\, the data access methods from the internet and from par
 tner institutes like the CHPC will be presented.\n\nhttps://events.chpc.ac
 .za/event/47/contributions/959/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/959/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Computational Fluid Dynamics: from solar thermal receivers and col
 lectors to sub-antarctic island wind simulation
DTSTART;VALUE=DATE-TIME:20191202T115000Z
DTEND;VALUE=DATE-TIME:20191202T121000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1048@events.chpc.ac.za
DESCRIPTION:Speakers: Ken Craig (University of Pretoria)\nThe parallel clu
 ster at the CHPC is used to provide speed-up on three fronts in the Comput
 ational Fluid Dynamics (CFD) simulations discussed. The first is for optim
 ization of the parameterized geometry of a swirling jet impingement solar 
 thermal receiver where many runs of fairly large Large Eddy Simulation CFD
  models are required. The second problem that benefits from the massively 
 parallel approach\, is that of a transient simulation of the atmospheric b
 oundary layer turbulent flow field around a heliostat with a very small ti
 me step\, requiring many time steps for a meaningful time series. This sim
 ulation is required to determine peak loads and perform the fluid-structur
e interaction of such a solar collector. The last type is for large models
  of wind flow over Marion Island containing close to a hundred million
  computational cells. These models are used to predict wind patterns
  that affect
  plant and bird life\, especially as influenced by continued climate chang
 e.\n\nhttps://events.chpc.ac.za/event/47/contributions/1048/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1048/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Design of selective reagents and functional materials: A theoretic
 al and experimental approach
DTSTART;VALUE=DATE-TIME:20191203T090000Z
DTEND;VALUE=DATE-TIME:20191203T092000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1052@events.chpc.ac.za
DESCRIPTION:Speakers: Zenixole Tshentu (Nelson Mandela University)\nThe wo
 rk illustrates the integration of basic and applied chemistry in the devel
 opment and application of selective materials for application in desulfuri
 zation and denitrogenation of fuel as well as in separation of precious me
 tals. The fuel chemistry study is important from the point of view of the 
 need to drive towards a zero sulfur fuel as mandated by environmental prot
 ection agencies in many countries around the world.1 Challenges exist with
  the current hydrodesulfurization and hydrodenitrogenation processes that 
 are being applied in refineries as they fail to achieve the requisite fuel
  standards. The second application of functional materials is in separatio
 n of important metals. The demand for precious metals is driven by their i
 mportant applications\, and the development of better separating reagents/
 materials has become important given that the quality of ores is decreasin
 g\, and better recovery rates of the metals from secondary sources (such a
 s electronic boards and catalytic converters) will be required in future. 
 This necessitates improvement of the current chemistry in order to process
  the new feeds. \n\nExperimental and theoretical studies were carried out 
 during the development of the functional chemistry for recognition of targ
 et metals and organic compounds.  The selective chemistry towards fuel con
 taminants such as organosulfur and organonitrogen compounds has been devel
oped\, and the results are promising as the best material (polybenzimida
 zole nanofibers) achieves sulfur removal down to less than 2 ppm.2 A pro
 cess involving conversion of sulfur compounds to organosulfone compounds
 3 has been developed\, followed by removal of the polar sulfones using s
 elective materi
 als.2 The approach for materials development for metal ions\, such as plat
 inum group metals (PGMs)\, follows the development of reagents that are sp
 ecific for metal ion chlorido complexes of interest. The innovation of the
  aforesaid reagents undoubtedly requires a design strategy that considers 
 both the electronic and stereochemical requirements of the target anion. T
 hrough a combination of molecular modelling techniques and experimental te
 chniques\, we have been able to derive factors that lead to successful sep
arations. Cations as anion receptors specific for [IrCl6]2⁻ and [PtCl6]
 2⁻ will be presented\, as well as selective chemistry for organosulfur
  and o
 rganonitrogen compounds in fuel. Binding energies and other thermodynamic 
 parameters have been calculated in silico to explain the chemistry involve
 d.\n\nReferences: \n1.	Barbara\, P.\; Rufino\, M.N.\; Campos-Martin\, J.M.
 \; Fierro\, J.L.G.\; Catal. Sci. Technol.\, 2011\, 1\, 23-42.  \n2.	Ogunla
 ja\, A.S.\; du Sautoy\, C.\; Torto\, N.\; Tshentu\, Z.R.\; Talanta\, 2014\
 , 126\, 61–72.\n3.	Ogunlaja\, A.S.\; Chidawanyika\, W.\; Antunes\, E.\; 
 Fernandes\, M.A.\; Nyokong\, T.\; Torto\, N.\; Tshentu\, Z.R.\; Dalton Tra
 ns\, 2012\, 41\, 13908-13918.\n\nhttps://events.chpc.ac.za/event/47/contri
 butions/1052/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1052/
END:VEVENT
BEGIN:VEVENT
SUMMARY:SAVIME - Simulation Analysis and Visualization in-Memory
DTSTART;VALUE=DATE-TIME:20191203T092000Z
DTEND;VALUE=DATE-TIME:20191203T094000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1053@events.chpc.ac.za
DESCRIPTION:Speakers: Fabio Porto (LNCC)\nThe increasing computational pow
 er of HPC systems fosters the\ndevelopment of complex numerical simulation
 s of phenomena in different\ndomains\, such as medicine [1]\, Oil & Gas [2
 ] and many other fields [3\,4\,5].\nIn such applications\, a huge amount o
f data in the form of multidimensional\narrays is produced and needs to
  be analyzed and visualized\, enabling\nresearchers to gain insights abo
 ut the p
 henomena being studied.\n\nScientists also generate huge multidimensional 
 arrays through\nenvironmental observations\, measurements of physical cond
 itions and\nother types of sensors. For instance\, satellite data for Eart
 h's weather\,\noceans\, atmosphere and land [6] are kept in the form of mu
 ltidimensional\narrays in scientific file formats. Data collected by senso
 rs in physics\nexperiments\, such as the ones conducted in the photon stud
 ies by SLAC\nNational Accelerator Laboratory [7]\, are also represented an
 d processed in\nthe form of multidimensional arrays.\n\nMachine learning i
 s another context in which multidimensional arrays are\npresent. They are 
 the basic input format for the heavily optimized linear\nalgebra algorithm
 s implemented in deep learning frameworks\, such as:\nTensorFlow\, Keras a
 nd Torch. Deep Learning algorithms were able to\nachieve superhuman perfor
 mance for image recognition problems in the\npast few years [8]\, and they
 are among the most promising alternatives for\ntackling difficult probl
 ems
  in Natural Language Processing\, Image and\nVideo Recognition\, Medical I
 mage Analysis\, Recommendation Systems\nand many others. Thus\, managing t
 hese large arrays in the context of deep\nlearning is a very important tas
 k.\n\nThe traditional approach for managing data in multidimensional array
 s in\nscientific experiments is to store them using file formats\, such as
  netCDF\nand HDF5. The use of file formats\, and not a database management
 \nsystems (DBMS)\, in storing scientific data has been the traditional cho
 ice\ndue to the fact that DBMSs are considered inadequate for scientific d
 ata\nmanagement. Even specialized scientific data management systems\, suc
 h\nas SciDB [10]\, are not very well accepted for a myriad of reasons list
 ed in\n[11\, such as]:\n\n● the impedance mismatch problem [12\,13]\, th
 at makes the process of\ningesting data into a DBMS very slow.\n● the in
ability to directly access data from visualization tools like\nParaView
  Catalyst [14] and indexing facilities like FastQuery [13].\n● the ina
 bilit
 y to directly access data from custom code\, which is\nnecessary for domai
 n specific optimized data analysis.\n\nHowever\, by completely dismissing 
DBMSs\, some nice features also\nbecome unavailable\, including out-of-t
 he-box parallel\ndeclarative data processing with query languages and qu
 ery\noptimization\, and management of dense and sparse
  matrices. In this talk\,\nwe will present SAVIME\, a Database Management 
 System for Simulation\nAnalysis and Visualization in-Memory. SAVIME implem
 ents a\nmulti-dimensional array data model and a functional query language
 . The\nsystem is extensible to support data analytics requirements of nume
 rical\nsimulation applications.\n\nReferences\n[1] Pablo J. Blanco\, M. R.
  Pivello\, S. A. Urquiza\, and Raúl A. Feijóo. 2009.\nOn the potentialit
 ies of 3D-1D coupled models in hemodynamics\nsimulations.Journal of Biomec
 hanics 42\, 7 (2009)\, 919–930.\n[2] Cong Tan et al. 2017. CFD Analysis 
 of Gas Diffusion and Ventilation\nProtection in Municipal Pipe Tunnel. In 
 Proceedings of the 2017\nInternational Conference on Data Mining\, Communi
 cations and Information\nTechnology (DMCIT ’17). ACM\, New York\, NY\, U
SA\, Article 28\, 6 pages.\nhttps://doi.org/10.1145/3089871.3089904\n[3] 
 Igor Adamovich et al. 2015. Kinetic mechanism of molecular energy\ntransfe
 r and chemical reactions in low-temperature air-fuel plasmas.\nPhilosophic
 al transactions. Series A\, Mathematical\, physical\, and\nengineering sci
 ences 373 (08 2015).\nhttps://doi.org/10.1098/rsta.2014.0336\n[4] Mollona\
 , Edoardo\, Computer simulation in social sciences\, Journal of\nManagemen
 t & Governance\, May\, 2008\, N(2) V(12).\n[5] Mitsuo Yokokawa\, Ken’ich
 i Itakura\, Atsuya Uno\, Takashi Ishihara\, and\nYukio Kaneda. 2002. 16.4T
flops Direct Numerical Simulation of\nTurbulence by a Fourier Spectral Me
 thod on the Earth Simulator. In\nProceedings of the 2002 ACM/IEEE Conferen
 ce on Supercomputing (SC\n’02). IEEE Computer Society Press\, Los Alamit
 os\, CA\, USA\, 1–17.\nhttp://dl.acm.org/citation.cfm?id=762761.762808 .
 \n[6] R. Ullman and M. Denning. 2012. HDF5 for NPP sensor and\nenvironment
 al data records. In 2012 IEEE International Geoscience and\nRemote Sensing
Symposium. 1100–110\n[7] Jack Becla\, Daniel Wang\, and Kian-Tat Lim. 2
 011. Using SciDB to\nSupport Photon Science Data Analysis. (01 2011).\n[8]
  Dan Ciresan\, Alessandro Giusti\, Luca M. Gambardella\, and Jürgen\nSch
 midhuber. 2012. Deep Neural Networks Segment Neuronal\nMembranes in Electr
 on Microscopy Images. In Advances in Neural\nInformation Processing System
 s 25 \, F. Pereira\, C. J. C. Burges\, L. Bottou\,\nand K. Q. Weinberger (
 Eds.). Curran Associates\, Inc.\, 2843–2851.\n[9] Z Zhao. 2014. Automati
c library tracking database at NERSC. (2014).\nhttps://www.nersc.gov/ass
 ets/altdatNERSC.pdf\n[10] Paradigm4. 2019. SciDB. http://www.paradigm4.com/ 
 [Online\;\naccessed 01-sep-2019].\n[11] Haoyuan Xing\, Sofoklis Floratos\,
  Spyros Blanas\, Suren Byna\, Prabhat\,\nKesheng Wu\, and Paul Brown. 2017
 . ArrayBridge: Interweaving declarative\narray processing with high-perfor
 mance computing. arXiv e-prints \, Article\narXiv:1702.08327 (Feb 2017).\n
 [12] Spyros Blanas\, Kesheng Wu\, Surendra Byna\, Bin Dong\, and Arie\nSho
 shani. 2014. Parallel Data Analysis Directly on Scientific File Formats.\n
 In Proceedings of the 2014 ACM SIGMOD International Conference on\nManagem
 ent of Data (SIGMOD ’14) . ACM\, New York\, NY\, USA\, 385–396.\n[13] 
 Luke Gosink\, John Shalf\, Kurt Stockinger\, Kesheng Wu\, and Wes\nBethel.
  2006. HDF5-FastQuery: Accelerating Complex Queries on HDF\nDatasets Using
  Fast Bitmap Indices. In Proceedings of the\n18thInternational Conference 
 on Scientific and Statistical Database\nManagement (SSDBM ’06). IEEE Com
 puter Society\, Washington\, DC\,\nUSA\, 149–158\n[14] Utkarsh Ayachit\,
  Andrew Bauer\, Berk Geveci\, Patrick O’Leary\,\nKenneth Moreland\, Nath
 an Fabian\,and Jeffrey Mauldin. 2015. ParaView\nCatalyst: Enabling In Situ
  Data Analysis and Visualization. In Proceedings\nof the First Workshop on
  In Situ Infrastructures for Enabling Extreme-Scale\nAnalysis and Visualiz
 ation (ISAV2015) . ACM\, New York\, NY\, USA\, 25–29\n\nhttps://events.c
 hpc.ac.za/event/47/contributions/1053/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1053/
END:VEVENT
BEGIN:VEVENT
SUMMARY:High Performance Data Access: the Hermes approach
DTSTART;VALUE=DATE-TIME:20191202T090000Z
DTEND;VALUE=DATE-TIME:20191202T093000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-956@events.chpc.ac.za
DESCRIPTION:Speakers: Xian-He Sun ()\nHigh-performance computing (HPC) app
 lications generate massive amounts of data. However\, the performance impr
 ovement of disk-based storage systems has been much slower than that of me
 mory\, creating a significant I/O performance gap. To reduce the performan
 ce gap\, storage subsystems are under extensive changes\, adopting new tec
 hnologies and adding more layers into the memory/storage hierarchy. With a
  deeper memory hierarchy\, the data movement complexity of memory systems 
 is increased significantly\, making it harder to utilize the potential of 
 the deep memory-storage hierarchy (DMSH) architecture. In this talk\, we p
 resent the development of Hermes\, an intelligent\, multi-tiered\, dynamic
 \, and distributed I/O caching system that utilizes DMSH to significantly 
 accelerate I/O performance. Hermes is a US NSF supported large software de
velopment project. It extends HPC I/O stacks to integrate memory and para
 llel I/O systems\, extends the widely used Hierarchical Data Format (HDF) 
 and HDF5 library to achieve application-aware optimization in a DMSH envir
 onment\, and enhances caching systems to support vertical and horizontal n
 on-inclusive caching in a distributed parallel I/O environment. We will in
troduce Hermes’ design and implementation\; discuss its uniqueness a
 nd challenges\; and present some initial implementation results.\n\nhttps:
 //events.chpc.ac.za/event/47/contributions/956/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/956/
END:VEVENT
BEGIN:VEVENT
SUMMARY:PETSc: High-Performance Software Library for Engineering and Scien
 ce Simulation
DTSTART;VALUE=DATE-TIME:20191202T120000Z
DTEND;VALUE=DATE-TIME:20191202T123000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-958@events.chpc.ac.za
DESCRIPTION:Speakers: Hong Zhang (Mathematics and Computer Science Divisio
 n Argonne National Laboratory\, U.S.)\nPortable\, Extensible Toolkit for S
 cientific Computation (PETSc) is a suite of\ndata structures and routines 
 for the scalable (parallel) solution of scientific applications. Due to it
 s solid mathematical grounding\, careful software design\, and most import
 antly\, evolution resulting from the usage of many users in various applic
 ation areas\, PETSc is enabling engineers and scientists to solve large sc
 ale problems\, with previously unreachable resolution\, in areas as divers
 e as groundwater contamination\, cardiology\, fusion\, nuclear energy\, as
 tro-physics\, and climate change.\n\nAs a PETSc developer\, I will give an
overview of PETSc and briefly introduce its basic use in algorithmi
 c research\, numerical production simulation and parallel performance eval
 uation. As an example\, I will present our recent simulation of the U.S. r
 iver systems on extreme-scale computers.\n\nhttps://events.chpc.ac.za/even
 t/47/contributions/958/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/958/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Updating the Outdated Storage Paradigm to Handle Complex Computati
 onal Workloads
DTSTART;VALUE=DATE-TIME:20191202T100000Z
DTEND;VALUE=DATE-TIME:20191202T103000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-960@events.chpc.ac.za
DESCRIPTION:Speakers: Craig Bungay (Spectra Logic)\nAccelerating discovery
  in computational science and high performance computing environments requ
 ires compute\, network and storage to keep pace with technological innovat
 ions. Within a single organization\, interdepartmental and multi-site shar
 ing of assets has become more and more crucial to success. Furthermore\, a
 s the growth of data is constantly expanding\, storage workflows are excee
 ding the capabilities of the traditional filesystem. For most organization
 s\, facing the challenge of managing terabytes\, petabytes and even exabyt
 es of archive data for the first time can force the redesign of their enti
 re storage strategy and infrastructure. Increasing scale\, level of collab
 oration and diversity of workflows are driving users toward a new model fo
 r data storage. \n\nIn the past\, data storage usage was defined by the te
 chnology leveraged to protect data using a pyramid structure\, with the to
p of the pyramid designated for SSD to store ‘hot’ data\, SATA HDDs
  used to store ‘warm’ data and tape used for the bottom of the pyramid
  to archive ‘cold’ data. Today\, modern data centers have moved to a n
 ew two-tier storage architecture that replaces the aging pyramid model. Th
 e new two-tier paradigm focuses on the actual usage of data\, rather than 
 the technology on which it resides. The new two-tier paradigm combines a p
 roject tier that is file-based and a second or perpetual tier which is obj
 ect based. The object based perpetual tier includes multiple storage media
  types\, multi-site replication (sharing)\, cloud\, and data management wo
 rkflows. Data moves seamlessly between the two tiers as data is manipulate
 d\, analyzed\, shared and protected – essentially creating yin and yang 
 between the two storage tiers. Solutions designed to natively use the Perp
 etual Tier empower organizations to fully leverage their primary storage i
 nvestments by reducing the overall strain on the Primary Tier\, while at t
 he same time\, enabling data centers to realize numerous benefits of the P
 erpetual Tier that only increase as the amount of storage to manage increa
 ses.\n\nThe next logical question is how to manage data between the two ti
 ers while maintaining user access and lowering overall administration burd
 ens. Join us for a deeper look into the nuances of the two-tier system and
  data management between them. We will cover storage management software o
 ptions\; cloud vs. on-premise decisions\; and using object storage to expa
 nd data access and create a highly effective storage architecture to break
 through data lifecycle management barriers.\n\nhttps://events.chpc.ac.za/e
 vent/47/contributions/960/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/960/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Designing Reusable Composable Components for the (HPC) I/O Stack
DTSTART;VALUE=DATE-TIME:20191202T113000Z
DTEND;VALUE=DATE-TIME:20191202T120000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-957@events.chpc.ac.za
DESCRIPTION:Speakers: Ali Butt (Virginia Tech)\nThe rise of AI/ML in HPC a
 pplications is also driving the need for suitable storage abstractions suc
 h as the key-value (KV) stores. These abstractions pose new challenges for
  the HPC I/O stack. Enterprise KV stores are not well suited for HPC appli
 cations\, and entail customization and cumbersome end-to-end KV design to 
meet the applications’ needs. To this end\, I will present BESPOKV\, an 
 adaptive\, extensible\, and scale-out KV store framework. BESPOKV decouple
 s the KV store design into the control plane for distributed management an
 d the data plane for local data store. BESPOKV takes as input a single-ser
 ver KV store\, called a datalet\, and transparently enables a scalable and
  fault-tolerant distributed KV store service. The resulting distributed st
 ores are also adaptive to consistency or topology requirement changes and 
 can be easily extended for new types of services. I’ll show that BESPOKV
-enabled distributed KV stores scale horizontally to a large number of n
 odes\, and perform comparably to and sometimes better than state-of-the-a
 rt systems.\n\nhttps://events.chpc.ac.za/event/47/contributions/957/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/957/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Smoke and ventilation simulation using the CHPC
DTSTART;VALUE=DATE-TIME:20191204T092000Z
DTEND;VALUE=DATE-TIME:20191204T094000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1061@events.chpc.ac.za
DESCRIPTION:Speakers: Kenneth Allen (Greenplan Consultants)\nKenneth Allen
  - Greenplan Consultants - kennethguyallen@greenplan.co.za\n\nGreenpl
 an Consultants undertook a combined smoke\, ventilation and wind study for
  a basement/underground parking area of approximately 27 000 m$^2$. This p
 resentation gives an overview of the project and the experience of running
  it at the CHPC.  In order to reduce the size of the transient model\, the
  full domain (about 0.25 km$^3$) was modelled under steady-state condition
s using OpenFOAM. The wind flow patterns around the basement were then imp
 osed as boundaries on the transient model of the basement. Even with this 
 approach\, the transient model was too large to simulate on a small office
  network\, so high performance computing (HPC) was essential. \n\nFire Dyn
amics Simulator (FDS) 6.7.0 was used for the transient simulations. It is
  purpose-written for simulating fire and smoke\, and uses the computationa
 lly intensive Large Eddy Simulation (LES) method. The FDS model CHPC requi
 rements were as follows - RAM: ≈ 100 GB\; Nodes used: 10-15\; Cores used
 : 240-360\; Simulated time per case: ≈ 1-2 min\; Wall time per case: ≈
  20-60 hours\; Total cells: max. 70-80 million\; Cell size: 100 mm\; Data 
 output per simulation: 200-250 GB. \n\nFDS can make use of OpenMP (Multi-P
 rocessing) and MPI (Message Passing Interface). Tests on the CHPC with Ope
 nMP enabled showed little-to-no improvement\, so OpenMP was set to 1 (disa
 bled). An experiment was made whereby all cores per node were booked on PB
 S but only half were used to run MPI process. This did not give better per
 formance – possibly because of ghost processes running on one or two of 
 the cores. Subsequently\, all models were run with 1 MPI process per core\
 , with 24 MPI processes per node. This seemed to be the best option for co
 st-effective performance. \n\nFDS makes use of manually-specified rectilin
 ear meshes. Unfortunately\, where parallel simulation is desired\, each MP
 I process requires at least one mesh\, which means the mesh domain must be
  manually split up into sub-meshes. Any mismatch in mesh size leads to une
 ven loading on the cores\, which means that the less heavily loaded cores 
 have to wait. Due to the rectilinear grid\, there are also “wasted” ce
 lls in walls/floors/roofs which have no function but contribute to the com
 putational load. Thus\, although the initial CSIR scaling tests on the CHP
 C (6 million cells) were promising for cell counts as low as 30 000 cells 
 per core\, scaling tended to be less efficient than expected. \n\nOn a num
 ber of occasions the simulations progressed at different speeds despite ha
 ving the same configuration\, flow speeds\, and boundary conditions. It is
  possible that this might have been caused by ghost processes on individua
 l cores and the CHPC architecture effect – in particular\, the blocking 
 ratio between racks. As an experiment on our final model geometry\, we red
 uced the number of nodes in use from 15 (360 cores) to 10 (240 cores). The
  cell count per core was a factor of 2.5 higher\, the area modelled was la
 rger\, and there were more jet fans and extraction fans than in the previo
 us model. Despite this\, the CHPC wall time required per unit simulated ti
 me increased by a factor of only 2. Rigorous testing is necessary before a
 ny conclusions are drawn\, as a rough test like this does not provide suff
 icient data and there might be factors not taken into account.  While the 
 above meant that simulations did not run as fast as desired\, they ran far
  faster than they would have on a small office network (it would have take
 n years\, if they ran at all). Greenplan had a good experience with the CH
 PC and are keen to use it for future projects of this nature.\n\nhttps://e
 vents.chpc.ac.za/event/47/contributions/1061/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1061/
END:VEVENT
BEGIN:VEVENT
SUMMARY:High throughput in silico screening for tailored catalytic reactiv
 ity and selectivity
DTSTART;VALUE=DATE-TIME:20191204T090000Z
DTEND;VALUE=DATE-TIME:20191204T092000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1060@events.chpc.ac.za
DESCRIPTION:Speakers: Caroline M. Krauter (Schrödinger GmbH)\nFirst-princ
 iples simulation has become a reliable tool for the prediction of structur
 es\, chemical mechanisms\, and reaction energetics for the fundamental ste
 ps in homogeneous and heterogeneous catalysis. Details of reaction coordin
 ates for competing pathways can be elucidated to provide the fundamental u
 nderstanding of observed catalytic activity\, selectivity\, and specificit
 y. Such predictive capability raises the possibility for computational dis
 covery and design of new catalysts with enhanced properties.\n\nIn the cas
e of microporous materials like zeolites\, the well-defined pore structures
  and adjustable reactivity centers in the pore walls allow for efficient c
 ontrol of the catalytic properties. In addition to the reactivity at the c
 atalytic center\, the mobility of the reaction components throughout the n
 etwork structure is crucial to the design. In this contribution we will us
 e GPU-accelerated molecular dynamics simulations to study the diffusion of
  small molecules through zeolite structures.\n\nhttps://events.chpc.ac.za/
 event/47/contributions/1060/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1060/
END:VEVENT
BEGIN:VEVENT
SUMMARY:NEXTGenIO: exploring the potential of non-volatile memory in HPC
DTSTART;VALUE=DATE-TIME:20191202T123000Z
DTEND;VALUE=DATE-TIME:20191202T130000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-961@events.chpc.ac.za
DESCRIPTION:Speakers: David Homan (EPCC)\nMemory and storage read and writ
 e costs can lead to a significant loss of time and energy in current HPC s
 ystems. Byte-addressable non-volatile memory (NVM) could provide considera
 ble improvements in both time and energy requirements over conventional DR
 AM memory. Using Optane DCPMM\, Intel's new byte-addressable and persisten
 t memory\, the NEXTGenIO project investigated the performance of NVRAM by 
 designing\, building and testing a bespoke prototype NVM system. The main 
 goal of the project was to explore the potential of NVRAM in overcoming pe
 rformance bottlenecks in I/O and main memory\, which are considered signif
 icant barriers to Exascale computing.\n\nIn this talk we will give a brief
  overview of the NEXTGenIO system (192GB DRAM and 3TB of NVM per dual sock
 et node)\, and the various NVRAM usage modes. The results from a number of
  investigative test cases run on the NEXTGenIO prototype system will be pr
 esented. In particular we will discuss I/O performance\, run-time\, and en
 ergy consumption for applications with large I/O demands\, such as OpenFOA
 M and CASTEP. Comparison of the results from NVRAM and DRAM shows that NVR
 AM can indeed provide significant improvement in both performance and ener
 gy consumption.\n\nhttps://events.chpc.ac.za/event/47/contributions/961/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/961/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Evaluating the Compressibility of Elevation Data using Space-Filli
 ng Curves
DTSTART;VALUE=DATE-TIME:20191203T100000Z
DTEND;VALUE=DATE-TIME:20191203T103000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1150@events.chpc.ac.za
DESCRIPTION:Speakers: Conrad Haupt (University of the Witwatersrand)\nTwo 
 of the typical points of interest with elevation data\, or Geographic Info
 rmation Systems\n(GIS) data in general\, are storage and query costs. The 
 former is typically addressed by\nintegrating standard compression schemes
  into already existing storage mechanisms\, such\nas GZIP in HDF5. Space-F
 illing Curves (SFCs) have already been used to reduce access\ntime for spa
 tial operations on point and polygon data. In this research\, we evaluate 
 the effect\nof using SFCs as a pre-processing step for standard compressio
 n schemes on elevation\ndata. We break up common compression tools into th
 eir base algorithms and identify\ncanonical SFCs from the literature (for 
 example\, the Hilbert curve).\n\nWe use 1-arcsecond resolution elevation m
 aps from the Shuttle Radio Topographic Mission\n(SRTM) as the comparative 
 data-set upon which we apply all combinations of SFCs and\ncompression sch
 emes. The SFCs\, in most cases\, neither significantly improve nor worsen\
 ncompression ratios when compared to non-preprocessed results. However\, w
 e show that\ncertain pre-processing steps improve the compression performa
 nce of otherwise ineffective\ncompression techniques. This research shows 
 the potential for future work on compression\nschemes which allow for in-p
 lace search and modifications without the loss of compression\nperformance
 . Another application is to apply these techniques to astronomical data fr
om the\nSquare Kilometre Array\, a major scientific and engineering projec
 t in South Africa\, for which\nsome preliminary results have been attained
 .\n\nhttps://events.chpc.ac.za/event/47/contributions/1150/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1150/
END:VEVENT
BEGIN:VEVENT
SUMMARY:KEYNOTE 1: Digital Transformation: Issues of the Public Service
DTSTART;VALUE=DATE-TIME:20191202T074500Z
DTEND;VALUE=DATE-TIME:20191202T083000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1169@events.chpc.ac.za
DESCRIPTION:Speakers: Mandla Ngcobo ()\nGenerally\, a lot of awareness has
  been created around Digital Transformation and the Fourth Industrial Re
 volution (4IR)\, with various stakeholders electing to focus on matters
  of interest to them. However\, there are known challenges and requireme
 nts for the public service to be able to transform digitally. The presen
 tation will discuss the challenges (people\, process and technology issu
 es) that require interventions to achieve a digitally transformed public
  service.\n\nhttps://events.chpc.ac.za/event/47/contributions/1169/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1169/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Research highlights from the Molecular Bio-computations and Drug D
 esign Research Group at UKZN
DTSTART;VALUE=DATE-TIME:20191202T113000Z
DTEND;VALUE=DATE-TIME:20191202T115000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1165@events.chpc.ac.za
DESCRIPTION:Speakers: Mahmoud Soliman (Molecular Bio-computation and Dru
 g Design Laboratory\, School of Health Sciences\, University of  KwaZulu-N
 atal\, Westville Campus\, Durban 4001\, South Africa)\nProf Mahmoud Solima
 n (http://soliman.ukzn.ac.za/) will provide a presentation that highlights
  the various research scopes and outcomes of research that has been cond
 ucted in his research group at UKZN over the last 8 years\, with emphasis
  on the applications of computational simulations and the contribution of 
 CHPC in drug design and discovery as well as capacity development. A few s
 elected research topics will be presented such as: the irony of chirality\
 ; covalent drug inhibition\; does drug size matter\; and other topics.\n\n
 https://events.chpc.ac.za/event/47/contributions/1165/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1165/
END:VEVENT
BEGIN:VEVENT
SUMMARY:More than meets the eye: Towards an Artificial Intelligence Observ
 atory
DTSTART;VALUE=DATE-TIME:20191204T090000Z
DTEND;VALUE=DATE-TIME:20191204T092000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1162@events.chpc.ac.za
DESCRIPTION:Speakers: Matias Carrasco Kind (NCSA)\nFile attached\n\nhttps:
 //events.chpc.ac.za/event/47/contributions/1162/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1162/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Cloud Computing: A Solution to Internet Services Downtime at Namibi
 an Institutions of Higher Learning
DTSTART;VALUE=DATE-TIME:20191203T090000Z
DTEND;VALUE=DATE-TIME:20191203T092500Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1148@events.chpc.ac.za
DESCRIPTION:Speakers: Shadreck Chitauro (Namibia University of Science and
  Technology)\nE-learning has been adopted by many institutions as a means 
 by which to foster an effective teaching and learning environment. Namibia
 n institutions of higher learning have not been left behind and are also u
 sing e-learning systems. The use of e-learning brings many benefits\, suc
 h as allowing students to access learning material anywhere and at any ti
 me\, but its implementation has drawbacks to which cloud computing may be
  the answer. Furthermore\, higher learning institutions run their own IT
  systems and buy their own IT infrastructure\, which has implications fo
 r institutions’ overall budgets. Data on issues with overall IT operatio
 ns\, not just e-learning systems\, was obtained from a case site at one N
 amibian institution of higher learning. The study showed that the instit
 ution’s IT systems have problems that cloud computing could solve.\n\nKe
 ywords: Cloud computing\, e-learning\, education\, downtime\n\nhttps://e
 vents.chpc.ac.za/event/47/contributions/1148/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1148/
END:VEVENT
BEGIN:VEVENT
SUMMARY:KEYNOTE 3 (SA NREN): eInfrastructures as enablers for global scien
 ce and education
DTSTART;VALUE=DATE-TIME:20191203T070000Z
DTEND;VALUE=DATE-TIME:20191203T074500Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1139@events.chpc.ac.za
DESCRIPTION:Speakers: Josva Kleist (NORDUnet A/S)\nScience and education a
 re increasingly becoming global endeavors. Scientists collaborate internat
 ionally and questionnaires indicate that scientists today are more likely 
 to collaborate with partners outside their home institutions than with loc
 al research fellows. Students no longer need to physically be on campus to
  take classes\, as they follow courses remotely and can even take a degre
 e at a foreign university without leaving home. Furthermore\, there is a
  growing number of international research infrastructures\, some central
 ized and some highly distributed\, in all areas of science. The construc
 tion of those infrastructures is driven partly by financial necessity\,
  but also by a need to bring together the necessary competences. Moreove
 r\, the paradigms of Open Science and Citizen Science foster an environm
 ent of sharing and inclusion. Collaboration is no longer just a matter o
 f scientists visiting for weeks and months.\n \nFor these reasons\, acce
 ss to fac
 ilities\, research infrastructures\, and data must be global\, universal\,
  fine-grained\, and instant. As eInfrastructure providers\, R&E Network
  organizations are one of the fundamental building blocks supporting today
 ’s global science\, and just as scientific activities are becoming more 
 and more global\, we need to think global. With the network\, we already s
 ee this happening\, where leading NRENs\, such as the South African NREN\,
  participate in the Global Network Advancement Group\, which works on the
  intercontinental aspects of the Global R&E Network (GREN). Moreover\, n
 etwo
 rk services such as eduroam\, eduGAIN\, and eduVPN take a truly global app
 roach.  The infrastructure needed to support Open Science and Citizen Scie
 nce drives us to break down silos between storage\, compute\, and network 
 infrastructures. We need to think about it in its entirety. The same goes 
 for some of the large international research infrastructures\, where stora
 ge\, compute\, and the network become an integral part of the science inst
 rument.  In this talk\, I will present my views on the role of R&E Network
 s as enablers for science and education\, drawing on what I see happening
  in the European Nordics\, at a European level\, and globally.\n\nhttps:/
 /events.chpc.ac.za/event/47/contributions/1139/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1139/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Cyber Security Threats – Is Our Research Data Safe?
DTSTART;VALUE=DATE-TIME:20191204T123000Z
DTEND;VALUE=DATE-TIME:20191204T125000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1138@events.chpc.ac.za
DESCRIPTION:Speakers: Isak van der Walt (University of Pretoria)\nIn a wor
 ld where more and more people are connected every day\, new systems appea
 r and access can be gained from any device\, anywhere\, we need to ask wh
 ether the data that we are working on (viewing\, editing and manipulating
 ) is original and safe. The value of data as an asset is increasing\, as
  is the potential threat posed by its transformative and informative powe
 r. The concept and roll-out of Institutional Repositories (IR) and Instit
 utional Data Repositories (IDR) is fairly mainstream in South Africa\, ho
 sting large\, valuable and often invaluable sets of data. In a connected
  world\, cyber threats are real and have to date caused harm to many orga
 nisations. Is our data really safe and trustworthy in South Africa? South
  Africa has seen a dramatic increase in cyber-attacks in 2019\, with as m
 uch as a 22% increase in malware attacks compared to 2018 alone. Malware
  attacks form only part of an array of cyber threats carried out on a dai
 ly basis. In short\, no system is 100% safe against cyber attacks\; there
  are\, however\, systems and processes in place to safeguard our data.\n\
 nToday’s cybercriminal strategies target every link in the attack chain t
 o gain access to resources and data\, exploiting them relentlessly. Holdi
 ng data for ransom\, modifying data\, defacing data and selling data are
  some of the perils that more and more organisations face when breached.
  The key to safeguarding data is to ensure that policies\, systems and pr
 ocedures are put in place to deal with the various links in the attack ch
 ain. This presentation will focus on what is happening at international a
 nd national level with regard to cyber security threats and how this affe
 cts our research data. The presentation will also highlight the attack ch
 ain and what can be done at the various linkages to make systems and orga
 nisations more secure.\n\nhttps://events.chpc.ac.za/event/47/contribution
 s/1138/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1138/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Evolution of a South African eResearch support service
DTSTART;VALUE=DATE-TIME:20191204T121000Z
DTEND;VALUE=DATE-TIME:20191204T123000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1137@events.chpc.ac.za
DESCRIPTION:Speakers: Dale Peters (UCT)\nINTRODUCTION\n\nGiven the complex
  relationship between eResearch stakeholders within universities and resea
 rch organisations\, a “one size fits all” solution to the development 
 of a national eResearch support service is not practical.  Services contri
 buted by Libraries\, IT and research administrations are at different stag
 es of maturity\, and the South African landscape is characterised by scarc
 e skills in areas of software and systems support. To advance national eRe
 search capability\, a considered approach requires both costly infrastruct
 ure investment and collaborative support services. \n\nBACKGROUND\n\nThe e
 mergence of a new paradigm\, “sometimes called eResearch”\, gave rise 
 to the examination of a national information service framework in 2005. T
 he need for joint action was identified to meet the challenges of eResea
 rch cost-effectively in South Africa. A specialized agency was proposed to
  provide support services\, with a governance model that should work well 
 for all participants.  Two reports commissioned by the Department of Scien
 ce and Technology assisted in conceptualising strategic plans for the furt
 her development of South Africa’s research infrastructure\, including th
 e cyberinfrastructure component. The recommendation to establish a Nation
 al Integrated Cyberinfrastructure System (NICIS) was accepted and plans
  for follow-up activities approved in 2013.  NICIS comprises several core 
 components of the Tier 1 infrastructure: a national Center for High Perfor
 mance Computing (CHPC)\, the South African Research Network (SANReN)\, and
  the more recently established Data Intensive Research Initiative for Sout
 h Africa (DIRISA).  Experience of the Research Data Management project com
 ponent of the DIRISA Tier 2 ilifu infrastructure has provided valuable le
 ssons in the collaborative development of shared services that can now be
  extended to the wider community.\n\nFROM INFRASTRUCTURE DEVELOPMENT TO S
 ERVIC
 E ORIENTATION\n\nAs research becomes more multidisciplinary\, more collabo
 rative and more global\, researchers seek to leverage the South African in
 vestment in specialist scientific equipment and domain-specific infrastruc
 tures\, often generating massive data outputs for analysis in internationa
 l collaboration.  As the national research infrastructure moves from an ex
 perimental testbed to a user-oriented environment\, a challenge faced by m
 ost eResearch infrastructures is the provisioning of sustainable services\
 , and the monitoring of user experience (UX)\, to improve the interaction 
 of researchers with the infrastructure.  This critical component is seldom
  defined explicitly in the infrastructure development\, and the research c
 ommunity has little interest in the expansion of cost-effective services
  beyond its own needs\, especially beyond the duration of a funded proje
 ct. Responsibility at present falls to the host entity to realise the fu
 ll potential of the national cyberinfrastructure and the collaboration e
 nabled with global infrastructures. A limited science system suggests a
  federation of distributed support services\, including multiple univers
 ities and institutional partners\, to meet the ever-increasing demands o
 f both current user support and ongoing data access.\n\nA pilot project
  t
 o support a South African eResearch support service will build on the eRes
 earch Africa conference hosted bi-annually at the University of Cape Town 
 since the initial event in 2013.  An annual training workshop aimed at pro
 fessional development and career enhancement opportunities recognizes the 
 varied job roles associated with eResearch. Institutional eResearch capaci
 ty building will focus on selected teams of information professionals thro
 ugh sponsored participation in designated training programmes and national
  events. \n\nCONCLUSIONS\n\nThe development of a national support service 
 model is intended to improve distributed efficiency\, rather than to centr
 ally consolidate a limited pool of existing human resources.  The effect o
 f overextending the existing capacity poses a serious threat to the reali
 sation of the national cyberinfrastructure\; actual use cases will be dis
 cussed in this presentation.\n\nDue to the complex relationship between e
 Res
 earch stakeholders within institutions\, a “one size fits all” solutio
 n is impractical\, and a phased approach is recommended\, leveraging a bro
 kerage model to access third party services and avoid scenarios where serv
 ices are developed and implemented and then subsequently “orphaned” by
  lack of support and changing financial priorities. \n\nThe potential admi
 nistrative overhead of service development projects\, established by indiv
 idual service level agreements with multiple institutions\, warrants furth
 er consultation on the project governance with university executives\, sen
 ior researchers and infrastructure managers.  \n\nCapability approaches to
  advanced computing technologies must address more than the big shiny stuf
 f. A considered approach requires both costly infrastructure investment an
 d collaborative support services. The user experience of researchers\, and
  their improved interaction with the national cyberinfrastructure should u
 ltimately direct the project and its evaluation.\n\nhttps://events.chpc.ac
 .za/event/47/contributions/1137/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1137/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Developments Around Data in Botswana to support 4IR preparedness\,
  Research\, Innovation in a Developing Country
DTSTART;VALUE=DATE-TIME:20191204T115000Z
DTEND;VALUE=DATE-TIME:20191204T121000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1136@events.chpc.ac.za
DESCRIPTION:Speakers: Tshiamo Motshegwa (University Of Botswana)\nThe 3rd 
 Industrial revolution of the 20th Century ushered in the 1st Information R
 evolution\, which brought the internet\, digitisation\, digitalisation an
 d digital transformation\, and created a basis for knowledge-based econom
 ies. It
  is now widely accepted that the world is experiencing the advent of the 2
 nd Information revolution that is ushering in a 4th  Industrial Revolution
  - a revolution that is characterised by a fusion of technologies to addre
 ss current and future human needs. The 4th Industrial revolution is also c
 haracterised by  large amounts and variety of data – coming from various
  sources at high frequency – and our ability to analyse them\, in real-t
 ime and derive information and knowledge for timely decision making. It is
  anticipated that the 4th Industrial Revolution will revolutionise indust
 rial production processes through advanced automation (often referred to
  as Industry 4.0)\; it is further anticipated that it will revolutionise
  practice and effectiveness in a variety of areas\, including health care
  provision (e.g. personalised medicine)\, the development of Smart Cities
 \, and precision agriculture\, and help address weather and climate chang
 e.\n\nThere is therefore n
 eed for African countries and developing countries to respond to the onset
  of the 4th Industrial Revolution (while arguably still addressing challe
 nges from previous industrial revolutions unravelling on the continent).
  This will help to bridge the digital divide\, ensure that no one is left
  behind\, and achieve Africa’s Agenda 2063 – the Africa We Want. It will
 also help accelerate attainment of Sustainable Development goals through r
 iding on technology advances\, efficiency and transparency. There is a ne
 ed for Africa to transform its infrastructure\, research & innovation eco
 systems\, and skills and education systems for 4th Industrial Revolution
  readiness and the competitiveness of African economies and Africans in t
 his new dispensation. Africa and African countries need tailor-made respo
 nses to the 4th Industrial Revolution and its implications in the African
  context. This can be done by concretising National\, Regional and Contin
 ental Policy Frameworks\, structures and resourced roadmaps\, increasing
  expenditure on Research\, Science\, Technology and Innovation\, and dev
 eloping partnerships. Universities and other centers of knowledge creati
 on and skills development must play a critical role. They must be alive
  to this responsibility\, aim to transform into research-intensive insti
 tutions\, and enhance their internal innovation ecosystems\, including a
 round the exploitation of data through innovation.\n\nT
 his talk will provide an update on developments around data in Botswana\,
  viewed through the prism of policy and strategy development\, research a
 nd innovation\, skills development\, and science communication for public
  and policy engagement – this to help address Botswana’s socio-economic c
 hallenges\, support the attainment of Vision 2036 – Prosperity for All\,
  and address 4IR preparedness. The talk will discuss developments around
  Botswana Open Data Open Science and the Botswana Space Science and Techn
 ology Strategy Development\, highlight example National Open Data project
 s\, and discuss the University-Industry-Government Co-creation Initiative
  that aims to foster innovation\, including around the exploitation of op
 en data.\n\nhttps://events.chpc.ac.za/event/47/contributions/1136/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1136/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Developing infrastructure for federated data analysis for protecte
 d human data
DTSTART;VALUE=DATE-TIME:20191204T113000Z
DTEND;VALUE=DATE-TIME:20191204T115000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1135@events.chpc.ac.za
DESCRIPTION:Speakers: Nicola Mulder (University of Cape Town)\nDatasets of
  ever increasing size and complexity are being generated in the biomedical
  field to answer questions about human and animal health. Data on human he
 alth have to be managed responsibly to ensure protection of participants i
 n health studies. Additionally\, many governments are clamping down on the
  transfer of datasets out of country borders. In order to respect these co
 ncerns while still facilitating ethical and responsible data sharing for a
 nalysis\, new policies and infrastructure need to be developed. There are 
 several initiatives working in this space\, including the Global Alliance 
 for Genomics and Health (GA4GH)\, which is building standards and tools fo
 r sharing of genomic data globally. A new EU funded research project\, CIN
 ECA (Common Infrastructure for National Cohorts in Europe\, Canada\, and A
 frica)\, is developing infrastructure to implement GA4GH standards to enab
 le the analysis of data across cohorts without the requirement for the tra
 nsfer of large datasets to third parties. This includes development of sec
 urity systems for authentication and authorization of researchers\, harmon
 ization of data across heterogeneous studies\, and development of cloud-bas
 ed tools for federated data analysis within the confines of participant co
 nsent. This presentation will describe some of the standards and tools bei
 ng developed and implemented in the CINECA project.\n\nhttps://events.chpc
 .ac.za/event/47/contributions/1135/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1135/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Federating the Ilifu Cloud
DTSTART;VALUE=DATE-TIME:20191204T100000Z
DTEND;VALUE=DATE-TIME:20191204T102000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1134@events.chpc.ac.za
DESCRIPTION:Speakers: Rob Simmonds (University of Cape Town)\nIlifu is an 
 Infrastructure as a Service cloud that utilises OpenStack to provide its c
 ore services. It is run by a consortium of South African universities and 
 provides data intensive computing resources to Astronomy and Bioinformatic
 s users. This talk describes how we are utilising federated identity servi
 ces to enable the use of ilifu by users in a way that can be managed by indiv
 idual project groups without needing to contact ilifu support to have them
  create or remove accounts. It will explain how this has been done using t
 ools run by EGI\, and using tools integrated locally.\n\nhttps://events.ch
 pc.ac.za/event/47/contributions/1134/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1134/
END:VEVENT
BEGIN:VEVENT
SUMMARY:The Capture\, Storage and Consumption of the MeerKAT Radio Telesco
 pe’s Sensor Data
DTSTART;VALUE=DATE-TIME:20191204T094000Z
DTEND;VALUE=DATE-TIME:20191204T100000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1133@events.chpc.ac.za
DESCRIPTION:Speakers: Suleiman Hoosen (South African Radio Astronomy Obser
 vatory)\nThe Control and Monitoring (CAM) Team at the South African Radio 
 Astronomy Observatory (SARAO) is responsible for implementing software sol
 utions for the collection of sensor data from all of the components\, user
 -supplied equipment and ancillary devices that make up the 64-dish MeerKAT
  Radio Telescope. Recently\, the CAM Team developed and deployed a new sol
 ution called KatStore64 which provides services to the MeerKAT Telescope O
 perators\, Astronomers and academia to access the telescope’s sensor dat
 a. \nIn order to capture\, store and consume all of this data\, the CAM Te
 am makes use of the services offered by the Centre for High Performance Co
 mputing (CHPC) for long term storage of the data and employs APIs to extra
 ct and present the stored data to users. \n\nMy poster/talk will illustrat
 e how the CAM Team has built and implemented the KatStore64 service for th
 e storage and retrieval of sensor data for the MeerKAT Radio Telescope.\n\
 nhttps://events.chpc.ac.za/event/47/contributions/1133/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1133/
END:VEVENT
BEGIN:VEVENT
SUMMARY:EOS storage system
DTSTART;VALUE=DATE-TIME:20191204T092000Z
DTEND;VALUE=DATE-TIME:20191204T094000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1132@events.chpc.ac.za
DESCRIPTION:Speakers: Sean Murray (CHPC\, CSIR)\nEOS is the storage system
  of choice for CERN's data storage and is used by various WLCG Tier 1 and
  Tier 2 facilities\, including at the CHPC. EOS is an elastic\, adaptable
 \, and scalable software-based solution for central data recording\, user
  analysis and data processing.\n\nIt has a multitude of supported protoco
 ls and authentication methods. We will present what EOS is\, what it does
 \, how we use EOS in conjunction with our Tier 2 facility\, and a couple
  of other examples of how EOS is used.\n\nhttps://events.chpc.ac.za/event
 /47/contributions/1132/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1132/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Next Generation Sequencing: where big data and high-performance co
 mputing meet
DTSTART;VALUE=DATE-TIME:20191202T092000Z
DTEND;VALUE=DATE-TIME:20191202T094000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1045@events.chpc.ac.za
DESCRIPTION:Speakers: Brigitte Glanzmann (DST-NRF Centre of Excellence for
  Biomedical Tuberculosis Research\, South African Medical Research Council
  Centre for Tuberculosis Research\, Division of Molecular Biology and Huma
 n Genetics\, Faculty of Medicine and Health Sciences\, Stellenbosch Univer
 sity\, Cape Town\, South Africa. )\nThe advent and evolution of next gener
 ation sequencing (NGS) has considerably impacted genomic research\, includ
 ing precision medicine. High-throughput technology currently allows for th
 e generation of billions of short DNA or RNA sequence reads within a matte
 r of hours. This becomes extremely important in the case of genetic disord
 ers where rapid and inexpensive access to a patient’s individual genomic
  sequence is imperative and enables target variant identification. NGS tec
 hnologies result in the generation of large data sets\, which require exten
 sive bioinformatic and computational resources. Computational life science
 s therefore rely on the implementation of well-structured data analysis p
 ipelines as well as high-performance computing (HPC) for large-scale appl
 ications. Here\, we report the sequencing of the first six whole human gen
 omes in South Africa and the processing of the data in collaboration with 
 the Centre for High Performance Computing (CHPC). Efficient parallel and d
 istributed implementations of common time-consuming NGS algorithms on mode
 rn computational infrastructures are imperative. The latter becomes pivota
 l as NGS will continue to move from research labs to clinical applica
 tions in the near future.\n\nhttps://events.chpc.ac.za/event/47/contributi
 ons/1045/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1045/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Parallel and Distributed Search Algorithms
DTSTART;VALUE=DATE-TIME:20191203T090000Z
DTEND;VALUE=DATE-TIME:20191203T093000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1041@events.chpc.ac.za
DESCRIPTION:Speakers: Cornelia Inggs (Stellenbosch University)\nWe investi
 gate the parallelisation and performance analyses of search and planning a
 lgorithms for artificial intelligence\, machine learning\, and software ve
 rification. These applications involve the exploration of large state spac
 es\, which requires at its core a combinatorial search algorithm. Much of 
 our work\, therefore\, focuses on evaluating and improving the scalability
  of algorithms used in all these tasks.\n\nIn recent work we have implemen
 ted various parallel and distributed MCTS algorithms with different enhanc
 ement strategies for artificial intelligence\, tested them for scalability
 \, and compared the performance of these approaches on the same domain and
  the same hardware infrastructure.  We make use of the CHPC's large queue 
 to determine scalability up to 128 12-core compute nodes with 32GB RAM eac
 h---values that are in line with previous publications and distributed sea
 rch implementations. We wrote our application code in Java\, using an acto
 r model framework (Akka) to simplify concurrency and distributed computing
 . We make limited use of MPI---more specifically\, just mpirun---in order 
 to easily launch our application on the available nodes using the PBS node
 file.\n\nThis talk will provide an overview of our research and the proble
 ms we investigate\, as well as a discussion of recent results.\n\nhttps://
 events.chpc.ac.za/event/47/contributions/1041/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1041/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Advancing the understanding of interactions between plasma arcs an
 d power electronics with high performance computing
DTSTART;VALUE=DATE-TIME:20191203T094000Z
DTEND;VALUE=DATE-TIME:20191203T100000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1054@events.chpc.ac.za
DESCRIPTION:Speakers: Quinn Reynolds (Mintek)\nDirect-current (DC) arc fur
 naces account for a significant proportion of installed pyrometallurgical 
 capacity worldwide. Their applications include steel recycling as well as 
 smelting of various materials such as ferrochromium\, ferronickel\, ilmeni
 te\, and others. In order to provide power to such furnaces\, alternating 
 current from the grid or other generation sources must be converted into D
 C by rectification. At industrial scales the rectifier unit is often the s
 ingle largest capital cost item\, and any errors in its specification can 
 result in the entire plant operating inefficiently (or not at all).\n\nIn 
 this presentation\, computational plasma arc models developed in OpenFOAM
 ® are coupled with circuit simulations of solid-state furnace rectifiers 
 in order to gain insight into the complex interactions between the rectifi
 er’s design parameters and the behaviour of the arc. Such approaches pro
 vide a first step toward true virtual prototyping and digital twin modelli
 ng for the electrical design and optimisation of DC arc furnaces. \n\nHigh
  performance computing is a critical enabling tool in such studies\, and v
 arious aspects of this – including solver performance scaling analysis\,
  software automation\, and use of methodologies from other HPC fields – 
 will be touched on during the presentation.\n\nhttps://events.chpc.ac.za/e
 vent/47/contributions/1054/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1054/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Computational Modelling Study on Stability of Li-S/Se System
DTSTART;VALUE=DATE-TIME:20191204T121000Z
DTEND;VALUE=DATE-TIME:20191204T123000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1067@events.chpc.ac.za
DESCRIPTION:Speakers: Masedi Cliffton (UL)\nLithium Sulphur batteries suff
 er from the low conductivity of S and the solubility of intermediary poly
 sulfide species during cycling. It has been reported that Se and mixed Sex
 Sy represent an attractive new class of cathode materials with promising e
 lectrochemical performance in reactions with Li ions. Notably\, unlik
 e existing Li/S batteries that only operate at high temperature\, these ne
 w Se and Li/SexSy electrodes are capable of room temperature cycling. To s
 tudy large systems and impact of temperature effectively\, empirical inter
 atomic potentials of Li2S were derived and validated against available exp
 erimental structure and elastic properties. Complex high temperature trans
 formations and melting of Li2S were reproduced\, as deduced from molecula
 r dynamics simulations. Li2S was found to withstand high temperatures\, u
 p to 1250 K\, which is desirable in future advanced battery technologies.
  Cluster expansion and Monte Carlo simulations were employed to determine
  p
 hase changes and high temperature properties of mixed Li2S-Se. The former 
 generated 42 new stable multi-component Li2S-Se structures. Monte Carlo si
 mulations produced thermodynamic properties of Li2S-Se system for the enti
 re range of Se concentrations obtained from cluster expansion and it demon
 strated that Li2S-Se is a phase-separating system at 0 K but changes to a
  mixed system at approximately 350 K.\n\nhttps://events.chpc.ac.za/event/
 47/con
 tributions/1067/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1067/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Computational Catalysis @ NWU
DTSTART;VALUE=DATE-TIME:20191204T113000Z
DTEND;VALUE=DATE-TIME:20191204T115000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1064@events.chpc.ac.za
DESCRIPTION:Speakers: Cornie van Sittert (North-West University)\nAt the N
 orth-West University (NWU) interest in incorporating computational chemist
 ry in training and research started in the late 1990s. Although there was
  not much support for this interest in computational chemistry\, the need
  f
 or understanding the chemistry in especially catalysis was identified.\n\n
 Starting with old discarded computers and the cheapest possible software\,
  the first attempts were made in calculating structures of transition meta
 l complexes used in catalysis research at NWU.  Due to the limitation of t
 he resources\, only gas phase reactions in homogeneous catalysis could be 
 investigated.  As the value of the computational investigations became evi
 dent\, more resources were acquired.\n\nIt was however only in 2002 that s
 upport from the Research Focus Area (previously called Separation Technolo
 gy) at NWU was obtained.  Formal training of one staff member and the esta
 blishment of a dedicated Laboratory for Applied Molecular Modelling was fu
 nded.  After careful evaluation of the needs in the research and the abili
 ties of the researchers\, it was decided to invest in Accelrys Materials S
 tudio (for research) and Spartan (for training) software.  At the same tim
 e 10 workstations and a 12 CPU cluster were acquired.\n\nAlthough this was
  a major step forward\, catalysis research was still limited to gas phase 
 reactions in homogeneous catalysis investigations\, with transition state 
 calculations being a challenge.  At this stage the CHPC was established.  
 After a short phase of development and streamlining operations and softwar
 e at CHPC\, access to these resources was obtained by NWU researchers.\n\
 nWith the access to CHPC resources\, limitations to the type of investigat
 ions gradually disappeared.  The homogeneous catalysis investigations coul
 d be expanded to real system investigation\, including solvents.  Models c
 ould be expanded from explanations of observations to prediction for activ
 ity.  Heterogeneous catalysis could also be included in research.\n\nNow\,
  computational catalysis research at NWU is ready to investigate real pro
 blems and try to find solutions.  One such real problem being investigated
  at NWU is the development of new/alternative catalysts to apply in the ge
 neration of alternative and renewable energy.\n\nhttps://events.chpc.ac.za
 /event/47/contributions/1064/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1064/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Field-induced exotic electronic phases in spin-filter tunnel junct
 ions
DTSTART;VALUE=DATE-TIME:20191202T094000Z
DTEND;VALUE=DATE-TIME:20191202T100000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1046@events.chpc.ac.za
DESCRIPTION:Speakers: Aniekan Ukpong (UKZN)\nABSTRACT:\nThe future of spin
 tronics based on 2D-materials is dependent on the effectiveness of the inj
 ection of pure spin current into a tunnel barrier region. Here\, first pri
 nciples calculations are used to show that the efficiency of the spin-filt
 ering across the semiconducting barriers of monolayer hBN is mainly limite
 d by the dynamical response of tunneling electrons to the applied axial fi
 eld. By projecting the effective electric field gradient densities and mag
 netic shielding constants across constitutive atomic layers in the scatter
  region of spin-filter tunnel junctions\, an unusual site-dependent spin r
 esponse is unraveled at the Fe/hBN and hBN/metal heterobilayer interfaces.
  Since the ground-state energy has no lower bound in extended electric fie
 lds\, our analyses of the dependence of the Fermi surface topology on appl
 ied electric fields show the emergence of a frustrated electronic order. T
 his exotic electronic phase is characterized by electric-field induced spi
 n-flip relative to the ferromagnetic ground state\, and observable as fiel
 d-tunable perpendicular magnetic anisotropy.\n\nHPC Content: \nAll the cal
 culations were performed in parallel using version 6.4.1` of the Quantum E
 SPRESSO suite. Due to poor code scalability\, all the computations were ca
 rried out on the 'SMP queue' using 1 Node of 24 CPU cores. No net gain in co
 mputing speed was observed when more nodes were used on larger system size
 s. In fact\, computation time increased to more than 6 CPU hours per scf c
 ycle when the same jobs were run on the 'Normal queue' at 10 Nodes at doub
 le the system-size. The main computational challenge li
 es in solving the associated Poisson’s equation for atoms in the presenc
 e of the compensating potential due to externally-applied fields under gau
 ge-corrections\, in a more efficient manner. For fully-converged field-dep
 endent computations\, an average duration per task was timed at 2d 0h19m (
 CPU time) and 2d 0h40m (WALL time). This is still too 'slow' for scientifi
 c computing jobs executed on a supercomputer.\n\nhttps://events.chpc.ac.za
 /event/47/contributions/1046/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1046/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Convective scale modelling on the CHPC: ICON vs COSMO models
DTSTART;VALUE=DATE-TIME:20191203T100000Z
DTEND;VALUE=DATE-TIME:20191203T102000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1055@events.chpc.ac.za
DESCRIPTION:Speakers: Patience Mulovhedzi (South African Weather Service)\
 nThe South African Weather Service has recently embarked on a business cas
 e to address its computational needs. Part of this was to identify the mos
 t suitable convective scale Numerical Weather Prediction (NWP) model for t
 he Southern African region. The Unified model (UM)\, the main model run by
  SAWS for operational purposes\, the Weather Research and Forecasting (WRF
 ) Model and the Consortium for Small-scale Modeling (COSMO) model were use
 d for the study. A number of weather parameters were selected for the stud
 y\, and results generally showed that the three models are comparable. How
 ever\, with much model development taking place around the world\, the COS
 MO will soon be replaced by the Icosahedral Non-hydrostatic (ICON) model. 
 It\, therefore\, makes sense to conduct the same study for the ICON as for
  the COSMO in order to investigate whether the new model is an improvement
  over the former. Simulations for both the COSMO and ICON are run on the
  CHPC.\n\nhttps://events.chpc.ac.za/event/47/contributions/1055/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1055/
END:VEVENT
BEGIN:VEVENT
SUMMARY:In-silico investigations of metal coordinating enzymes: From Biofu
 el production to Antimicrobial drug resistance
DTSTART;VALUE=DATE-TIME:20191204T094000Z
DTEND;VALUE=DATE-TIME:20191204T100000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1062@events.chpc.ac.za
DESCRIPTION:Speakers: Vuyani Moses (Vuyani)\nEnzymes which are directly bo
 und to metal cofactors are referred to as Metalloenzymes. These enzymes pl
 ay various biologically important roles from catalyzing electron transfer 
 reactions to being important structural components of protein structures. 
 Due to the abundance of metal containing enzymes and the role they play in
  important biological processes\, it is important to study these enzymes. 
 In silico approaches are readily used to study protein structure and Mole
 cular Mechanics (MM) is an essential tool used for understanding protein d
 ynamics. MM is used to describe protein behavior by applying a MM force fi
 eld to describe bonded and nonbonded terms of a protein structure. The acc
 uracy of the force field in describing a particular protein structure is h
 ighly dependent on the force field parameters. Unfortunately\, for metallo
 enzymes there are no currently available force fields which can accurately
  describe the coordination environment of metals in metalloenzymes. As a r
 esult\, performing an accurate Molecular Dynamics (MD) simulation for meta
 lloenzymes i
 s extremely challenging using available force fields. To overcome this lim
 itation Quantum Mechanics (QM) may be applied to elucidate the parameters 
 required for accurate description of metalloenzymes during MD simulations.
  This approach involves the use of potential energy surface (PES) scans to
  evaluate the angles\, bonds and dihedral parameters that are important to
  describe the metal binding site. The energy profiles generated from PES
  scans are then fitted by least squares to a t
 heoretical force field to generate the force field parameters. This approa
 ch three cases of metal coordinating enzymes.  The first are the Auxilliar
 y Activity family 9 (AA9) enzymes which are Cu(II) containing enzymes that
  have been shown to increase the rate of cellulose degradation. Secondly\,
  new parameters were also used in the identification of novel inhibitory c
 ompounds against the Mn(II) coordinating HIV-1 reverse transcriptase enzym
 e. Finally\, this approach was applied to the Zn(II) bimetallic active sit
 e of beta-lactamase enzymes\, which are contributors to the developm
 ent of bacterial antibiotic resistance. For all three cases force field pa
 rameters were successfully generated and validated using MD simulations.\n\
 nhttps://events.chpc.ac.za/event/47/contributions/1062/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1062/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Use of high-throughput sequencing to uncover resistance mechanisms
  in sugarcane
DTSTART;VALUE=DATE-TIME:20191204T100000Z
DTEND;VALUE=DATE-TIME:20191204T102000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1063@events.chpc.ac.za
DESCRIPTION:Speakers: Robyn Jacob (South African Sugarcane Research Instit
 ute)\nThe sugarcane industry is an important agricultural activity in Sout
 h Africa generating an annual estimated average direct income of R14 billi
 on. Economic loss due to Eldana saccharina (eldana)\, a lepidopteran stem-
 borer\, is estimated to be R1 billion per annum. Commercial sugarcane cult
 ivars (Saccharum spp. hybrids)\, have different susceptibility ratings to 
 eldana\, varying from low to high risk of sustaining economically damaging
  infestations. The South African Sugarcane Research Institute has utilised
  the resources of the Centre for High Performance Computing in an approach
  involving high-throughput RNA sequencing (RNA-seq) to identify early and 
 late response genes that are differentially expressed in two sugarcane cul
 tivars possessing contrasting resistance phenotypes when challenged with e
 ldana herbivory. The results will be used to identify molecular mechanisms
  involved in the successful defence response and identify candidate genes 
 which are most likely to be useful in breeding for resistance to eldana. \
 n\nHPC content: \nAnnotation and assembly is a computationally intensive p
 rocess that requires considerable CPU time and effort. Various bioinformat
 ic tools were used for the de novo transcriptome assembly and the differen
 tial expression analyses required in this project.\n\nhttps://events.chpc.
 ac.za/event/47/contributions/1063/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1063/
END:VEVENT
BEGIN:VEVENT
SUMMARY:KEYNOTE 5 (DIRISA):  DALI: A Data life cycle instrument for manage
 ment and sharing of data: Towards the reproducibility and data reuse of sc
 ientific research
DTSTART;VALUE=DATE-TIME:20191204T070000Z
DTEND;VALUE=DATE-TIME:20191204T074500Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1068@events.chpc.ac.za
DESCRIPTION:Speakers: Hakizumwami Birali Runesha (University of Chicago)\n
 Researchers today are generating volumes of data from simulations\, instrum
 ents and observations at accelerating rates\, resulting in extreme challen
 ges in data management and computation. In addition to publications\, scie
 ntists now produce a vast array of research products such as data\, code\,
  algorithms and a diversity of software tools. However\, scholarly publica
 tions today are still mostly disconnected from the underlying data and cod
 e used to produce the published results and findings\, which need to be sh
 ared. This presentation will discuss a funded project to acquire and opera
 te an extensible Data Lifecycle instrument (DaLI) for management and shari
 ng of data from instruments and observations that will enable researchers 
 to (i) acquire\, transfer\, process\, and store data from experiments and 
 observations in a unified workflow\, (ii) manage data collections over the
 ir entire life cycle\, and (iii) share and publish data. This presentation
  will also discuss our approach in generating and sharing data and artifac
 ts associated with research publications\, and therefore\, providing acces
 s to a platform that makes data findable\, accessible\, interoperable\, re
 usable and reproducible.\n\nhttps://events.chpc.ac.za/event/47/contributio
 ns/1068/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1068/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Speech data harvesting and factorized deep neural network developm
 ent for the low-resource indigenous languages of South Africa
DTSTART;VALUE=DATE-TIME:20191204T115000Z
DTEND;VALUE=DATE-TIME:20191204T121000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1065@events.chpc.ac.za
DESCRIPTION:Speakers: Jaco Badenhorst (CSIR Next Generation Enterprises an
 d Institutions Cluster)\nSpeech-enabled human-machine interfaces are becom
 ing increasingly prevalent. There is\, however\, a disparity between the l
 anguages these systems are available in and the languages spoken in South 
 Africa. This is because a tremendous effort is required to collect and ref
 ine large speech corpora. For the majority of South Africa’s languages\,
  very little speech data is available. This creates a challenge due to the
  “data hungriness” of automatic speech analysis techniques\, specifica
 lly those which yield the best performance such as deep learning. Once suf
 ficient initial resources exist\, automatic data harvesting strategies can
  be used to further increase the amount of available data in under-resourc
 ed languages. The aim of the work we report on is to improve existing ASR 
 systems for the 11 official languages of the country. These systems are cu
 rrently based on the recordings of the NCHLT (National Centre for Human La
 nguage Technology) corpus\, which consists of an estimated 55 hours of spe
 ech obtained from 200 speakers for each language. During the NCHLT project
  additional speech data was collected but not released. To determine wheth
 er this additional data is useful\, acoustic evaluation of the audio is re
 quired to detect both low signal-to-noise ratios (SNRs) and transcription 
 errors. The decoding process with state-of-the-art Deep Neural Network-bas
 ed models is more than an order of magnitude slower than real time on CHPC
 ’s Lengau Dell cluster’s processors. It requires more than 10 CPU hour
 s to process one hour of audio data. We therefore used CHPC to run the job
 s required for processing one language in parallel. This allowed for appro
 ximately 200 hours of auxiliary data to be processed and used less than 50
  GB of memory. The harvested data from this process was subsequently used 
 to train factorized time-delay neural networks (TDNN-F). This model archit
 ecture was recently shown to yield good results in resource constrained sc
 enarios (less than 100 hours of speech). These models significantly reduce
 d phone error rates for the 11 languages. Each TDNN-F model trained in abo
 ut 8-12 hours on CHPC's Lengau GPU cluster.\n\nhttps://events.chpc.ac.za/e
 vent/47/contributions/1065/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1065/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Finite volume modelling of dense granular flow in rotary kilns
DTSTART;VALUE=DATE-TIME:20191204T123000Z
DTEND;VALUE=DATE-TIME:20191204T125000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1066@events.chpc.ac.za
DESCRIPTION:Speakers: Alfred Bogaers (Ex Mente Technologies)\nGranular mat
 erials are used in several industrial applications. One example of such an
  industrial application is rotary kilns\, often used for drying\, pre-heat
 ing and the reduction of a moving\, high-temperature granular bed. The gra
 nular flow in these reactors has an important influence on capacity\, pro
 duct quality\, and economic feasibility. Rotary kilns\, in the pyrometallu
 rgical industry\, often have diameters up to 6m\, with lengths in excess o
 f 80m\, and operating at temperatures of 1000 to 1400°C. Because of the s
 ize of these kilns\, modelling the granular flow using the discrete elemen
 t method (DEM) would result in excessively high computational costs. In th
 is work\, we therefore made use of a continuum approach to describe the gr
 anular flow. \n\nWe adopted the μ(I) dense granular flow model proposed b
 y da Cruz et al. (2005) and later extended by Jop et al. (2006). This mode
 l is a rate-dependent\, phenomenological description of dense granular mat
 erials and can be characterised as an elasto-viscoplastic material descrip
 tion with a frictional yield criterion. The flow model approximates an effec
 tive friction coefficient through a relationship between plastic flow stra
 in rates and a confinement time scale to account for the internal\, inter-
 particulate motion. \n\nWe implemented the material model into OpenFOAM\, 
 an open source\, finite volume (FV) based\, partial differential equation 
 toolkit. The volume of fluid (VoF) method was used to capture the discrete
  granular-fluid interface\, enabling the simulation of large granular bed 
 deformations. The numerical scheme was stabilised by using pressure and vi
 scosity regularisation\, along with a semi-implicit coupling between the i
 nternal pressure and velocity fields.\n\nIn our project we were faced with
  serious technical and computational challenges involving combustion\, hea
 t transfer\, fluid flow\, high-temperature chemistry\, and the movement of
  a large granular bed. Our FV approach enabled us to make valuable computa
 tional modelling and simulation contributions to the development of a new 
 high-temperature process technology.\n\nhttps://events.chpc.ac.za/event/47
 /contributions/1066/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1066/
END:VEVENT
BEGIN:VEVENT
SUMMARY:KEYNOTE 4:  Scientific Applications on Heterogeneous Architectures
  – Data Analytics and the Intersection of HPC and Edge Computing
DTSTART;VALUE=DATE-TIME:20191203T074500Z
DTEND;VALUE=DATE-TIME:20191203T083000Z
DTSTAMP;VALUE=DATE-TIME:20260413T043753Z
UID:indico-contribution-47-1076@events.chpc.ac.za
DESCRIPTION:Speakers: Michela Taufer (University of Tennessee Knoxville)\n
 This talk discusses two emerging trends in computing (i.e.\, the convergen
 ce of data generation and analytics\, and the emergence of edge computing)
  and how these trends can impact heterogeneous applications. Next-generati
 on supercomputers\, with their extremely heterogeneous resources and dram
 atically higher performance than current systems\, will generate more data
  than we need or\, even\, can handle. At the same time\, more and more dat
 a is generated at the “edge\,” requiring computing and storage to move
  closer and closer to data sources. The coordination of data generation an
 d analysis across the spectrum of heterogeneous systems including supercom
 puters\, cloud computing\, and edge computing adds additional layers of he
 terogeneity to applications’ workflows. More importantly\, the coordinat
 ion can neither rely on manual\, centralized approaches as it is predomin
 ately done today in HPC nor exclusively be delegated to be just a problem 
 for commercial Clouds. This talk presents case studies of heterogeneous app
 lications in precision medicine and precision farming that expand scientis
 t workflows beyond the supercomputing center and reduce our exclusive reli
 ance on large-scale simulations for the sake of scientific discovery.\n\
 nhttps://events.chpc.ac.za/event/47/contributions/1076/
LOCATION:Birchwood
URL:https://events.chpc.ac.za/event/47/contributions/1076/
END:VEVENT
END:VCALENDAR
