CP: My research field is large scale genomics analysis of human disease.  I work with HPC on a commodity basis, with very large datasets that are very well suited to parallelisation but require quite a long time to analyse.  I also Chair the Project Advisory Group for NZGL, so I have to understand the IT and technology needs of NZGL’s client researchers.

CP: Actually, either NZ solutions (NESI, NZGL) or overseas commodity services such as Amazon Web Services (AWS) are well suited to my scale and type of genomics HPC work.  The exception to this is where I’m working with confidential or private data – usually because it’s human genomic data or due to restrictions from patient consents.  Much of the recent HPC work I have been involved in has been done in Japan by my Tokyo collaborators, due to the cost-effective, high-performance resources available there.  Bandwidth is a factor in sourcing HPC as a service, though I’m aware there are initiatives underway that might alter the bandwidth limitations between NZ and the rest of the world.

CP: Once genomics reaches a certain scale, the amount of data generated grows enormously.  I have recently analysed a cohort of 50 patients which generated approximately 1 terabyte of data per patient.  Ideally, all this data would be accumulated in one place while we scheduled the bandwidth to transfer it in bulk to an analysis facility or service.  Even better, it could remain in NZ.
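The scale described above can be made concrete with a back-of-envelope calculation.  The 50-patient, ~1 TB-per-patient figures are from the interview; the link speeds and sustained-utilisation factor below are illustrative assumptions, not measured values:

```python
# Back-of-envelope: time to bulk-transfer a genomics cohort over a network link.
# Cohort size (50 patients x ~1 TB) is from the text; link speeds and the
# 80% sustained-utilisation figure are illustrative assumptions only.

def transfer_days(total_tb: float, link_gbps: float, efficiency: float = 0.8) -> float:
    """Days to move total_tb terabytes over a link_gbps link at the
    given sustained utilisation (decimal TB, i.e. 1 TB = 1e12 bytes)."""
    total_bits = total_tb * 1e12 * 8                      # terabytes -> bits
    seconds = total_bits / (link_gbps * 1e9 * efficiency) # effective throughput
    return seconds / 86_400                               # seconds -> days

cohort_tb = 50 * 1.0  # 50 patients at ~1 TB each

for gbps in (1, 10):
    print(f"{gbps:>2} Gb/s link: {transfer_days(cohort_tb, gbps):.1f} days")
# -> roughly 5.8 days at 1 Gb/s, 0.6 days at 10 Gb/s
```

Even on an uncontended 1 Gb/s link, moving a single cohort of this size would occupy the better part of a week, which illustrates why scheduling bulk transfers (or keeping the data and compute co-located) matters at this scale.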

CP: With genomics we are fortunate in that our work is well suited to the commodity-based HPC architectures commonly provided as on-demand services.  I understand the IT system built within NZGL is currently seen as an interim system, providing smaller-scale support while the field grows in NZ and we develop better international connectivity, after which organisations can decide what to do.

CP: In my view, in the future clinical use of genomics will grow to become commonplace.  At clinical scale the quantity of data, and the processing required to make sense of it, will quickly eclipse the current infrastructure capabilities in NZ.  As genomics moves into the mainstream as a clinical tool over the next 5 to 10 years, our health and health research sectors are going to require provision of much better information technologies, security and IT infrastructure.  A key and ongoing challenge in this area will be confidentiality – the need to keep patients’ details private.

CP: What we are also starting to see are signs of change in how people think about privacy.  Attitudes are changing towards confidentiality, and people are becoming more accepting of their data being used – especially for research.  In one scenario I have heard described (but am unsure of the ethical considerations myself), people may actually wish to put their data forward and earn a small personal revenue stream by offering their information to research organisations.  On average, the younger generation seem more relaxed about sharing their identities and information.  New Zealand’s population is small, educated, and genetically diverse, and therefore a potentially valuable resource for research.

CP: Looking to the future of infrastructure, we will always face rent-versus-buy decisions, but there are many open questions.  How much computing capacity do we really need in NZ – and how much of it needs to be owned, with the overheads of insurance, rapid obsolescence and depreciation?  Future-proofing our existing infrastructure is a big challenge.  There are also larger system questions about the security of confidential data and the degree to which we want to rely on digital records.  If we do away with paper records in the health system, for example, how do we recover after a catastrophic event such as a solar flare?  As our needs grow, how can our resources be unbound for better payback?  We still have a lot of minor HPC capacity around the country – is there an approach to linking these that can pave the way for a new model?  How should we deal with the rate of change in storage capacity?