MG: I am the Director of the Centre for eResearch (CeR) at the University of Auckland (www.eresearch.auckland.ac.nz).  CeR was established 4 years ago and has since grown from 3 initial staff to 25 people in 2013, some funded by NeSI and others supported by various deans and departments across the University.  We’ve also grown from 20 customers in that first year to almost 400 researchers involving CeR in their research each year, as we’ve built our HPC capacity to over 5000 cores.

MG: I believe that the investment to establish the CeR facility as a service for researchers has made a real difference to research outcomes at the University of Auckland, both in terms of the scale of the research questions we are now tackling and the growing international repute of the University and its research.  The keys to making an impact have been having a local presence and offering researcher-friendly, personalised service.  Researchers come in the door looking for help.  CeR works with them to translate their research into terms that can successfully deliver results through HPC.  In this sense, HPC as a service falls into the category of complex, interpersonal, professional services.

MG: Our work can be segmented by the kind of help a researcher needs.  For example, 15% – 20% of our customers need a great deal of help.  They want HPC as a service delivered through a web browser – they don’t want to interact with code or with cores.  At the other extreme, 20% of the researchers who work with us need very little help; perhaps they are experienced with HPC and computer science, and just want to be pointed to the processors and left to get on with their work.  The 60% in the middle are the group that will ultimately grow eResearch capability in New Zealand: researchers who want to be empowered, not just served, and who are keen to learn more about the potential HPC brings to their research.

MG: Unlike many researchers using HPC in New Zealand, my background and research focus really are in eScience.  New Zealand’s starting position with eResearch is very different from that of other countries, which are focused on understanding and supporting reproducible science, linked science, data semantics and coupled workflows – the conversations and aspirations of eResearch as a worldwide community.  New Zealand differs because we are still focused on tangible assets such as network, compute and data.  While I believe we do need to focus on getting the basics right with HPC, ultimately I would like to see NZ’s conversation move towards a more linked-up reimagining of our science for the 21st century, where computing enables rather than constrains research.

MG: If you consider our workflow here in 2013, our thoughts, conversations, notes and outputs are fragmented into emails, code, data, files and documents – there is no “look through” process available to us to gain a cohesive picture of all of our work.  Technology, and computing in particular, has fragmented our creative and analytic work processes.  What if it wasn’t like that?  To me, this is what eResearch is about.  Our academic publishing paradigm hasn’t changed since the founding of the Royal Society in the 1600s.

So, my vision for eResearch in 2020 is a new paradigm for the journal article – the instrument of academic output.  We need to be able to link science to a model or pathway that also captures the semantics and connections used in the scientific process.  We do have some local researchers who are thought leaders in this field, yet even so NZ is well behind the rest of the world in understanding this challenge.

This vision needs to be enabled by the publishers and editors of our academic journals.  We need to make this leap if we are to continue to rely on and require the repeatability of science from our researchers.  So much scientific endeavour is now linked to compute that, without access to reproducible workflows that are self-explanatory and embedded in the underlying process of science, we will eventually be unable to repeat, prove, disprove or rely on much of the academic output we fund.

MG: Our HPC resources and investments in NZ have been made in batch-style HPC.  Interactive HPC was ruled out for our start-up phase because we lacked the people to support that capability; however, a move to interactive HPC seems like a natural next step.  For many customers, infrastructure needs to be invisible – achieving this takes the integration of multiple solutions.

MG: One challenge is to describe data in a way that allows it to be found and reused in ways not previously considered.  If we can achieve this, then data does not need to be re-gathered and much duplication can be avoided.  This need, to discover relevant data that already exists, has huge commercial potential.  Some organisations, such as Energistics in the US (www.energistics.org), focus on producing metadata for commercially sensitive information held by companies.  The aim is to allow others to search for useful data hidden behind proprietary screens – and, if it looks useful, to purchase access.
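To make the idea concrete, below is a minimal Python sketch of the kind of discovery record this implies.  The field names, helper function and example dataset are illustrative assumptions, not Energistics’ actual schema; the point is that the metadata is rich enough to be searched, while the data itself stays behind the owner’s proprietary screen.

    # A minimal sketch of a discovery-level metadata record (field names are
    # illustrative assumptions, not a real Energistics schema).  The record
    # describes a commercially sensitive dataset richly enough to be searched,
    # while the data itself remains behind the owner's proprietary screen.
    survey_record = {
        "title": "3D seismic survey, Taranaki Basin, 2012",  # hypothetical dataset
        "keywords": ["seismic", "3D survey", "Taranaki Basin"],
        "spatial_extent": {"lat": (-39.8, -38.9), "lon": (173.5, 174.9)},
        "format": "SEG-Y",
        "size_gb": 850,
        "access": "commercial",               # discovered via metadata, purchased from owner
        "contact": "data-sales@example.com",  # placeholder address
    }

    def matches(record, query_terms):
        """True if any query term appears (case-insensitively) in a keyword."""
        keywords = [k.lower() for k in record["keywords"]]
        return any(term.lower() in k for term in query_terms for k in keywords)

    # A prospective reuser searches the public metadata, then negotiates access.
    if matches(survey_record, ["taranaki", "seismic"]):
        print("Relevant dataset found:", survey_record["title"])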

MG: Small countries still have big problems; we can’t rely on small science to solve them.  For example, being at the leading edge in dairy genomics is an economic necessity for NZ.  A small percentage difference in growing grass, dairy nutrition or milk production has a major impact on our comparative advantage and economic fortunes in the global economy.  For NZ, genomics is an essential, compute-driven eResearch discipline.  If we take a back seat in the science, then we are really taking a back seat in economic development and global competitiveness.

MG: We do have cloud services commercially available to us, but these are not optimised for research services.  Most science users need three things:

– Highly performant compute nodes – high-speed interconnect, high-speed data, highly reliable and available;

– Connectivity to those nodes (I suspect that if all our CeR users were to try to rely on offshore compute for their research, we’d quickly run out of international bandwidth); and

– In-depth help from people – often all the way back to the hardware layer – which can’t really be done on an offshore-hosted service.

MG: Unless you know how to optimise the hardware to support your research needs, you’re not really in the game.  To support NZ research we need to build this level of optimisation and specification skill into the infrastructure of the country.  By maintaining this level of infrastructure and knowledge in NZ, we are then afforded the opportunity to use international facilities.  We can be competitive in gaining access to big HPC resources because our smaller local facilities and local knowledge allow our researchers to become practised enough to win an allocation on a major international system such as Oak Ridge or Argonne.  Some of our key New Zealand research in bio-engineering and nano-materials needs to run at the kind of scale only achievable on these really big systems.