Treatment of intracranial aneurysms with flow diverter stents (FDS) can lead to calibre changes of the jailed vessels in a subacute phase. Why some branches remain unchanged while others narrow or occlude is unknown. This study investigates the influence of resistance to flow on FDS-induced haemodynamic modifications in typical aneurysm locations in bifurcating arteries. Full Abstract
10:35
Jazmin Aguado-Sierra
The anatomically detailed human ventricles versus the simplified human anatomies: why shape and sex matter.
Computational modelling is becoming increasingly important for the understanding, diagnosis and treatment of patients worldwide. Modellers employ every last bit of information available to them to personalise and parameterise such models and to increase their accuracy for clinical applications. Data acquisition in the clinical setting is often hindered by monetary, ethical or time constraints, or simply by the fact that some variables are impossible to measure in vivo. In this talk, we analyse the effect of trabeculae and papillary muscles on electrophysiology simulations of the male and female hearts. The aim is to characterise the effects of a lack of anatomical resolution on the study of ventricular tachycardia. Furthermore, we analyse the role of gender phenotype in such simulations. Results show that anatomical detail and gender phenotype do matter and yield different outcomes in computer simulation studies.
10:50
Raymond Padmos
Connecting Arterial Blood Flow to Tissue Perfusion for In Silico Trials of Acute Ischaemic Stroke
Predicting infarct volume is necessary to develop in silico trials for the treatment of acute ischaemic stroke. This requires modelling blood flow across length scales spanning three orders of magnitude: from the large arteries, to the pial surface vessels and penetrating arterioles, to the microcirculation. Blood flow in large vessels is typically modelled using lumped-parameter or 1-D blood flow models, whereas the microcirculation is typically modelled as a porous medium [1]. However, while the patient-specific geometry of the large vessels is known, the features of the microcirculation are only captured statistically. There is therefore an information gap between the large vessels and the microcirculation. Here, we present a method to couple blood flow in large blood vessels to cerebral tissue perfusion. A tissue perfusion model is also being developed but is outside the scope of this abstract. For more details and derivations of the models, see [1, 2]. Full Abstract
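The coupling idea in this abstract is often illustrated, in its simplest form, by a lumped-parameter network in which each large-vessel outlet feeds its perfusion territory through an effective resistance (Q = ΔP / R). The sketch below is a generic toy example of that idea only, not the model presented in this talk; the territory names, pressures and resistance values are all invented for illustration.

```python
# Toy lumped-parameter coupling: each large-vessel outlet drives a
# perfusion territory through an effective resistance, Q = (P_in - P_tissue) / R.
# All names and values below are invented for illustration.

def outlet_flows(p_in, p_tissue, resistances):
    """Flow out of each large-vessel outlet into its downstream territory."""
    return {name: (p_in - p_tissue) / r for name, r in resistances.items()}

# Hypothetical effective resistances (mmHg*s/mL) for three territories.
resistances = {"MCA": 1.2, "ACA": 2.0, "PCA": 1.8}

p_in = 90.0      # pressure at the outlets of the large-vessel model (mmHg)
p_tissue = 10.0  # effective downstream tissue pressure (mmHg)

flows = outlet_flows(p_in, p_tissue, resistances)
total = sum(flows.values())

# An occluded outlet (e.g. a clot in the MCA territory) simply loses its
# contribution; in a full model the perfusion solver would see this as a
# boundary-condition change.
flows_stroke = outlet_flows(p_in, p_tissue,
                            {k: v for k, v in resistances.items() if k != "MCA"})

print(flows, total)
```

In a real coupled model the tissue pressure would come from the porous-medium perfusion solver rather than being a constant, and the resistances would be calibrated per patient; the sketch only shows where the two models exchange information.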
11:05
Yun Bing
A novel multi-scale, multi-compartment model of oxygen transport – Towards in-silico clinical trials in the entire human brain
The In Silico Clinical Trials for the Treatment of Acute Ischemic Stroke (INSIST) consortium is a multi-disciplinary, multi-sectoral undertaking aiming to advance the understanding and treatment of ischemic stroke through computational simulations and clinical trials. The work presented here is part of this project and aims to model oxygen transport and metabolism in the entire human brain. This model will form the backbone of the in-silico trials: coupled with the multi-scale model of blood flow in the human brain presented elsewhere in this conference, it will predict regions of hypoxia post-stroke, and hence tissue death. This can then be validated against a large available database of stroke patients. Full Abstract
11:20
Andrew Narracott
Delivering the CT2S computational workflow directly to the clinic
In recent years there have been significant developments in the application of computational workflows to enhance the clinical decision-making process. Many of these applications have been led by groups with an engineering focus, and direct delivery of workflows within the clinical environment remains relatively uncommon. Due to the complex nature of patient-specific anatomy and physiology, which is central to the effective development of state-of-the-art computational tools, workflows often require High Performance Computing (HPC) approaches and infrastructure to produce accurate, clinically relevant output parameters. To improve clinical uptake of such technologies there is a need to provide clinical end-users with direct access to such workflows without exposing the complexity of the underlying HPC environment. This abstract describes the development of a software framework to deliver an existing HPC computational workflow, Computed Tomography to Strength (CT2S), which provides quantitative metrics of bone strength from CT images, directly to the clinical end-user. This provides the opportunity to initiate the request for computational analysis directly from the clinic and to return an analysis report directly to the requesting clinician. Full Abstract
11:35
Xinshan Li
(Invited Speaker)
A finite element investigation of the positioning of Arabin® cerclage pessary in the prevention of spontaneous preterm birth
Spontaneous preterm birth (SPTB) is strongly associated with cervical funnelling. The condition is multifactorial and leads to perinatal and neonatal mortality and morbidity worldwide [1,2], with less-developed countries most affected due to a lack of management strategies. A few treatment options exist for SPTB, including cervical cerclage [4], hormonal therapy using progesterone, and the Arabin® cerclage pessary [3]. The Arabin® cerclage pessary has been widely used due to its low cost and ease of insertion. However, the mechanical interaction between the cervix and the pessary is not well understood [5]. Therefore, the aim of this study is to conduct a preliminary investigation into: (a) the mechanical effect of the pessary on reducing cervical funnelling, and (b) the effect of various pessary positions on supporting the cervix. Full Abstract
11:55
End of Session
12:00
LUNCH
TURING LECTURE THEATRE: Multiscale Modelling
Time
Speaker
Title
10:15
Gábor Zavodszky
(Invited Speaker)
Coupling scheme for a high-performance multiscale blood flow simulation workflow
Increasingly, scientific questions that require modelling solutions target processes residing on multiple scales. This is especially true in biomedicine, where understanding a given disease, or the effects of a treatment, can involve numerous components: for a single problem, blood flow mechanics can be just as important as cellular trafficking or sub-cellular biochemical signalling. One such problem is brain aneurysms, focal dilatations of major brain arteries that carry a risk of rupture. The outcome of a rupture event can be devastating for the patient. Treatment usually involves endovascular brain surgery and the placement of a blood flow diverter implant, with the intent to thrombose the dilatation and thereby exclude it from the active circulation. In the following, a multiscale, multicomponent model will be presented that aims to capture aspects of the thrombus formation mechanism after the medical intervention. The sub-models are fully developed and operational, and the couplings are currently under development. The model structure is discussed from the viewpoint of inter-model communication and the requirements the model components place on the execution environment. Full Abstract
10:35
Sanjay Kharche
In Silico Assessment of Cardio-protection by Therapeutic Hypothermia
Hypothermia is known to impact multiple physiological mechanisms that include neurologic and cardiovascular systems. Therapeutic hypothermia (TH), as a mild reduction of body core temperature, has become the standard cardioprotective treatment for several patient groups, including those affected by ischemia. Patients undergoing long term treatments such as dialysis experience global ischemia in addition to the presence of localized myocardial stunning [1], which together may promote persistent ventricular fibrillation. Fibrillation avoidance or reduction of initiation risk using non-pharmacological TH may be beneficial to critically ill patients.
Basic science experimental studies have shown that hypothermia prolongs the cardiomyocyte action potential [2] and reduces cardiac conduction velocity. However, the clinical effectiveness of TH in arrhythmia abrogation remains debated. In this study, a multi-scale computational cardiology approach was used to illuminate the effects of TH on cardiomyocytes and tissue. Full Abstract
10:50
Hector Martinez-Navarro
HPC simulations for in-silico drug testing in humans: therapeutic strategies in acute myocardial ischemia
Acute myocardial ischemia is a major cause of sudden cardiac death. Anti-arrhythmic treatments or side-effects associated with cancer therapies can produce cardiotoxic effects that increase the occurrence of adverse cardiac events, especially in patients with coronary artery disease. In-vivo and in-vitro drug trials carry complications regarding ethics and costs, whereas cardiotoxic evaluation in animal experiments does not necessarily translate to humans. Full Abstract
11:05
Sanjay Kharche
Is insulating border necessary for human sinoatrial node spontaneous activity?
Human sinoatrial node (SAN) structure-function relationships remain poorly understood, and may be drastically different from those in smaller mammals. Recent studies based on histology for structure and optical mapping for function (e.g. see [1]) suggest that the human SAN may be electrically insulated from atrial tissue by an insulating border, except at four discrete sinoatrial exit pathways (SEPs) that permit atrial excitation by the SAN. Experimental data suggest that the funny current density is threefold lower in the human SAN than in small animals. The lower density of this important pacemaking ion channel may lead to suppression of SAN electrical activity by the physiological atrial load in the absence of substantial SAN electrical insulation. In addition to the experimental evidence, a recent computer modelling study provided some insights into human SAN electrical function [2]. However, previous studies used the simplified Fenton-Karma ionic model to simulate SAN activity, and biophysically and anatomically detailed modelling has yet to be used to investigate the role of SEPs in human SAN behaviour. In this study, a multi-scale biophysically detailed model of the human SAN is presented. The model is being used to investigate the role of SEPs, as well as relevant clinical conditions that promote bradycardia and brady-tachycardia. Full Abstract
11:20
Dwight Nissley and Frederick Streitz
(Invited Speakers)
Cancer results from modifications to cellular decision-making processes. In normal cells, the protein-mediated signaling networks that control growth and movement are tightly regulated. However, mutations that disrupt or over-activate signaling proteins can drive uncontrolled cell growth resulting in cancer. RAS, a peripheral membrane signaling protein, is mutated in 30% of all cancers, especially those of the pancreas, colon and lung. These oncogenic mutations result in the loss of GTPase activity which in turn causes persistent engagement of effectors and enhanced or continuous growth signaling. Full Abstract
11:40
End of Session
12:00
LUNCH
WATSON WATT ROOM: Cloud & High Performance Computing
Time
Speaker
Title
10:15
Tomasz Piontek
Supporting advanced HPC/HTC scientific workloads with QCG services
Efficiency, flexibility and ease of execution of scientific computational workloads have always been key requirements of in-silico experiments. Nowadays, as computational facilities reach exascale and the complexity of applications increases, this has not changed: the same questions and unresolved practical problems remain about how to perform computations easily and effectively. Full Abstract
10:30
Christos Kotsalos
Digital Blood in Massively Parallel CPU/GPU Systems for the Study of Platelet Deposition
We propose a novel high-performance computational framework for the simulation of fully resolved whole blood flow. The framework models blood constituents such as red blood cells (RBCs) and platelets individually, including their detailed non-linear elastic properties and the complex interactions among them. These simulations are particularly challenging because the large number of blood cells (up to billions) stands in contrast with the high computational cost of resolving individual constituents. While classical approaches address this challenge through simplified structural modelling of the deformable bodies (e.g., through mass-spring systems), the present framework guarantees accurate physics and desirable numerical properties through a fully featured FEM model, with computational efficiency of the same order as the more simplified state-of-the-art models. The required numerical performance is achieved through a hybrid implementation, using CPUs for the blood plasma and GPUs for the blood cells. Full Abstract
10:45
Craig Lucas
The POP Centre of Excellence – Improving Parallel Codes
The Performance Optimisation and Productivity (POP) Centre of Excellence [1] is funded through Horizon 2020, like CompBioMed, and is made up of eight partners across Europe [2]. Our remit is to improve the performance of both academic and commercial parallel codes. Working with developers and users we promote a methodology for understanding a code’s performance which helps us go on to improve it. Full Abstract
11:00
Narges Zarrabi
Secure Processing of Sensitive Data on shared HPC systems
In this work we present a novel method for creating secure computing environments on traditional multi-tenant high-performance computing clusters. Typically, current HPC clusters operate in a shared and batch mode, which can be incompatible with the security requirements set by data providers for processing their highly sensitive data. We propose a solution using hardware and network virtualization, which runs on an existing HPC cluster, and at the same time, meets strict security requirements. We show how this setup was used in two real-world cases. The solution can be used generally for processing sensitive data. Full Abstract
We outline the vision of “Learning Everywhere” which captures the possibility and impact of learning methods coupled to traditional HPC methods. A primary driver of such coupling is the promise that learning will give major effective performance improvements for traditional HPC simulations. We will discuss how learning methods and HPC simulations are being integrated, and provide representative examples. We will discuss specific applications and software systems developed for ML driven MD simulations on Summit at Oak Ridge. Full Abstract
11:30
Phil Tooley
Parallelising Image Registration and the HPC Porting Journey
Image registration is widely used in many areas of computational biomedicine, both as a research tool and as a component in workflows providing clinical decision support. There exists a wide range of both open-source and commercial tools for performing image registration based on a variety of different methods [1]. However, these tools are designed to run on a single machine, and the associated limitations on computational performance and available memory place a limit on the maximum size of images that can be handled. A key application for image registration at the University of Sheffield is strain measurement of bone samples using digital volume correlation (DVC) [2]. This makes use of tomographic imaging from synchrotron light sources, which can be many tens to hundreds of gigabytes in size, too large to be handled at full resolution by these existing codes. The solution is therefore to create a parallelised image registration code, capable of leveraging HPC infrastructure to register images of such sizes using the memory and computational capacity of multiple HPC nodes. Full Abstract
11:45
Andy Grant
(Invited Speaker)
Integrating HPC and Deep Learning in converged workflows
In the last few years the use of AI and specifically deep learning techniques has emerged as a key pillar in scientific discovery. While many of the underlying techniques have been around for some time the computational power and data volumes required to make them effective have only recently become available. Deep Learning provides new methods to improve predictive accuracy, response times and insight into new phenomena, often using data sets that would previously have been considered unmanageable. Full Abstract
12:05
LUNCH
KELVIN LECTURE THEATRE: Role of Theory of Modelling and Simulation
Time
Speaker
Title
13:00
Michael Dustin
(Invited Speaker)
An agent-based model for investigation of immunological synapse patterns
We have previously published agent-based models for the immune synapse [1,2]. The first generation model focused on development of the bull’s eye pattern generated by a system with small (~13 nm) TCR-ligand complexes, which are transported towards the central supramolecular activation cluster (cSMAC) and accumulate there, and large (>20 nm) LFA-1-ICAM-1 complexes that provide a “back-fill” of adhesion in the peripheral SMAC (pSMAC) [1]. The second generation incorporated a weak central transport for LFA-1-ICAM-1, enabling formation of a realistic radial distribution of interactions in the pSMAC [2]. Furthermore, we incorporated CD28-CD80 complexes, which are less abundant than LFA-1-ICAM-1 complexes and are the same size as the TCR-ligand complexes, and thus could passively follow TCR-ligand complexes toward the center [2]. Full Abstract
13:20
Benjamin Czaja
Simulation and experimental evidence for the decrease of platelet margination with an increase in volume fraction of stiffened red blood cells in flow
Whole blood is a suspension of cells, red blood cells (RBCs), platelets, and white blood cells, in a protein-rich plasma that collectively has a non-Newtonian rheology. RBCs are the most numerous blood cells, and due to their deformability and bi-concave shape they contribute significantly to the complex rheology of whole blood. Pathologies such as diabetes, sickle cell anemia [1], and HIV have been found to affect the deformability of the red blood cell. In this research we perform numerical and experimental analyses of the effects of the presence of stiffened RBCs on haematocrit profiles and platelet margination in flowing whole blood. Full Abstract
13:40
Shunzhou Wan
From Genome to Personalised Medicine: Cancer Treatment and Discovery of Novel Variants in Qatar
Breast cancer is the most commonly diagnosed cancer in females and a leading cause of cancer death in women. Given the ever-increasing number of breast cancer cases, it is pertinent that we devise high-throughput experimental and computational methods that provide a comprehensive and holistic understanding of the causes of cancer. In the ‘one size does not fit all’ era, personalised medicine is the way forward, given its improved ability to predict treatments that will work effectively for specific patients. Advances in genomic profiling of breast cancer have led to the identification of several key mutations in the disease. An in-depth understanding of the mechanisms of the disease requires not only knowledge of the genome and its variants but also the right tools to fully interpret that knowledge. The pathways of the disease are routed via proteins, and it is their interactions that are amenable to treatment. This leads in turn to clinical decision support for personalised drug treatment. Full Abstract
13:55
Tom McLeish
(Invited Speaker)
The Noisy Physics of Protein Signalling: Global Low Frequency Protein Motions in Allosteric Binding
We present a theory and predictive methodology for how protein allostery can recruit modulation of low-frequency dynamics without a change in protein structure [1]. Elastic inhomogeneities allow entropic ‘signalling at a distance’. Through multi-scale modelling of global normal modes we demonstrate negative co-operativity between the two cAMP ligands in CRP/FNR family allostery (Fig. 1: CAP protein featuring fluctuation-allosteric control sites calculated in an ENM formalism), without change to the mean structure. Crucially, the value of the co-operativity is itself controlled by the interactions around a set of third allosteric ‘control sites’. The theory makes key experimental predictions, validated by analysis of structure and isothermal calorimetry of variant proteins. Furthermore, we find evolutionary selection pressure to conserve residues crucial for allosteric control [3]. The methodology establishes the means to engineer allosteric mechanisms that are driven by low-frequency dynamics, and also suggests a programme of fundamental questions in thermally excited elastic matter [4], including control of biofilament self-assembly [5]. Full Abstract
14:15
Panel Discussion
15:00
REFRESHMENTS
TURING LECTURE THEATRE: Machine Learning applications in Oncology followed by Immunology
Time
Speaker
Title
13:00
Georgia Tourassi
(Invited Speaker)
Artificial Intelligence Solutions to Modernize Cancer Surveillance and Optimize Population-Level Cancer Outcomes
Information extraction, integration, analytics and visualization are critical needs for the Precision Medicine Initiative, which aims to accelerate our understanding of individual differences in people’s genes, environment, and lifestyle and their effect on disease prevention, progression, treatment, and survival. The data-intensive challenge is collecting, integrating, and analyzing massive, multi-source, multi-scale heterogeneous patient data that must be interpreted in the context of other highly relevant but disparate data sources such as socioeconomic, environmental, lifestyle, care delivery, and community infrastructure factors (i.e., the “exposome”). Full Abstract
13:30
Mari Nygard
(Invited Speaker)
Towards personalised cancer prevention: The Digital Cancer Precision Prevention Initiative
We live in the information age, where the flow of knowledge, including medical advice and innovations, quickly reaches each of us. However, existing recommendations for disease prevention, diagnostics and treatment are population-based, or based on highly selected randomized controlled trials, and only seldom account for individual differences. For effective control of globally increasing morbidity and mortality due to cancer, the importance of early detection and intervention cannot be overstated. The estimated spiralling costs of cancer treatment will challenge even the highest-income countries and underline the urgent need to develop preventive efforts.
Knowledge of biological disease mechanisms, along with existing individual data from national population-based health registries, biobanks and surveys, can be tailored into personally designed actions safely, efficiently and quickly.
Cervical cancer screening is an excellent model system for the development of personalised strategies for cancer prevention. It has a proven strong effect for decreasing cancer burden at the population level, and the Norwegian population-based screening program has produced large amounts of individual data that is accessible by centrally organized nationwide registries. Full Abstract
14:00
Becca Asquith
(Invited Speaker)
Immune cell dynamics & control of persistent virus infection
Chronic viral infections such as human immunodeficiency virus (HIV-1), hepatitis C virus (HCV) and human T cell leukemia virus (HTLV-1) are marked by huge between-individual variation in outcome. Some people infected with HIV-1 will develop AIDS in less than 5 years, while others remain healthy for 10 years or more. In HCV infection, some individuals spontaneously clear the virus, while others develop persistent infection and a subsequent risk of liver failure. Similarly, in HTLV-1 infection, some individuals remain lifelong healthy carriers of the virus whilst others develop an aggressive, rapidly fatal leukemia. Full Abstract
14:20
Omer Dushek
(Invited Speaker)
Control of T cell responses by accessory receptors revealed by phenotypic modelling
T cells are important immune cells that are routinely exploited for a number of different therapies. They are activated when ligands bind to various receptors on their surface. This binding initiates signalling pathways that ultimately induce responses important for clearing infections and cancers. A key open question is how ligation of different surface receptors quantitatively controls these responses. To address this, we have been systematically stimulating T cells with different combinations of ligands (input) and measuring their responses (output). Using systematic mathematical inference algorithms, we identify effective pathway models that intuitively explain how inputs are converted to outputs (‘causal inference’). Here, we show that the T cell response output to constant antigen ligand input exhibits perfect adaptation, and that ligation of different accessory receptors (CD2, CD28, LFA-1, CD27, 4-1BB, GITR, and OX40) controls this phenotype differently. Initial results with the inference algorithm suggest that an incoherent feedforward loop coupled to a digital switch can explain perfect adaptation along with the different phenotypes observed for the different receptors. This work offers a new way to infer effective signalling pathways directly from quantitative cellular response data. Full Abstract
14:40
Jonathan Wagg
(Invited Speaker)
Application of Artificial Neural Networks to Infer Pharmacological Molecular-Level Mechanisms of Drug Evoked Clinical Responses
The Roche Clinical Pharmacology Disease Modelling Group (CPDMG) aims to better understand the biological basis of the observed inter-patient variability in clinical responses to drugs administered both as monotherapies and in combination. Administration of drugs to human subjects drives widespread and diverse changes in their biology. However, the majority of these changes are not clinically relevant; they generate noise and are a common source of false positive predictors of clinical response. Full Abstract
15:00
REFRESHMENTS
WATSON WATT ROOM: Imaging & Visualisation
Time
Speaker
Title
13:00
Abbes Amira
(Invited Speaker)
Accelerating Medical Imaging on Multi-core Platforms
In this invited talk, we will address different software and hardware issues related to the design and implementation of medical imaging systems using reconfigurable computing and multicore platforms. A range of algorithms, including the discrete wavelet transform (DWT) and the Ridgelet and Curvelet transforms, will be presented for imaging applications such as medical image denoising, segmentation and compression. The implementation process of these algorithms on reconfigurable platforms will be described. In addition, we will review the latest reconfigurable hardware technologies and development methods for real-time embedded and high-performance computing systems, and will conclude with comprehensive case studies demonstrating the deployment of low-power reconfigurable architectures for algorithm acceleration, together with performance evaluation methods for reconfigurable medical imaging systems. Full Abstract
13:20
Thomas Odaker
(Invited Speaker)
Improved Data Analysis with Virtual and Augmented Reality
The Leibniz Supercomputing Centre (LRZ) of the Bavarian Academy of Sciences and Humanities is not only the IT service provider for the Munich universities but also a competent and reliable research partner. Over the last decade, the LRZ has acquired great expertise in the field of visualisation, focusing primarily on Virtual Reality (VR) and, more recently, Augmented Reality (AR). Large-scale VR installations (such as CAVE-like systems [1] or ultra-high-resolution stereoscopic displays) have been a staple in certain industries for decades. Their complexity and cost, though, have always been a limiting factor and have prevented widespread availability and use. However, thanks to advances in hardware and software development over the last few years, VR technologies have taken a leap forward in terms of accessibility, usability, quality, and affordability, especially with regard to head-mounted displays (HMDs). Full Abstract
13:40
Abbes Amira
Automatic Cerebral Aneurysm Segmentation Using Contourlet Transform and Hidden Random Field Model Template
Cerebral Aneurysm (CA) is a vascular disease that affects approximately 1.5–5% of the general population, mostly adults. Sub-Arachnoid Hemorrhage, caused by a ruptured CA, has high morbidity and mortality rates. Therefore, radiologists aim to detect and diagnose this disease at an early stage to prevent or reduce its consequential damage.
This work contributes to the CA segmentation field by proposing a novel automated algorithm. Specifically, the contourlet transform, a multiresolution technique, and the Hidden Markov Random Field model with Expectation Maximization, a statistical technique, are the two main adopted approaches. The first helps extract image features not apparent at the normal image scale, while the second segments images in the contourlet domain based on spatial contextual constraints. The proposed algorithm shows promising results on the tested Three-Dimensional Rotational Angiography (3D RA) datasets, where both objective and subjective evaluations were carried out. Full Abstract
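The statistical step described above, fitting an intensity model and then enforcing spatial contextual constraints, can be illustrated in heavily simplified form by a two-class EM fit followed by a neighbourhood-smoothing pass. This is a generic toy sketch of the "EM plus spatial prior" idea only, not the authors' contourlet-domain HMRF-EM algorithm; the synthetic 1-D image and all parameters are invented.

```python
import math
import random
import statistics

def gauss(x, mu, sigma):
    """1-D Gaussian density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def em_two_class(x, iters=50):
    """Fit a two-component Gaussian mixture by EM; return hard labels and means."""
    mu = [min(x), max(x)]                       # crude initialisation
    sd = [statistics.pstdev(x) or 1.0] * 2
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior probability that each pixel belongs to class 1.
        r = []
        for xi in x:
            p0 = pi[0] * gauss(xi, mu[0], sd[0])
            p1 = pi[1] * gauss(xi, mu[1], sd[1])
            r.append(p1 / (p0 + p1))
        # M-step: re-estimate means, std devs and mixture weights.
        n1 = sum(r)
        n0 = len(x) - n1
        mu[0] = sum((1 - ri) * xi for ri, xi in zip(r, x)) / n0
        mu[1] = sum(ri * xi for ri, xi in zip(r, x)) / n1
        sd[0] = max(1e-3, math.sqrt(sum((1 - ri) * (xi - mu[0]) ** 2 for ri, xi in zip(r, x)) / n0))
        sd[1] = max(1e-3, math.sqrt(sum(ri * (xi - mu[1]) ** 2 for ri, xi in zip(r, x)) / n1))
        pi = [n0 / len(x), n1 / len(x)]
    return [1 if ri > 0.5 else 0 for ri in r], mu

def smooth_labels(labels, passes=2):
    """Majority vote over each pixel's 1-D neighbourhood: a crude spatial prior."""
    out = list(labels)
    for _ in range(passes):
        prev = list(out)
        for i in range(1, len(prev) - 1):
            out[i] = 1 if prev[i - 1] + prev[i] + prev[i + 1] >= 2 else 0
    return out

random.seed(0)
# Synthetic 1-D "image": dark background (mean 2) with a bright region (mean 8).
img = ([random.gauss(2, 0.5) for _ in range(40)]
       + [random.gauss(8, 0.5) for _ in range(20)]
       + [random.gauss(2, 0.5) for _ in range(40)])

labels, means = em_two_class(img)
labels = smooth_labels(labels)
```

A real HMRF-EM segmenter would replace the majority vote with a Potts-prior energy minimisation (e.g. ICM) inside the EM loop, and would operate on contourlet coefficients of a 3-D volume rather than raw 1-D intensities.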
13:55
Guillermo Marin
(Invited Speaker)
Animating the Virtual Human: Applying movie-industry tools and techniques to data visualization
Impactful and memorable images are a great resource for science dissemination, improving the reach and engagement of complex, information-rich topics. Yet scientists generally have neither the training nor the time to create high-quality, appealing imagery, while journalists and designers lack both a deep understanding of the data itself and the tools to deal with scientific datasets. On its own, each group can easily fall into the trap of either detailed but uninteresting imagery, or the non-rigorous "artist's rendition" style of visuals generally dismissed by domain scientists. When scientists and designers work together, however, they can create accurate and appealing images and stories; furthermore, new approaches emerge from the interaction, like the one we present here. Inspired by popular web libraries that have allowed journalists and designers to incorporate data into their workflow and produce high-quality visualizations, we created a pipeline and a set of tools that allow designers and animators to import large scientific datasets (from 3D simulations) directly into industry-level software tools (Maya, Blender, the Adobe suite, etc.), where they can control and manipulate the visual style more precisely and reach higher levels of visual quality than with scientific visualization tools. The scientists, in turn, can become more than advisors to the designers and, thanks to the automation afforded by the coupled tools, create new visualizations for their publications and presentations. Full Abstract