SPEAKERS & ABSTRACTS

Speakers

No. Speaker Affiliation Department
1 Nasser Mohieddin Abukhdeir University of Waterloo Chemical Engineering
Title: Kinetic Monte Carlo Simulations of Electrodeposition using the Embedded-atom Method
Author: Nasser Mohieddin Abukhdeir
Abstract: T. Treeratanaphitak, M. Pritzker and N.M. Abukhdeir, Department of Chemical Engineering, University of Waterloo, Waterloo, Ontario. Electrodeposition is a common and attractive method for the production of metal and alloy thin films and coatings. Depending on the intended application, different surface structures are preferred over others. Control of the surface structure requires a better understanding of the effect of operating conditions during electrodeposition on the resulting structure than is currently available. Toward this end, a three-dimensional on-lattice kinetic Monte Carlo (KMC) method is presented for the simulation of electrodeposition of metals from aqueous solution. The embedded-atom method (EAM) is used to determine the atom/atom interaction energy, which is a key component of the methodology. The EAM is a semi-empirical, multi-body potential which is suitable for classical mechanics simulations of metallic systems. In this study, two surface mechanisms are considered during the electrodeposition process: deposition and surface diffusion. The KMC methodology is capable of simulating electrodeposition processes at the sub-micron level over long time scales at a fraction of the computational cost required by molecular dynamics (MD) simulations (serial versus parallel computation; hours versus days). This enables simulation of electrodeposition processes over experimentally relevant time scales, on the order of seconds. Simulation results are shown to predict equilibrium surface configurations that are consistent with those obtained by parallel MD calculations.
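For orientation, the heart of an on-lattice KMC simulation is a rejection-free event loop: select an event with probability proportional to its rate, then advance the clock by an exponentially distributed increment. Below is a minimal, generic sketch of one such step (not the authors' code; the rates are placeholder inputs, whereas a real implementation would derive them from EAM energy barriers for deposition and diffusion events).

```python
import math
import random

def kmc_step(rates, rng=random.random):
    """One rejection-free (Gillespie/BKL-style) KMC step.

    rates: list of event rates (e.g. deposition and hop rates derived
    from an energy model such as the EAM; here they are arbitrary).
    Returns (chosen_event_index, time_increment).
    """
    total = sum(rates)
    # Pick an event with probability proportional to its rate.
    r = rng() * total
    chosen = len(rates) - 1  # fallback guards against round-off
    acc = 0.0
    for i, rate in enumerate(rates):
        acc += rate
        if r <= acc:
            chosen = i
            break
    # Advance the clock by an exponentially distributed waiting time.
    dt = -math.log(rng()) / total
    return chosen, dt
```

Because time advances by the total rate rather than a fixed MD timestep, long quiescent periods cost nothing, which is the source of the speedup over MD noted above.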
2 Bill Atkinson Trent University Physics & Astronomy
Title: Monte Carlo Simulations of Fluctuating Phases in Cuprate Superconductors
Author: Bill Atkinson
Abstract: Despite nearly 25 years of research, high temperature superconductors remain, in many ways, mysterious. A persistent question is how these materials make the transition from Fermi liquid to Mott insulator as the doping is reduced. Recent experiments suggest that a number of nonsuperconducting phases emerge at low doping. A remarkable feature of these phases is that they appear to be slowly-fluctuating with short range order. To a large extent, theoretical understanding of these nonsuperconducting phases is built on models of static long range order. In this talk, I will describe our recent Monte Carlo simulations that were specifically designed to probe the role of fluctuations. Because photoemission experiments are one of the most important probes of cuprate superconductors, I will emphasize the effect of fluctuations on spectral properties. I will show that in many cases, the spectrum of the strongly fluctuating phase is qualitatively different from that of the ideal static phase.
3 Ashkan Dehghan McMaster University Physics and Astronomy
Title: Emergence of Hierarchical Morphologies in Binary Blends of Diblock Copolymers
Author: Ashkan Dehghan
Abstract: The self-assembled structures formed in binary blends of AB/CD diblock copolymers are studied using real-space self-consistent field theory (SCFT), focusing on cases with attractive A/C and repulsive B/D interactions. The attractive A/C interaction prevents macroscopic phase separation, whereas the repulsive B/D interaction leads to the formation of complex nanoscopic structures. The combination of these features makes the AB/CD blend an ideal model system for the study of hierarchical self-assembly. Our results demonstrate that the B/D separation leads to the emergence of hierarchical alternating lamellar, cylinder and checkerboard morphologies from the classical lamellar structure. Similar behaviour in the cylindrical phase, where an increase in the B/D interaction leads to a phase transition from classical hexagonally packed cylinders to alternating cylinders, has also been predicted. The theoretical predictions are consistent with available experiments and, more importantly, provide an interesting route for the engineering of hierarchically ordered structures using block copolymer blends.
4 James Desjardins Brock University Psychology
Title: Integrating Electrophysiology with High-Performance Computing Facilities
Author: James Desjardins
Abstract: There is currently a pronounced increase in the sophistication of analysis strategies being reported in electroencephalographic (EEG) research. There have been substantial technical improvements in preprocessing (e.g. using independent component analysis (ICA) for artifact detection and correction), signal processing (e.g. exploring the influence that cortical neural networks have on each other using Granger causality), and hypothesis testing (e.g. robust estimation and single-subject statistics). Two clear barriers limit the accessibility of these advanced methods to all research labs: (1) the expertise required to implement the programming of these new strategies, and (2) the computational resources required to complete the processing. The first barrier is being addressed by the EEGLab project developed at the Swartz Center for Computational Neuroscience (at UCSD). EEGLab is the field-leading open-source Matlab toolbox whose key features include (a) the implementation of state-of-the-art processing tools, (b) easy integration of contributed tools from the community, and (c) a stable graphical user interface (GUI) that is accessible to a wide range of users. In this talk we introduce our EEGLab plugin “Batch2.0.0”, which addresses the second barrier by providing a GUI within EEGLab for efficient and flexible SHARCNET queue submission, and demonstrate the submission of a multi-level automated preprocessing routine. Further, we discuss the development of this tool as a vehicle for community sharing and refinement of universal and optimized analysis scripts.
5 Kevin Green University of Ontario Institute of Technology Faculty of Science
Title: Computing periodic solutions in a neural mean-field model
Author: Kevin Green
Abstract: Neural mean-field models attempt to describe the electrical signals of networks of neurons at length scales much longer than the extent of a given neuron. We investigate periodic solutions with certain spatial symmetries in a particular mean-field model of the cortex (Liley et al., Network: Comput. Neural Syst. (2002) 13, 67-113). The solutions we study originate from Turing-Hopf bifurcations, where spatially homogeneous equilibria destabilize into periodic solutions with some spatial dependence. The detection of these bifurcations can be done simply from analysis of the (low-dimensional) spatially homogeneous reduction of the equations, but studying the spatiotemporal solutions that develop must be done with the full PDE model. I will present an overview of the model and our method, implemented in PETSc, for refining approximate periodic solutions.
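The refinement idea can be illustrated in miniature: a periodic solution is a fixed point of the period map, which can be polished with Newton's method. The scalar sketch below is illustrative only; the actual computation solves a large discretized PDE system through PETSc's parallel solvers, and `flow_map` here is a stand-in for integrating the model over one period.

```python
def refine_periodic(flow_map, x0, tol=1e-10, max_iter=50, h=1e-6):
    """Newton refinement of a periodic solution, posed as a fixed
    point of the period map: solve g(x) = flow_map(x) - x = 0.
    Scalar sketch; flow_map stands in for one-period time integration."""
    x = x0
    for _ in range(max_iter):
        g = flow_map(x) - x
        if abs(g) < tol:
            break
        # Finite-difference approximation to dg/dx.
        dg = (flow_map(x + h) - (x + h) - g) / h
        x -= g / dg
    return x
```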
6 Ben Keller McMaster University Physics & Astronomy
Title: Simulating Gas and Gravity in Galaxies with Gasoline
Author: Ben Keller
Abstract: In essentially every object considered by astrophysicists, hydrodynamics and gravitation are key ingredients to understanding the overall evolution and behaviour. I review Gasoline, a flexible, parallel simulation code, including its hydrodynamic method, tree-based gravity solver and other components. Gasoline has been used to simulate a wide variety of astrophysical problems on scales ranging from individual stars and their planetary systems up to the large-scale evolution of the universe. Many important processes take place on size and time scales that are too computationally expensive to model directly and must be captured using so-called "subgrid" models. I present my latest results on capturing the unresolved effects of stellar feedback (supernova explosions, radiation from massive stars, etc.) in simulations of galaxies.
7 Marwa Khater University of Windsor Computer Science
Title: Adaptation and Genomic Evolution in EcoSim
Author: Marwa Khater
Abstract: Artificial-life evolutionary systems facilitate the study of many fundamental questions in evolutionary genetics. Behavioral adaptation requires long-term evolution with the continuous emergence of new traits, governed by natural selection. As in many disciplines, simulation modeling plays a central role in studying evolutionary processes: biological studies that would require hundreds of years of data can instead be carried out by simulations that produce results in a matter of hours or days, depending on the computational cost of the system. We model organisms' genomes as coding for their behavioral model, represented by fuzzy cognitive maps (FCM), in an individual-based evolutionary ecosystem simulation (EcoSim). In this paper we show how individuals in EcoSim, an evolutionary predator-prey ecosystem simulation, follow the Darwinian evolutionary process through natural selection. Our system allows the emergence of new traits and the disappearance of others throughout the course of evolution. We show how genetic evolution and diversity govern the adaptation process. We show that EcoSim's individuals adapt to their changing environment by comparing their behavior with a neutral model, a partially randomized version of EcoSim. We use the Shannon entropy, a measure of unpredictability and disorder from information theory, as a measure of genetic diversity, and present the difference in entropy between EcoSim and the neutral model to emphasize the adaptive characteristics of EcoSim. We show how entropy, used to measure genetic diversity, behaves differently in the two systems. The fluctuation in the entropy curves for EcoSim shows how individuals learn and adapt to their environment. The neutral model, by contrast, shows steadier curves, due to its added randomness and the elimination of natural selection.
Furthermore, we investigate the relationship between genetic diversity and species fitness and present the correlations found between these two measures in EcoSim. The high correlation values between species fitness and genetic diversity strongly indicate how genetic diversity affects the well-being of a species. A validation step was performed using machine learning techniques: a random forest classifier was built to predict the correlation values based on internal and physical properties of species used as features. The rules discovered by the rule learner, which appear to be biologically pertinent, gave us more understanding of the conditions affecting the correlation between genetic diversity and fitness.
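The Shannon entropy measure used above can be sketched as follows. This is a generic illustration, not EcoSim's actual FCM-genome encoding; a "genotype" here is simply any hashable label.

```python
import math
from collections import Counter

def shannon_entropy(population):
    """Shannon entropy (in bits) of the genotype distribution of a
    population; higher values indicate more genetic diversity."""
    counts = Counter(population)
    n = len(population)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```

A monomorphic population has entropy 0, while a population spread uniformly over k genotypes has entropy log2(k), which is why the randomized neutral model's curves sit higher and steadier than EcoSim's.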
8 Yusuke Koda University of Waterloo Mechanical and Mechatronics Engineering
Title: Lattice Boltzmann Method for Large Eddy Simulations
Author: Yusuke Koda
Abstract: The objective of this study was to assess the validity of lattice Boltzmann method (LBM) based large eddy simulation (LES) of the flow over a square cylinder confined in a channel. The LBM is a numerical method derived from the Boltzmann equation and kinetic theory. In contrast to conventional computational fluid dynamics methods that discretize and solve the Navier-Stokes equations, the LBM uses discrete particle velocity distribution functions based on microscopic fluid physics to simulate the hydrodynamic flow field. It has recently been gaining popularity due to its simple algorithm and proven accuracy, as well as its parallel scalability. In this work, the LBM-LES was implemented on multiple GPUs to perform a large-scale calculation of the turbulent flow over a square cylinder confined in a channel. The obtained results were compared against experimental and numerical data to evaluate the accuracy of the LBM-LES.
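For background, the kernel of a standard LBM solver is the equilibrium distribution toward which the collision step relaxes. Below is a minimal D2Q9 sketch in the generic textbook (BGK) form, not the authors' GPU-LES code.

```python
# D2Q9 lattice: 9 discrete velocities and their quadrature weights.
E = [(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1),
     (1, 1), (-1, 1), (-1, -1), (1, -1)]
W = [4/9] + [1/9]*4 + [1/36]*4

def equilibrium(rho, ux, uy):
    """Second-order D2Q9 equilibrium distributions f_i^eq for
    density rho and macroscopic velocity (ux, uy)."""
    usq = ux*ux + uy*uy
    feq = []
    for (ex, ey), w in zip(E, W):
        eu = ex*ux + ey*uy
        feq.append(w * rho * (1 + 3*eu + 4.5*eu*eu - 1.5*usq))
    return feq
```

By construction the distributions reproduce the macroscopic moments exactly: they sum to the density, and their first velocity moment recovers the momentum, which is what makes the per-node update so simple and GPU-friendly.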
9 Ping Liang Brock University Biological Sciences
Title: Personal Genomics: the Computational challenges and Solutions
Author: Ping Liang
Abstract: The rapid advancement of DNA sequencing technologies in recent years offers realistic hope of sequencing a human genome for $1000 or less, putting personal genome services within the reach of more people and eventually everyone. Personal genomics and the resulting genome-based medicine offer great promise for future medicine and public health systems by enabling prediction of every individual's risk for diseases and responses to different drugs and other medical treatments. However, these promises currently face many challenges. Two of the major challenges are: 1) the astronomical demand on computing resources, both CPU time and data storage space, and 2) the difficulty of identifying and characterizing the variants/mutations in a test genome. These challenges arise because the genome sequence data provided by current technologies are randomly sampled short sequence fragments with a high level of redundancy, rather than continuous sequences representing an individual's chromosomes in diploid format. This leads to extremely large file sizes for genome sequence data and the need to re-map the huge number of sequences onto a reference genome, which is very computationally intensive. Furthermore, we currently lack resources/databases documenting known genome variants in detail, and the detection of variants based on a single genome suffers from high levels of false positives and false negatives. In this talk, I will provide a more detailed picture of the computational needs and challenges imposed by personal genomics, and I will also describe our new project and strategies aimed at tackling the difficulties of genome sequence analysis and the roles of HPC in our projects.
10 Apichart Linhananta Lakehead University Physics
Title: Stability and Cooperativity of Proteins in Complex Solutions and of Confined Proteins
Author: Apichart Linhananta
Abstract: In living cells, the presence of complex biomacromolecules strongly affects the stability of folded proteins. The overall effect depends on the size and concentration of the crowding agents as well as on their chemical properties. In a recent publication (Linhananta et al. 2011, Biophys. J. 100: 459-468) we employed a simple Go model of the Trp-cage protein in spherical solvents to assess the effects of macromolecular crowding on protein stability. The model uses a single control parameter that varies the protein affinity of the spherical solvents. Repulsive solvents, which mimic protein-protecting osmolytes, stabilize the structure of folded proteins, increasing the folding temperature. Attractive solvents, which mimic denaturants (such as urea), destabilize proteins, decreasing the folding temperature. Data analysis using the weighted histogram analysis method (WHAM) found an entropy-driven stabilization mechanism of proteins by osmolytes. In contrast, denaturants destabilize proteins by an enthalpy-driven mechanism. The model was generalized to include variation in solvent size and volume fraction (Linhananta et al. 2012, J. Phys.: Conf. Ser. 341: 012009), as well as to protein systems in confinement. The confined proteins are encapsulated in hydrophobic or hydrophilic rectangular walls, to mimic the chaperonin GroEL/ES. The cooperativity of the systems is assessed by the method developed by Chan and coworkers (Chan et al. 2004, Methods Enzymol. 380: 350-379). In general, the cooperativity remains the same for confined proteins and proteins in denaturants, but in some cases the cooperativity decreases in the presence of osmolytes, despite an increase in the stability of the native state. This unusual behavior is attributed to the loss of conformational entropy due to osmolytes.
11 Abdullah Mahboob Brock University Biological Sciences
Title: Towards an Artificial Photosystem II: Second Generation of the E. coli Bacterioferritin 'Reaction Center'
Author: Abdullah Mahboob
Abstract: Photosystem II (PSII) of photosynthesis has the unique ability to photochemically oxidize water and evolve oxygen. Recently, an engineered bacterioferritin photochemical 'reaction center' (BFR-RC) using a zinc chlorin pigment (ZnCe6) in place of the native heme has been shown to oxidize a bound manganese ion and a tyrosine residue, thus mimicking two of the key reactions on the donor side of PSII. We determined the redox potential of ZnCe6 in three organic solvents using cyclic voltammetry (CV) measurements and density functional theory (DFT) calculations. Based on the experimental value from CV measurements, as well as calculations showing the shift due to the protein environment, the redox potential corresponding to the first oxidation of ZnCe6 in BFR-RC was determined to be 640 mV. The first oxidation of ZnCe6 was found to be sufficient for the oxidation of the manganese cluster, but not sufficient to oxidize the tyrosine residue. The redox potential corresponding to the second oxidation, however, appears to be sufficient to oxidize tyrosine. Based on our calculations and experimental work, we propose that ZnCe6 oxidizes the manganese cluster and tyrosine separately. In order to develop a BFR reaction centre capable of oxidizing tyrosine and the manganese cluster in succession through the first oxidation alone, we used a phosphorus porphyrin pigment instead of ZnCe6. The phosphorus pigment will be axially cross-linked to the BFR-RC through two cysteine residues. Toward this goal, we have synthesized a series of sulfur-linked phosphorus pigments to examine their oxidation and optical properties. DFT calculations show that the sulfur-linked phosphorus porphyrins will have an oxidation potential near 1.4 V. This oxidation potential is sufficient to oxidize tyrosine, as well as being close to that of P680.
12 Ralf Meyer Laurentian University Dept. of Mathematics and Comp. Science and Department of Physics
Title: Task Approach to Parallel Molecular-Dynamics Simulations
Author: Ralf Meyer
Abstract: Molecular-dynamics simulations with short-range forces are frequently parallelized through spatial decomposition. In simulations of complex systems like nanodevices or nanostructured materials, however, spatial decomposition is often inefficient since the particles are not distributed homogeneously within the system. To avoid this problem, a different approach to the parallelization of molecular-dynamics simulations with short-range forces has been developed in this work. The calculation of the forces in the simulation is divided into a large number of small tasks that are dynamically assigned to a thread pool. The execution of the tasks is controlled by a dependency-driven task schedule that guarantees that accesses to the particles are free from race conditions. Timing tests show that the new method achieves consistent speedups for homogeneous as well as inhomogeneous systems. For homogeneous systems the speedups are comparable to those achieved by spatial decomposition; for inhomogeneous systems the new approach is clearly superior.
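The task idea can be illustrated with a toy force calculation: each task processes its own slice of the interaction-pair list into a private force array, so no two tasks ever write to the same particle concurrently, and the partial results are summed afterward. This sketch avoids races by reduction rather than by the dependency scheduling described in the talk, and uses a toy 1D spring force in place of a real short-range potential.

```python
from concurrent.futures import ThreadPoolExecutor

def pair_forces(positions, pairs, k=1.0):
    """Forces for a list of (i, j) interaction pairs: a toy linear
    spring force f = k * (x_j - x_i) in 1D, standing in for a real
    short-range potential."""
    forces = [0.0] * len(positions)
    for i, j in pairs:
        f = k * (positions[j] - positions[i])
        forces[i] += f
        forces[j] -= f
    return forces

def task_parallel_forces(positions, pairs, n_tasks=4):
    """Split the pair list into tasks; each task writes into its own
    private force array (no races), then the partials are summed."""
    chunks = [pairs[t::n_tasks] for t in range(n_tasks)]
    with ThreadPoolExecutor(max_workers=n_tasks) as pool:
        partials = list(pool.map(lambda c: pair_forces(positions, c), chunks))
    return [sum(p[i] for p in partials) for i in range(len(positions))]
```

Because tasks are sliced from the pair list rather than from space, load balance does not depend on how evenly the particles fill the simulation box, which is the point of the approach for inhomogeneous systems.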
13 Varvara Nika York University Mathematics & Statistics
Title: Change Detection of MR images using EigenBlockCD Algorithm
Author: Varvara Nika
Abstract: Change detection in MR imaging is the process of identifying differences between serial MR images taken at different times. During this process, an efficient algorithm should detect disease-related changes while rejecting unimportant ones induced by noise, misalignment, and other common acquisition-related artifacts. In this talk we present the EigenBlockCD algorithm for detecting changes in a series of MR images based on local dictionary learning techniques. Our new approach uses the L2 norm as the similarity measure to learn the dictionary, and principal component analysis to eliminate redundancy and increase computational efficiency. Synthetic and real MR images are used to compare the performance of the algorithm against the standard simple-differencing method. A statistical validation analysis of the performance evaluation is also presented. Experiments show that our algorithm is robust to such unimportant changes and has the potential to be used for detecting changes in serial MR images.
14 Kyle Pastor McMaster University Physics & Astronomy
Title: Exploring the properties of the bilayer membrane
Author: Kyle Pastor
Abstract: The lipid bilayer is immensely important in many biological functions of the living cell. Some of these functions require topological changes of the membrane which lead to regions of high curvature. The standard treatment for exploring the effect of bending on the free energy of a system is the Helfrich model; however, this model is limited to small membrane curvatures. We extend the Helfrich model by including higher-order curvature terms and investigate their effect on the free energy of the system. We also apply these methods to understand and calculate the edge tension of an open-ended membrane, which can help describe pore formation and other biological processes. These calculations are performed using self-consistent field theory (SCFT), which is computationally demanding due to the repeated solution of modified diffusion equations.
15 Pawel Pomorski University of Waterloo SHARCNET
Title: GSML: Open-source library for solving PDEs using the Fourier continuation method
Author: Pawel Pomorski
Abstract: A recently introduced Fourier continuation method, the FC(Gram) method, was implemented for the high-accuracy numerical solution of partial differential equations in heterogeneous computing environments. This method enables the use of the Fourier spectral method on non-periodic domains through efficient generation of a periodic continuation. Consequently, the main computational cost of the method lies in the fast Fourier transform (FFT), which benefits from GPU acceleration. An extension to the FC(Gram) method was developed to enable Neumann and Robin boundary conditions. The methodology was optimized to benefit from next-generation "fused" CPU/GPU computing architectures. Usage and performance of this package is demonstrated through the solution of transient liquid crystal dynamics with weak surface anchoring, a second-order partial differential equation system with Neumann boundary conditions. Details of the implementation will be discussed, including its implementation in Python/OpenCL, performance on fused CPU/GPU processors, and our current work towards implementing MPI parallelization. The development of this library is supported by the SHARCNET Dedicated Programming support program.
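As background, the spectral differentiation that FC(Gram) extends to non-periodic data looks like this on a periodic domain. A naive O(N^2) DFT is used here purely for clarity; production code (and the GPU-accelerated step in the talk) would use an FFT.

```python
import cmath
import math

def spectral_derivative(f):
    """Derivative of periodic samples f on [0, 2*pi) via the DFT:
    differentiate mode k by multiplying its coefficient by i*k.
    Naive O(N^2) transform for clarity; use an FFT in practice."""
    n = len(f)
    # Forward DFT.
    F = [sum(f[j] * cmath.exp(-2j * cmath.pi * k * j / n) for j in range(n))
         for k in range(n)]
    # Multiply by i*k using signed wavenumbers in (-n/2, n/2].
    for k in range(n):
        kk = k if k <= n // 2 else k - n
        F[k] *= 1j * kk
    # Inverse DFT (real part; imaginary part is round-off).
    return [sum(F[k] * cmath.exp(2j * cmath.pi * k * j / n)
                for k in range(n)).real / n
            for j in range(n)]
```

For smooth periodic data this converges spectrally; applied to non-periodic data it suffers the Gibbs phenomenon, which is exactly the problem the periodic continuation in FC(Gram) is designed to remove.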
16 Sanjay Prabhakar Wilfrid Laurier University Mathematics
Title: Coupled multiphysics, barrier localization, and critical radius effects in embedded nanowire superlattices
Author: Sanjay Prabhakar
Abstract: The new contribution of this work is to develop a cylindrical representation of an established multiphysics model for embedded nanowire superlattices (NWSLs) of wurtzite structure, comprising a coupled, strain-dependent 8-band k.p Hamiltonian in cylindrical coordinates, and to investigate the influence of coupled piezo-electromechanical effects on barrier localization and the critical radius in such NWSLs. The coupled piezo-electromechanical model for semiconductor materials takes into account strain, piezoelectric effects and spontaneous polarization. Based on the developed 3D model, the band structures of electrons (holes) obtained from COMSOL multiscale multiphysics modeling in Cartesian coordinates are in good agreement with the values obtained from our earlier 2D model in cylindrical coordinates. Several parameters, such as lattice mismatch, piezoelectric fields, and valence and conduction band offsets at the heterojunction of the $\mathrm{Al_xGa_{1-x}N/GaN}$ superlattice, can be varied as a function of the Al mole fraction. When the band offsets at the heterojunction of $\mathrm{Al_xGa_{1-x}N/GaN}$ are very small and the influence of the piezo-electromechanical effects can be minimized, the barrier material can no longer be treated as an infinite potential well. In this situation, it is possible to visualize the penetration of the Bloch wave function into the barrier material, which provides an estimate of the critical radii of NWSLs. In this case, the NWSLs can act as inversion layers. Finally, we investigate the influence of the symmetry of square and cylindrical NWSLs on the band structures of electrons in the conduction band. We argue that when the radius of the cylindrical NWSLs or the side length of the square NWSLs is small (for example, $a=\sqrt\pi R < 10~\mathrm{nm}$; see the text for the meaning of the notation), the influence of edge effects on the band structures of these NWSLs can be profound.
However, for larger lateral sizes of the NWSLs, the influence of such edge effects on the band structures due to the different square and cylindrical symmetry orientations cannot be observed.
17 Devin Rotondo Nipissing University Computer Science and Mathematics
Title: Wavelet Correlation Techniques for Cardiac Information and Electromyography
Author: Devin Rotondo
Abstract: Devin Rotondo, Andrea Macedo, Dean Hay and Mark Wachowiak. Time-frequency analysis has become a standard tool in uncovering and interpreting frequency characteristics of biomedical signals. In particular, research into wavelet-based techniques and applications has generated a vast literature. The goal of the current research is to analyze and relate cardiac information and muscle activity using advanced signal processing techniques, including the continuous wavelet transform (CWT), the cross-wavelet transform (XWT), and wavelet transform coherence (WTC). These approaches center on detailed mathematical and statistical computations to extract and correlate signal frequency components. The XWT is well suited for comparing the phase angles of two signals and for displaying areas with high common power; with subsequent application of the WTC, correlations between the signals are computed in time and frequency space. Although the WTC is well known in the analysis of geophysical data, it is relatively new in cardiac and EMG analysis. An electrocardiogram (ECG) captured blood volume data, and electromyography (EMG) was used to record muscle activity. As a proof of concept, blood flow and EMG data were gathered during experiments conducted on a group of women's varsity hockey players, with the goal of better understanding the variability of human performance as a result of heart rate and blood volume changes during muscle motion. The XWT and WTC have proven to be valuable tools for verifying the correlations between blood flow oscillations, and for comparing the synchrony of agonist, synergist, and antagonist muscles for given movements. Additionally, in separate experiments, heart rate and blood pressure relationships were quantified for correlations using the WTC. Plans for the immediate future include real-time signal correlations, facilitated by high-performance computing, specifically high-throughput multicore and GPU processors.
The results of this research have potential applications in the design of sports drills and exercises, and in rehabilitative physiotherapy.
18 Bryan Sarlo Nipissing University Department of Computer Science and Mathematics
Title: Optimizing Particle Swarm on Heterogeneous Systems
Author: Bryan Sarlo
Abstract: Bryan B. Sarlo, Alexander E. Lambe Foster, Mark P. Wachowiak. Non-convex functions are very difficult to minimize because of the numerous local minima within their landscapes. One such cost function is a high-dimensional composite of several weighted elementary functions (Ackley, Weierstrass, etc.). A real-world example is the geophysics static correction function for aligning geological plates. Complex functions such as these need to be minimized with global optimization, and particle swarm optimization (PSO) is a representative stochastic, population-based approach. In PSO, a population of individual agents behaves analogously to birds in a flock searching for food (i.e. the minimum of the cost function). PSO and its variants, such as adaptive PSO (APSO), require many computationally expensive operations, including the particles' motion updates, distance matrix calculations, and weight calculations. Additionally, the functions minimized here, (1) a 200D composite function and (2) a 20D geostatic correction function, are very difficult, time-intensive, and resource-hungry. Therefore, to improve efficiency, it is beneficial to isolate and parallelize the more computationally intensive components of PSO. Some of these components are better suited to multi-core parallelization, while others are more conducive to fine-grained GPU parallelism using NVIDIA's CUDA architecture. The heterogeneously parallelized APSO was run with each of the two large cost functions on SHARCNET's Angel system because of its multi-core and CUDA GPU capabilities. Results show that the parallelism in the algorithm improves execution time, with speedups of more than 8x; distance matrix computations on the GPU exhibited a speedup of 11x.
This research suggests that investigators in a variety of disciplines should analyze the components of their cost functions or other computationally expensive tasks to best exploit parallelism, and to increase efficiency by performing the tasks on heterogeneous multi-threaded and GPU-accelerated systems where appropriate.
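For reference, the particle motion update that such parallelizations accelerate can be sketched as follows. This is the standard inertia-weight form of global-best PSO, not the specific APSO variant used in the talk; in APSO, the coefficients themselves adapt during the search.

```python
import random

def pso_step(positions, velocities, pbest, gbest,
             w=0.7, c1=1.5, c2=1.5, rng=random.random):
    """One global-best PSO motion update. Each particle's velocity
    blends its inertia (w), a pull toward its personal best (c1),
    and a pull toward the swarm's global best (c2)."""
    for p in range(len(positions)):
        for d in range(len(positions[p])):
            velocities[p][d] = (w * velocities[p][d]
                                + c1 * rng() * (pbest[p][d] - positions[p][d])
                                + c2 * rng() * (gbest[d] - positions[p][d]))
            positions[p][d] += velocities[p][d]
    return positions, velocities
```

Each particle's update is independent of the others within a step, which is why the motion updates map naturally onto fine-grained GPU threads, while the all-pairs distance matrix is the heavier kernel singled out above.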
19 Joey Sham McMaster University Physics and Astronomy
Title: Zechmeister-Kurster Periodogram on GPU
Author: Joey Sham
Abstract: In many fields of physics, it is often important to deal with the periodicity of datasets. This can usually be done with the discrete Fourier transform (DFT) or fast Fourier transform (FFT). However, these tools can only be used for evenly spaced datasets. There are many cases in which evenly spaced data is not available, so other tools are needed. One approach is to use approximation or interpolation, then apply the FFT to the result. Another is to transform the dataset statistically, then calculate the statistical significance of the result to determine whether a signal is present or the data is pure noise. One such transform for unevenly spaced data is the Lomb-Scargle periodogram. In this talk, I will present a GPU implementation of the Zechmeister-Kurster periodogram, a generalization of the Lomb-Scargle periodogram. The differences between the results of the Lomb-Scargle and Zechmeister-Kurster periodograms will be discussed. I will also discuss the derivation of the statistical significance of the Zechmeister-Kurster periodogram and its comparison with Monte Carlo results.
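For context, the classical Lomb-Scargle periodogram that the Zechmeister-Kurster method generalizes (by adding measurement weights and a floating mean) can be written down directly. This is a serial reference sketch, not the GPU code; y is assumed mean-subtracted, as the classical formulation requires.

```python
import math

def lomb_scargle(t, y, freqs):
    """Classical Lomb-Scargle periodogram for unevenly sampled data.
    t: sample times, y: mean-subtracted values,
    freqs: angular frequencies at which to evaluate the power."""
    power = []
    for w in freqs:
        # The time offset tau makes the sine and cosine terms orthogonal.
        tau = math.atan2(sum(math.sin(2 * w * ti) for ti in t),
                         sum(math.cos(2 * w * ti) for ti in t)) / (2 * w)
        c = [math.cos(w * (ti - tau)) for ti in t]
        s = [math.sin(w * (ti - tau)) for ti in t]
        yc = sum(yi * ci for yi, ci in zip(y, c))
        ys = sum(yi * si for yi, si in zip(y, s))
        cc = sum(ci * ci for ci in c)
        ss = sum(si * si for si in s)
        power.append(0.5 * (yc * yc / cc + ys * ys / ss))
    return power
```

Each frequency is evaluated independently, which is exactly why the periodogram parallelizes so well across GPU threads: one thread (or block) per trial frequency.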
20 Russell Spencer University of Guelph Physics
Title: Dynamical simulation of disordered micelles in a diblock copolymer melt with fluctuations
Author: Russell Spencer
Abstract: By including composition fluctuations in our dynamical simulation of the time-dependent Landau-Brazovskii model for a diblock copolymer melt, we find that disordered micelles form above the order-disorder transition to a BCC phase. At high temperatures, the micelle number density is effectively zero, and the melt is disordered at the molecular level. As we lower the temperature, the micelle number density increases gradually and approaches the number density in the BCC phase. If we increase the strength of the fluctuations, the temperature range over which disordered micelles exist broadens, and the onset of BCC order is suppressed. We examine the dynamics of crystallization of disordered micelles into the BCC phase. By tracking trajectories, we also investigate the dynamical behaviour of individual micelles in an environment of disordered micelles.
21 Graham Taylor University of Guelph School of Engineering
Title: Representation Learning for the Analysis of Human Pose and Activity
Author: Graham Taylor
Abstract: The pervasiveness of computing has resulted in the production and storage of more data than ever before. Machine learning seeks to transform this deluge of data into intelligent systems that identify patterns and make decisions. It has revolutionized fields as diverse as computer vision, computational neuroscience, biology and the social sciences. But when faced with data that is increasingly complex, how does a machine know which parts are relevant? How does it structure the millions of components into organized units on which it can base decisions? In this talk, I will discuss my work to confront this challenge, developing biologically-inspired algorithms that learn increasingly abstract layers of representation without human guidance. I will give examples of machines that learn to "see" people and understand the subtleties of human movement. I will demonstrate a model, trained on motion capture data, that is used by animators to synthesize human-like movement for film and video games. My talk will also discuss crowdsourcing, in the context of a recent social computing project. Here, people around the world helped a machine learn to recognize human poses by imitating a music video. These examples give a glimpse of the potential of machine learning to effect change in our work and personal lives.
22 Mark Wachowiak Nipissing University Computer Science and Mathematics
Title: Visual Analysis of Population-Based Stochastic Optimization
Author: Mark Wachowiak
Abstract: Particle swarm optimization (PSO) is a common stochastic, population-based global optimization method that simulates social behaviour observed in nature. PSO is also an inherently parallel method that can benefit from high-performance hardware and accelerators. In the current work, the behaviours and dynamic trends of particle populations are observed with multidimensional visualization techniques to uncover patterns and to better understand the search process. Determining relationships between parameter value changes and population behaviours provides deeper insight into the optimization and an opportunity to identify how the search may be improved. This is particularly useful with dynamic cost functions, in which the peaks (minima) change over time. Visualizing the swarms on the dynamic "moving peaks" problem reveals how these swarms not only locate the peak, but also track its motion. With complex cost functions of very high dimensionality, aggregating parameters, dimension reduction, and interactive visualization methods allow many phenomena and correlations to be clearly displayed. These visualizations provide a framework for representing multidimensional global optimization searches to simplify the process of finding relationships, and for gaining insight into the behaviour of different PSO variations and adaptations. The framework also facilitates rigorous mathematical analysis and improvements in difficult, high-dimensional dynamic optimization applications. Additionally, high-performance visualization toolkits (e.g. ParaView) may be used to provide online feedback for difficult, real-time optimization problems.
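The PSO update that generates the particle trajectories being visualized can be sketched as follows. This is a minimal global-best variant with illustrative parameter values and a simple static test function, not the code, parameters, or dynamic cost functions used in the talk:

```python
import numpy as np

def pso(cost, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal global-best particle swarm optimizer (minimization)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))       # particle positions
    v = np.zeros_like(x)                              # particle velocities
    pbest = x.copy()                                  # personal bests
    pbest_val = np.apply_along_axis(cost, 1, x)
    gbest = pbest[np.argmin(pbest_val)].copy()        # global best
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Inertia + attraction to personal best + attraction to global best
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        val = np.apply_along_axis(cost, 1, x)
        better = val < pbest_val
        pbest[better], pbest_val[better] = x[better], val[better]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

# Sphere function: minimum 0 at the origin
best_x, best_f = pso(lambda p: np.sum(p**2), dim=5)
```

Logging `x` at every iteration instead of discarding it yields exactly the kind of population trajectory data that the multidimensional visualization techniques in the talk operate on.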
23 Sara Bandehbahman University of Windsor Computer Science
Title: Investigating the Criteria Affecting Sympatric Speciation Using EcoSim
Author: Sara Bandehbahman
Abstract: Current research in evolutionary biology focused on sympatric speciation faces the obstacle of unifying empirical studies with proposed theoretical investigations. Empirical studies have most thoroughly characterized the phenomenon on the basis of disruptive selection due to preferential resource use. Using EcoSim, an evolving individual-based predator-prey behavioural model, we ran a series of computer simulations with a dual-resource system. The behavioural model of each individual is coded in its genome, dictating its actions in the simulation, and is modelled by a Fuzzy Cognitive Map. With about one billion individuals generated and one million speciation events for each run, EcoSim allows us to observe and characterize a very large sample. In this system, we are able to model the effects of behavioural patterns on speciation as a continuous evolutionary process. We investigated whether the selective pressures acting on foraging behaviours drove sympatric speciation. We observed behavioural changes occurring as a consequence of preferential resource use, as well as the fulfillment of the sympatric speciation criteria dictated in the literature. These results, obtained through the use of machine learning tools and dedicated data visualization methods, show promise in elucidating patterns shared by species that have undergone sympatric speciation, which may later be used to structure empirical studies.
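As an illustration of the Fuzzy Cognitive Map mechanism mentioned above, one activation update squashes a weighted sum of concept values into the unit interval and iterates. The three-concept weight matrix below is hypothetical and bears no relation to EcoSim's actual genomes:

```python
import numpy as np

def fcm_step(state, W):
    """One activation update of a fuzzy cognitive map: each concept's new
    value is a logistic squashing of the weighted sum of all concepts."""
    return 1.0 / (1.0 + np.exp(-(W @ state)))

# Hypothetical 3-concept map; entry W[i, j] is the causal weight of concept j on concept i
W = np.array([[ 0.0, 0.6, -0.4],
              [ 0.5, 0.0,  0.3],
              [-0.2, 0.7,  0.0]])
state = np.array([0.5, 0.1, 0.9])   # initial concept activations
for _ in range(20):                  # iterate toward a fixed point
    state = fcm_step(state, W)
```

In a setting like EcoSim's, the weights would be encoded in an individual's genome and the converged activations would drive its chosen action, so mutations to the weights change behaviour heritably.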
24 Stefon van Noordt Brock University Psychology
Title: Examining the Neural Correlates of Controlled Attention Using High Performance Computing
Author: Stefon van Noordt
Abstract: Traditional approaches to processing and analyzing event-related potentials (ERPs) are limited and favour data reduction. Although computationally demanding, innovative signal processing techniques and high-level statistical analyses, such as automated pre-processing, independent components analysis (ICA), and bootstrapping of single-subject data sets, can be implemented to better understand the neural underpinnings of human behaviour. One such example is research on the human anterior cingulate cortex (ACC), which has been studied in several paradigms and whose core function has been variably attributed (depending on the paradigm used) to error processing, conflict monitoring, reinforcement/associative learning, expectancy deviation, and inhibitory control. In order to test a simpler model of controlled attention underlying all these effects, we used a novel task-switching paradigm to elicit ACC activity to stimuli signaling changes in response context, and implemented advanced signal processing and analytical techniques on SHARCNET. Our evidence suggests a unifying function involving the modulation of controlled attention. Discovering and demonstrating this latent factor clarifying the neural correlates of controlled attention required the use of HPC clusters such as SHARCNET, which allow researchers to investigate more nuanced models of brain function.
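As an illustration of one of the techniques named above, bootstrapping single-subject data can be sketched as a percentile bootstrap over trials: resample epochs with replacement and compute a confidence band around the mean waveform. The synthetic epochs below are invented for demonstration and do not represent the study's EEG data:

```python
import numpy as np

def bootstrap_ci(trials, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence band for the mean waveform.
    trials: (n_trials, n_timepoints) array of single-trial epochs."""
    rng = np.random.default_rng(seed)
    n = trials.shape[0]
    means = np.empty((n_boot, trials.shape[1]))
    for b in range(n_boot):
        idx = rng.integers(0, n, n)          # resample trials with replacement
        means[b] = trials[idx].mean(axis=0)
    lo = np.percentile(means, 100 * alpha / 2, axis=0)
    hi = np.percentile(means, 100 * (1 - alpha / 2), axis=0)
    return lo, hi

# Synthetic single-subject epochs: a deflection around sample 50 plus trial noise
rng = np.random.default_rng(1)
t = np.arange(100)
signal = np.exp(-0.5 * ((t - 50) / 5.0) ** 2)
trials = signal + 0.5 * rng.standard_normal((80, 100))
lo, hi = bootstrap_ci(trials)
```

Time points where the band excludes zero indicate a reliable deflection within that single subject; repeating this per subject and per condition is the computationally demanding step that motivates running on a cluster.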


© 2013 Shared Hierarchical Academic Research Computing Network (www.sharcnet.ca).