Experimental modelling and analysis of thunderstorm downbursts

Graphene is a single layer of carbon atoms arranged in a honeycomb lattice. While the graphite in our pencil lead is formed by millions of layers of graphene and possesses electrical properties relatively similar to metals, the electrical properties of an isolated graphene layer exhibit a number of new phenomena related to the two-dimensional nature of this material. Some of these properties are also retained, to a certain extent, in bi-layer or tri-layer arrangements of graphene sheets. The discovery of graphene was awarded the 2010 Nobel Prize in Physics, and a number of applications in electronics are now being proposed for this material, including thin-film transistors and sensors for detecting the presence of small amounts of gases and liquids. The fact that graphene is formed by carbon, with an atomistic structure extremely similar to pyrolytic graphite, suggests that this material can be extremely biocompatible. This will open up the possibility to introduce graphene-based electronic devices in the body and use them for a number of medical diagnostic applications that are precluded to the usual electronic materials, such as silicon. Although several proteins and forms of nucleic acids have been shown to adhere on the graphene surface, the physical mechanism by which such adhesion occurs is still unknown. Preliminary investigations from the principal investigator of this project have demonstrated that adhesion of nucleic acids to graphene can lead to their solubilization in water and the formation of nanocomposites (Sharifi, Bauld, Ahmed and Fanchini, 2011). The biocompatibility of graphene will be examined theoretically by calculation of its interaction with proteins and other molecules participating in the functioning of the human body, such as RNA or DNA. The modeling will be performed on an atomistic level using a first-principles approach. The interaction between surface states of graphene and chemically active amino groups of proteins (often negatively or positively charged, as controlled by the protonation mechanism), or the so-called 'sticky ends' in the case of DNA/RNA, will be examined. In particular, the adsorption of those separated groups on the graphene surface will be probed in terms of bonding, charge exchange and long-range electrostatic interactions.
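In practice, a first-principles study of this kind compares the total energies of the adsorbed and isolated systems. As a minimal illustration only (not the group's actual workflow), the adsorption energy could be tabulated as below once total energies are available from a first-principles code; the function name and all numbers are placeholders.

    # Minimal sketch: adsorption (binding) energy bookkeeping for a molecule on
    # graphene. Assumes total energies (eV) computed elsewhere by a
    # first-principles code; all values below are illustrative placeholders.
    def adsorption_energy(e_complex, e_graphene, e_molecule):
        """E_ads = E(molecule on graphene) - E(graphene) - E(molecule).
        Negative values indicate favourable adhesion."""
        return e_complex - (e_graphene + e_molecule)

    e_ads = adsorption_energy(e_complex=-1234.56,   # combined system (placeholder)
                              e_graphene=-1100.12,  # isolated graphene sheet (placeholder)
                              e_molecule=-133.89)   # isolated biomolecule fragment (placeholder)
    print(f"E_ads = {e_ads:.2f} eV")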
In this work the student will combine experimental methods in the laboratory and theoretical studies, with supervision from an expert in computer simulations of biological systems, in order to understand the adhesion mechanisms of nucleic acids and proteins on graphene. Specific properties of the graphene surface, such as roughness, curvature, and the addition of a second (or third) layer of graphene, will be explored with the ultimate goal of determining the ideal conditions for such adhesion and how adhesion of biological systems affects the electrical properties of graphene.

The student expected for this project will have a solid background in Physics, Engineering Physics, Materials Science or Chemistry. The student will also possess good computer programming skills (e.g. Matlab, Labview or another language for technical programming). At least one course in Quantum Mechanics or Quantum Chemistry is expected, as well as one or more courses with laboratory components. Additional courses in Solid State Physics, Solid State Chemistry and/or Condensed Matter Physics are desirable but not essential.

The role of the student will be to assess the quality of plasmonic solar cells assembled on transparent graphene layers using state-of-the-art nanoscale imaging tools existing in Fanchini's Lab. The project will span 10-12 weeks, with the following timeline:
Weeks 1-2: Literature review; training with the SNOM system.
Weeks 3-4: Development of the program for interfacing the SNOM system with a sourcemeter for solar cell characterization (see the sketch below).
Weeks 5-6: Characterization of the first set of solar cells.
Weeks 7-8: Data interpretation; project report.
Weeks 9-12: "Feedback phase" and characterization of advanced solar cells.

Adhesion of Proteins and Nucleic Acids on Graphene

Solar cells that have been commercialized to date can be loosely divided into two categories: i) high-cost, high-efficiency photovoltaics, prepared from inorganic crystalline materials, and ii) low-cost, low-efficiency photovoltaics, prepared from more cost-effective materials. Increasing the quantity of light that can be conveyed inside a solar cell is vital to improve the amount of photocurrent generated, the efficiency and, ultimately, the competitiveness of solar energy with respect to other, non-renewable sources of energy. Graphene, a single layer of carbon atoms whose discovery was awarded the 2010 Nobel Prize in Physics, is an excellent candidate to make solar cells cheaper. It can be used as the transparent electrode placed in front of solar devices as an alternative to transparent conducting oxides that require rare chemical elements, such as indium. However, it is vital to improve the efficiency of graphene-based solar cells by extending their ability to convert light into an electrical current to the entire spectrum of wavelengths of light emitted by the sun. Photovoltaic materials are generally weakly absorbing in the infrared range of the solar spectrum, and a significant improvement in performance can be obtained by using metallic nanoparticles that are able to reflect light in such a spectral region, while offering negligible light absorption at shorter wavelengths, in the visible range. Typically, metallic nanoparticles produce a local enhancement of the electric field associated with scattered light, which enhances the amount of light that can be conveyed to specific regions in the proximity of the particle.
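The Weeks 3-4 interfacing task in the timeline above (driving a sourcemeter while the SNOM illuminates a sub-100 nm region) could be prototyped roughly as follows. This is only a sketch, assuming a PyVISA-compatible sourcemeter; the VISA address and SCPI commands are placeholders that would have to be adapted to the actual instrument in the lab.

    # Minimal local I-V sweep sketch (assumptions: a PyVISA-compatible
    # sourcemeter; the address and SCPI strings below are placeholders).
    import numpy as np
    import pyvisa

    rm = pyvisa.ResourceManager()
    smu = rm.open_resource("GPIB0::24::INSTR")      # placeholder VISA address

    voltages = np.linspace(-0.2, 0.8, 51)           # bias sweep (V)
    currents = []
    for v in voltages:
        smu.write(f":SOUR:VOLT {v:.3f}")            # set bias (placeholder SCPI command)
        currents.append(float(smu.query(":MEAS:CURR?")))   # read local photocurrent

    # Store the local I-V curve for correlation with the SNOM tip position.
    np.savetxt("iv_curve.csv", np.column_stack([voltages, currents]),
               delimiter=",", header="V (V), I (A)")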
Specifically, efforts in our group concentrate on plasmonic solar cells incorporating copper (Cu) nanoparticles on graphene, which maximizes the amount of light that is scattered inside the active layer of the cell. These devices are partly assembled by an industrial partner collaborating with us and partly by students working in our group. In this project, the student will use the Scanning Near-Field Optical Microscope (SNOM) imaging techniques available in our laboratory in order to illuminate regions below 100 nm in diameter on the active layer of a graphene-based plasmonic solar cell. During illumination, the electrical current locally generated by the cell will be measured and correlated with the local morphology of the active layer in the proximity of the copper nanoparticles and graphene layers incorporated in it. The ultimate goal of this characterization is to locally 'map' the performance of such an innovative type of solar cell and optimize it. In addition to the day-to-day activities of the project, it is expected that the student will participate in the laboratory meetings of our group as well as specific three-way meetings involving the student, her/his supervisor and a representative of the company involved in this project. The objective of these meetings will be to discuss the results of the project carried out by the student, as well as to expose him/her to an industrial environment and technology transfer activities.

A background in Cell Biology and/or Molecular Biology and/or Biochemistry with some knowledge of Virology is preferred.

The student will be trained by the principal investigator and a highly dynamic team including one research technician, a postdoctoral fellow and graduate students. The student will directly participate in the research design, experimental procedures and data analysis. Experiments will include basic molecular biology, cell biology and sophisticated immunofluorescence experiments on state-of-the-art confocal laser scanning microscopes. Students will also directly test novel inhibitors for their effect on HIV-1 replication. The latter experiments will allow students to comprehend drug design. The student will be expected to present their work at weekly lab meetings in an informal lab discussion setting.

Giovanni Fanchini, Physics & Astronomy
Nanoscale Imaging of Plasmonic Solar Cells on Transparent Layers of Graphene

The AIDS pandemic has resulted in over 25 million deaths worldwide and there are almost 3 million new HIV-1 infections each year. Currently, the majority of treatment regimens rely on anti-retroviral therapies that target the few HIV-1 proteins that possess intrinsic enzyme activity. Unfortunately, the high mutation rate and immune evasive capabilities of HIV-1 diminish the therapeutic effectiveness of these strategies and underscore the need for innovative approaches that will identify novel therapeutic targets. The HIV-1 protein Nef, which is required for the onset of AIDS following HIV-1 infection, has long been overlooked as a potential therapeutic target because it lacks intrinsic enzyme activity. Nef reprograms infected cells to support HIV-1 replication and escape immune surveillance by assembling a multi-kinase complex consisting of multiple cellular proteins, including Src Family Kinases (SFKs). Indeed, studies in transgenic mice demonstrated that Nef must bind SFKs to cause an AIDS-like disease. Thus, although Nef lacks intrinsic enzymatic activity, it recruits host cell enzymes to drive disease.
Taken together, these findings suggest that Nef may be a novel target for the development of HIV-1 therapeutics. Recently, we determined that a small chemical probe selectively interferes with the binding of Nef to SFKs and blocks Nef-mediated immune evasion, providing a proof-of-concept that Nef may be a target for novel HIV-1 therapeutics. However, the narrow therapeutic window of this probe precludes its development as a potential drug. Therefore, we conducted a small-scale computer-based virtual chemical screen to identify drug-like compounds that may inhibit the Nef-SFK interaction. This initial screen yielded two compounds that block Nef-mediated signaling complex formation and HIV-1 replication, thereby demonstrating the feasibility of a large-scale virtual drug screen to identify novel HIV-1 therapeutics. In this proposal, we will identify novel drug-like compounds that block the Nef-SFK interaction by performing a large-scale virtual screen of a small molecule library that was designed to include only drug-like molecules. Our preliminary data demonstrate the feasibility of identifying drug-like Nef-SFK inhibitors from this library that are capable of disrupting Nef-dependent pathways. Using these initially identified molecules as templates, we expect to identify highly potent drug-like molecules that inhibit the Nef-SFK interaction and therefore block the ability of Nef to drive HIV-1 disease.

Familiarity with social and environmental planning issues, and qualitative and quantitative data analysis. Preferably the student should be pursuing a degree in a social science field with an interest in the intersections of environment and development issues, and urban and rural interfaces. Ideally the student would also have an interest in issues of food system sustainability and resilience.

The student will assist with literature review, and identification and analysis of relevant policy documents and statistical information. He/she may also assist with identifying key stakeholders who could be contacted by the research team for interviews.

Russell Thompson
Field Theoretic Predictions of Self-Assembly for Nanotechnology

Self-assembly describes the ability of some molecules to arrange themselves into patterns spontaneously. The nanotechnology potential of self-assembly is enormous. In our group, we are using field-theoretic methods to predict the nanoscale structures that molecules will form based on molecular architectures. Specifically, we're interested in amphiphilic molecules, such as phospholipids, that are ubiquitous in biology, forming cell walls, vesicles and membranes. Such systems can't be easily dealt with analytically, so we take a numerical approach, developing algorithms and computational methods to solve these complex, soft matter systems.

The Globalink student should have a background in physics, engineering, materials science or a comparable quantitative field. It is advantageous for the student to have some experience of statistical mechanics. Elementary calculus is mandatory; some knowledge of differential equations is helpful. The student should have some computational experience, and familiarity with either Matlab, C, or Fortran is an advantage.

The Globalink student in our group will have the opportunity to become familiar with field-theoretic methods in complex, soft matter systems.
The student will learn about both the analytical and computational methods we employ and will be responsible for numerically exploring possible nanometre length-scale morphologies.

En-hui
Information-Theoretic Research of Video Coding: Theory and Algorithms

Video communications is a major area of growth in engineering today and in the future. Just look at the growing popularity of Internet TV, Internet video conferencing, and mobile video. A key problem in video communications, however, is the conflict between the growing data volume of video and the increasing shortage of bandwidth. This is particularly true in wireless video communications. Indeed, with the growing popularity of wireless video and the proliferation of high-definition TV and other emerging new video applications, the bandwidth shortage becomes increasingly severe, to the extent that it is regarded as an issue meriting our national attention. As such, there is strong demand for better video coding technologies which can improve compression efficiency significantly while providing other green features such as low complexity (hence low encoding and decoding power consumption) and scalability. Existing video coding standards are all based on the framework of predictive DCT-based video coding. Compression performance gains of these standards over their predecessors come largely from the improvements in their signal processing components---prediction/compensation and reconstruction filtering. Although one cannot rule out the possibility of further gains from this direction, it is certain that such gains, if any, are diminishing for given frame sizes and would come at extremely high computational complexity. Therefore, it is imperative to explore new directions and develop game-changing approaches to future generations of video coding. Building on our early success in both lossless and lossy compression and in developing information-theoretic (IT) ideas and algorithms for image and video coding, in this research we will investigate and explore game-changing approaches to video coding systematically from the IT point of view by proposing new advanced video coding frameworks, developing their respective information theory, and designing new efficient video coding algorithms.

Strong analytical capability; good background in probability theory, mathematical analysis, and/or algorithms and data structures; excellent programming skills in C or C++.

After getting familiar with the popular video coding standard, H.264, students will be trained in the newest video coding standard, HEVC, and introduced to the information-theoretic research of video coding. In the later part of the project, students will be asked to experiment with some new algorithmic ideas on top of HEVC.

Mobile Video: Opportunities, Bottlenecks, and Green Solutions

The convergence of communications and computing, in particular mobile communications and computing, is happening at a speed much faster than most of us have anticipated. In 2011, the global mobile data traffic was over eight (8) times the entire global Internet traffic in 2000. With the growth of smart phones, tablets, and content sharing, it is estimated, according to the forecast from Cisco VNI Mobile 2012, that by 2016 the global mobile data traffic will grow to 10.8 exabytes per month, and over 70% of mobile data will be video. However, opportunities always come with challenges.
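To make the predictive, DCT-based coding framework described above concrete, the following sketch codes a single block: predict it from the co-located block of the previous frame, transform the residual with a 2D DCT, and quantize. It is an illustration only (real H.264/HEVC codecs use integer transforms, motion search and entropy coding) and assumes NumPy and SciPy; the quantization step is arbitrary.

    # Minimal sketch of predictive DCT-based coding of one 8x8 block
    # (illustrative only, not the H.264/HEVC pipeline).
    import numpy as np
    from scipy.fft import dctn, idctn

    rng = np.random.default_rng(0)
    prev_block = rng.integers(0, 256, (8, 8)).astype(float)   # co-located block, previous frame
    curr_block = prev_block + rng.normal(0, 4, (8, 8))         # current frame, small temporal change

    residual = curr_block - prev_block                         # temporal prediction residual
    coeffs = dctn(residual, norm="ortho")                      # 2D DCT of the residual

    q_step = 8.0                                               # illustrative flat quantization step
    q_coeffs = np.round(coeffs / q_step)                       # quantization (the lossy step)

    recon = prev_block + idctn(q_coeffs * q_step, norm="ortho")  # decoder-side reconstruction
    mse = np.mean((curr_block - recon) ** 2)
    print(f"non-zero coefficients: {int(np.count_nonzero(q_coeffs))}, reconstruction MSE: {mse:.2f}")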
In this research, we will take a unique perspective to look at the bottlenecks for mobile video---power and bandwidth---and investigate their green solutions.

After getting trained in video codecs, students will be asked to research bottlenecks for various mobile video applications and possibly suggest respective solutions.

Corrigan, Western University - London
Organometallic Materials: Preparation and Structural Characterization of Ferrocenated Nanoclusters

The assembly of multiple di(η5-cyclopentadienyl)iron ('ferrocene') fragments onto well-defined molecular architectures is being pursued for potential applications of such materials in areas encompassing electro-catalysis, anion recognition and the preparation of specialty electrodes. Work in our laboratories has shown recently that, using a suitably designed ferrocenyl reagent (FcCH2SSiMe3; Fc = ferrocenyl), it is possible to use a monodisperse Ag2S architecture for the tethering of multiple ferrocenyl groups. For example, single crystal X-ray diffraction illustrates that in the nanocluster [Ag48(μ4-S)6(μ2/3-SCH2Fc)36], 36 ferrocenyl units form a closely packed shell on the surface of a Ag48S42 core with overall molecular dimensions of ~3.0 x 3.0 x 2.2 nm3.[1] The size of the nanoclusters, and their surface-to-volume ratio, can be tailored by controlling the amount of interstitial sulfide (S2-) ligands incorporated into the core during synthesis. In this project, you will prepare and develop the reaction chemistry of a new ferrocene-based reagent with Ag(I) to modify nanocluster surfaces with anion-sensitive ferrocenyl moieties. [1] S. Ahmar, D. G. MacDonald, N. Vijayaratnam, T. L. Battista, M. S. Workentin, J. F. Corrigan, Angew. Chem. Int. Ed., 2010, 49, 4422-4424. A Nanoscopic Polyferrocenyl 3-D Assembly: Preparation of the Structurally Characterized Triacontakaihexa(ferrocenylmethylthiolate) Cluster [Ag48(μ4-S)6(μ2/3-SCH2Fc)36].

All synthetic skills (inert atmosphere) training will be provided. Interested students should have completed basic courses in inorganic chemistry (main group and/or transition metal chemistry).

Working closely with a PhD student on this project, you will gain extensive experience in air-sensitive inorganic synthesis, single crystal X-ray diffraction and NMR spectroscopy (1H, 13C). The project will involve the preparation of a new ferrocene-based reagent, including its complete spectroscopic characterization. You will then use the reagent for the assembly of nanocluster complexes with a redox-active, tailored surface.

de Bruyn
Material properties near yielding

Materials such as gels behave as soft solids when subjected to a weak stress, but will flow like a liquid when the stress exceeds a value called the yield stress. This project will involve studying the behavior of gels close to the yield point. Shear rheometry will be used to measure the viscous and elastic properties of the gels as the yield stress is approached. Of particular interest are the time it takes for the material to relax to a steady state following a change in stress, and the relative importance of viscous and elastic effects close to the yield point. Microscopy will be used in an effort to visualize the changes in microscopic structure that occur when the material yields. The goal is to develop an improved understanding of the physical processes that hold the gel together below the yield stress, and that cause the gel to yield at the yield stress.

Some background in physics or materials science or fluid dynamics.
Some familiarity with standard computing tools such as Excel or Matlab. Basic experimental skills, including the ability to learn how to safely and properly use sophisticated equipment. Curiosity and initiative.

The student will be responsible for carrying out all of the experimental research on this project, under the supervision of Dr. de Bruyn. The student will prepare samples of the materials to be studied and, after being instructed in the use of lab equipment, will perform all rheological and microscopic experiments related to this project, as well as other experiments that might seem necessary. The student will analyze the data obtained. This may involve writing computer software for this purpose. The student will report on progress at regular meetings with Dr. de Bruyn and at research team meetings.

Jimmy D. Dikeakos
Novel AIDS therapeutics targeting the Nef-SFK interaction

Rapid urbanization and industrialization in China have exacerbated food insecurity through loss of agricultural land and of farmers and through urban poverty, especially among migrant workers and the urban poor. Often, food insecurity is more a distribution problem than a production problem, but China faces both. Furthermore, China's industrial food system is highly vulnerable to environmental and economic shocks. These challenges call for combined planning approaches that link food security, food safety, and food system resilience, particularly the capacity to deal with shocks of various kinds. In China there is urgent demand to develop a low-carbon modern agricultural model that also ensures safe food. This research asks: what are the principal components of an urban food system that contribute to its resilience? We focus on the following research objectives: 1. Develop and test an urban food system resilience assessment framework; 2. Develop a dynamic scenario planning model of the urban food system; and 3. Using the model, formulate a food system strategy and plan to maximize resiliency to a range of key stressors. The city of Nanjing in China will be used as a case study. The methodology is as follows: 1. Assessment Framework: (a) Adapt frameworks used in Canadian cities for implementation in Nanjing. Test the framework on site and use the outcomes to refine the instrument. (b) Collect/analyze/assess existing quantitative and qualitative data on the Nanjing food system. (c) Fill additional data gaps through primary research. (d) Identify key vulnerabilities and resilient characteristics of the Nanjing food system. (e) Refine the assessment framework for use in other contexts. 2. Dynamic Scenario Planning Model: (a) Use results from the assessment (key vulnerability and resilience variables) to create an interactive map of the Nanjing food system. (b) Create a model using Geographic Information Systems. (c) Using the model, develop scenarios to test the impact of manipulating key variables on outcomes of vulnerability and resilience. 3. Food System Strategy and Plan: (a) Develop a common vision of a resilient food system for Nanjing through extensive stakeholder consultation and dissemination of research results. (b) Use the GIS model to map and plan strategic actions required to shift the current food system towards the common vision. (c) Make the model methodology and software available for use in other contexts. Anticipated Significance and Impacts: 1. Integrated Planning: Novel and methodologically comprehensive approach to including food systems in urban planning strategies; 2.
Social Justice: Ability to identify communities that are most vulnerable to food insecurity, and target remediation strategies accordingly; 3. Evidence and Process-Based Decision Making: Ability to make strategic planning decisions on the basis of quantitative and qualitative data and stakeholder inputs; 4. Replicable Methodology: Tools (assessment framework and model) transferable to other contexts; 5. Knowledge Mobilization: Results are of direct relevance to a wide range of actors, including academics and researchers, planners, NGOs, business, government and development agencies.

The student should have a background in programming in Java, and a basic understanding of algorithms and data structures. Additional knowledge of hardware description languages would be beneficial. Programming experience is suggested.

The student will build on top of an existing synthesis framework called synASM, and implement a synthesis algorithm that maps and schedules accesses to abstract data types such as lists, sets, and queues when generating the corresponding hardware.

Steffanie, Geography & Environmental Management
Planning Resilient Urban Food Systems: Pilot Study in Nanjing, China

The research proposal investigates the challenge of automatic memory scheduling of abstract data structures during the synthesis of hardware circuits from algorithmic specifications to FPGAs. There are two main issues with modern high-level synthesis (HLS) methodologies. The first issue is that arrays are the only data structures supported for synthesis by most HLS frameworks. This forces designers to describe their state using arrays that synthesize to contiguous memory after synthesis. Note that this forces an ordering on the memory, which might be unnecessary for the particular specification, limiting any potential opportunities for memory partitioning for improved performance results. Consequently, this research proposal investigates the use of set data structures and their synthesis. Sets do not enforce an ordering, but it is their use that determines an ordering. For example, a set may be used as an array would be. The challenge here is to use static analysis techniques to understand the use of sets, and then provide hints on partitioning the set so as to map it onto FPGA memories. Access to these memories must be scheduled appropriately to meet certain design constraints such as timing and performance. The research will use the synASM framework, a high-level synthesis framework based on the formal model of Abstract State Machines developed at the University of Waterloo by Prof. Hiren D. Patel. This framework supports the specification of data structures such as arrays, multi-dimensional arrays and sets; however, the current synthesis engine only supports the synthesis of arrays. The task is to use analysis techniques or user-guided hints to determine the shape of the data structure, and a symbolic formula indicating its size. The shape provides an intuition of the evolution of the data structure that can be leveraged for memory partitioning to better segment the data structure onto multiple memories. The symbolic formula allows designers to specify parameters at synthesis-time such that the appropriate memory size gets allocated for the data structure. This provides sufficient information to automatically partition, schedule and synthesize the data structure. The researcher will extend synASM with the memory analysis, partitioning and scheduling engine to generate synthesizable VHDL targeting an ALTERA FPGA.
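As a generic illustration of what partitioning and scheduling accesses to a set can mean in hardware terms (this is not the synASM algorithm itself), the sketch below hashes the elements of an unordered set across a fixed number of memory banks and greedily packs accesses into cycles so that no bank is touched twice in the same cycle; the bank count and single-port assumption are arbitrary.

    # Generic illustration (not synASM): partition an unordered set across
    # memory banks and schedule accesses with at most one access per bank per
    # cycle (single-ported memory assumption).
    NUM_BANKS = 4                                   # illustrative number of on-chip memories

    def bank_of(element):
        """Assign an element to a bank; any balanced hash works, since a set
        imposes no ordering on its elements."""
        return sum(str(element).encode()) % NUM_BANKS

    def schedule(accesses):
        """Greedily pack accesses into cycles without bank conflicts."""
        cycles = []                                 # each cycle maps bank -> element
        for elem in accesses:
            b = bank_of(elem)
            for cycle in cycles:
                if b not in cycle:
                    cycle[b] = elem
                    break
            else:
                cycles.append({b: elem})
        return cycles

    for t, cycle in enumerate(schedule(["a", "b", "c", "d", "e", "f"])):
        print(f"cycle {t}: " + ", ".join(f"bank {b} <- {e}" for b, e in sorted(cycle.items())))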
The student is required to have basic laboratory etiquette and skills, including the preparation of chemical solutions and the use of pipettes, a centrifuge and simple-to-operate analytical equipment. The student should also be comfortable using a computer and have good knowledge of organizing data in a spreadsheet, and using a statistical software program to conduct PCA analysis. The student will be using the traditional Ecoplate method to evaluate differences between intercrops and sole crops in the soil microbial community. The preparation of the samples and the analytical instrumentation are simple to use.

The student will be working with soil samples that will have been imported to Canada from Argentina by Dr. Oelbermann's PhD student. The student funded by MITACS will prepare the soil samples for inoculation on Ecoplates and will evaluate changes in carbon substrate utilization on the Ecoplates using a Biotek plate reader over a 7-day period. The actual laboratory work will take approximately 10 days. Once the data from the Ecoplates has been obtained using the Biotek plate reader, the student will organize the data in an Excel spreadsheet to quantify microbial activity [average well color development (AWCD)], microbial richness (R), the Shannon-Weaver index (Hs) to determine diversity, and evenness (activity across all substrates). Principal component analysis (PCA) will be used by the student to determine the extent of differentiation in C source metabolism between the sole crop and intercrop treatments (a sketch of this analysis is given below). If the student is already familiar with statistical software packages, he/she may use this software (if available). Otherwise the student will be using SPSS and will be working under Dr. Oelbermann's close supervision. The student will also have an opportunity to work with my current M.S. and PhD students who are working on other aspects of C and N transformations in intercropping systems. This will expose the student to other analytical techniques in a state-of-the-art international research laboratory.

Hiren Patel
Synthesizing Abstract Data-types to FPGAs

Land-use changes, such as the conversion of undisturbed ecosystems to intense agricultural production systems, have negatively affected the soil organic matter (SOM) pool, resulting in declining soil fertility and the emission of greenhouse gases (GHG). Intercrops, where more than one crop is grown on the same land area at the same time, are sustainable because of their effective use of resources through enhanced microbial activity and tight nutrient cycling compared to sole crops. Intercrops may therefore have a lower requirement for N fertilizers, are more effective in sequestering C, and show greater resiliency to climate change due to their greater structural complexity. It is imperative to shift our research focus to produce food using agroecosystem management practices that maximize the sequestration of C and minimize the emission of GHGs. A major literature gap currently exists on the processes driving C and N transformations, and no information is available on changes in soil microbial communities, which control C and N dynamics. Soil microbial community structure will be studied using a randomized complete block design (RCBD) with four treatments and three replicates per treatment: maize (Zea mays L.) sole crop; soybean [Glycine max L. (Merr.)] sole crop; 1:2 intercrop (one row of maize and two rows of soybeans); and 2:3 intercrop (two rows of maize and three rows of soybeans).
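The Ecoplate analysis described above (AWCD, richness, Shannon-Weaver diversity, evenness and PCA) can be prototyped along the following lines. This is a minimal sketch assuming the plate-reader absorbances have already been blank-corrected and exported as a samples-by-substrates array; the synthetic data, the 0.25 activity threshold and the use of scikit-learn rather than SPSS are illustrative assumptions only.

    # Minimal sketch of the Ecoplate indices and PCA (synthetic data).
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    # rows = samples (sole crop / intercrop replicates), columns = 31 C substrates
    absorbance = np.clip(rng.normal(0.6, 0.4, size=(12, 31)), 0, None)

    awcd = absorbance.mean(axis=1)                      # average well colour development
    active = absorbance > 0.25                          # activity threshold (assumption)
    richness = active.sum(axis=1)                       # R: number of substrates utilized

    p = absorbance / absorbance.sum(axis=1, keepdims=True)
    with np.errstate(divide="ignore", invalid="ignore"):
        shannon = -np.nansum(np.where(p > 0, p * np.log(p), 0.0), axis=1)   # Hs
    evenness = shannon / np.log(richness)               # Pielou-type evenness

    scores = PCA(n_components=2).fit_transform(absorbance)   # ordination of substrate-use patterns
    for i in range(3):
        print(f"sample {i}: AWCD={awcd[i]:.2f}, R={richness[i]}, Hs={shannon[i]:.2f}, "
              f"E={evenness[i]:.2f}, PC1={scores[i, 0]:.2f}")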
The site is located in the rolling Pampa, Balcarce (37°S, 58°W), Argentina, and the soil required for this study will be collected in May 2013, prior to the student's arrival in Canada to conduct this analysis. Our initial samples (first year of intercrop establishment) also included the analysis of soil chemical [pH, SOC, soil total N, SOC partitioning, soil light fraction (LF) and δ13C and δ15N of the whole soil and the LF, NH4+, NO3-], physical (bulk density) and biological [soil microbial biomass (SMB)-C, N, δ13C-SMB] characteristics and crop residue input, which were evaluated one year after site establishment in order to provide baseline information on each treatment. Since then, GHG, SOC and soil total N concentrations and stocks, bulk density and crop residue input were quantified annually. However, to evaluate long-term changes in soil quality between sole crops and intercrops, additional samples in the final year of this study (year 7) will be evaluated. These characteristics include SMB, soil LF and SOC partitioning, and soil microbial community structure. The analysis of these soil characteristics will follow standard and internationally acceptable protocols. In order to understand C and N transformations between sole crops and intercrops, an evaluation of differences in the structure of the soil community is essential. We currently do not have this information from this long-term research site.

The student needs strong skills in statistical analysis, experimental design, and communications. An interest in economics, decision theory, risk analysis, construction engineering, and management is desirable.

The student would formulate and conduct a simple initial experiment, analyze the results, and write a report about the significance of the results, what questions are raised, and future research that is needed to address those questions.

Andrew Kennings
Large scale placement of ASICs and FPGAs

Modern circuits are physically implemented via ASICs, structured ASICs or FPGAs. Placement is an integral step in the physical implementation of integrated circuits regardless of the final implementation technology. The placement of an integrated circuit has a significant impact on the resulting area, performance and power consumption of the integrated circuit. Both performance and power consumption are significant design issues as the world strives to be more "green" while still achieving high performance. Of course, the runtimes required to perform placement can be extremely large, which is a consequence of the large size of modern circuit designs; e.g., recent ASIC benchmark circuits released by IBM have on the order of millions of placeable objects. The placement problem is typically solved using a variety of sophisticated software algorithms. These algorithms must be capable of producing a good quality of result while being implemented as efficiently as possible to reduce the amount of runtime required. The purpose of this proposed research is to investigate existing placement algorithms and to develop new placement algorithms capable of efficiently handling the complexities of modern placement problems. This will include, for example, investigation of parallel techniques which can exploit modern multi-core architectures in order to reduce placement runtime. Ways to improve placement algorithms to better address modern issues such as routability, timing, design hierarchy and datapath placement will be considered.
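The usual quality metric that placement algorithms of this kind minimize is half-perimeter wirelength (HPWL): for each net, the width plus the height of the bounding box of its pins. The sketch below evaluates it for a toy placement; the cells, coordinates and netlist are made up for illustration.

    # Minimal sketch: half-perimeter wirelength (HPWL) of a placement.
    def hpwl(placement, nets):
        """placement: cell name -> (x, y); nets: list of lists of cell names."""
        total = 0.0
        for net in nets:
            xs = [placement[c][0] for c in net]
            ys = [placement[c][1] for c in net]
            total += (max(xs) - min(xs)) + (max(ys) - min(ys))
        return total

    placement = {"a": (0.0, 0.0), "b": (3.0, 1.0), "c": (1.0, 4.0), "d": (2.0, 2.0)}
    nets = [["a", "b", "d"], ["b", "c"], ["a", "c", "d"]]
    print(f"HPWL = {hpwl(placement, nets):.1f}")

Practical placers minimize a smooth approximation of this sum subject to density constraints, which is where the routability, timing and parallelization questions mentioned above arise.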
This project should be of interest to those students interested in learning more about the software and algorithms used for the Computer-Aided Design of integrated circuits; more specifically, the development of automatic tools for performing one step of the Computer-Aided Design flow, namely placement. The student should have excellent programming skills (C++) and an excellent knowledge of software development, algorithms and data structures.

Strong programming skills, including C++. Strong knowledge of software development, data structures and algorithms. Desire to learn more about the Computer-Aided Design flow and the algorithms used in this flow.

The student would have a lead role in the software development of advanced algorithms for the placement of integrated circuits. The student would be involved in the benchmarking of the developed algorithms on well-known placement problems to compare against other placement tools.

Maren Oelbermann, Environment and Resource Studies
Changes in soil microbial community structure after six years of maize-soybean intercropping

In the real world, people don't always make optimal decisions. While experts are said to make good decisions most of the time in the blink of an eye (re. Gladwell's 'Blink'), we cannot be experts in every subject, nor are we typically immune to the influence of emotion and tricks of the mind on our decisions. In fact, making bad decisions is common in construction management and it can be costly. In order to avoid these costs, a way of identifying and mitigating the human aspect of decision making is needed. If the human aspect is unavoidable, but it can be predicted, then we can use that knowledge to improve our decision processes. If the associated costs are still unavoidable, it is important to gain an understanding of where these costs will arise in order to quantify contingencies. Traditional economics assumed that the world was populated by logical, rational, calculating people. It did not take into account the human tendency to repeatedly make incorrect decisions while believing that they are correct. One of the most dominant deviations from traditional economics is called loss aversion. Behavioural economics helps us understand such deviations. It seeks to improve economics by incorporating irrational human decision making into economic models. Unfortunately, there has been no work on the decisions that are specific to construction engineering projects; for example, on how the benefits and costs of working are evaluated on a day with a 40% chance of rain. Although bad economic decisions are a common occurrence, it is fortunate that these illogical decisions often occur systematically. One example was studied by behavioural economist Dan Ariely. Ariely studied the human tendencies involved in cheating. The decision about whether or not to cheat is, in theory, a simple cost-benefit analysis where the cost is the penalty for getting caught, and the benefit is what is gained from successfully cheating. Contrary to what is expected, people did not act as the cost-benefit analysis indicates they should. There were many different contributing factors to this, but Ariely found that people acted irrationally in a consistent and predictable manner. Behavioural economics offers the potential to gain some insight into the many types of decisions made during construction that depend on human judgment rather than calculations.
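Loss aversion, mentioned above, is commonly formalized with a prospect-theory value function in which losses loom larger than equal-sized gains. The sketch below uses the standard functional form with commonly cited illustrative parameters (not values fitted to construction data) and applies it to the rain-day example from the text; the payoffs are hypothetical.

    # Minimal sketch of a prospect-theory value function (loss aversion).
    # Parameters are commonly cited illustrative values, not construction data.
    def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
        """Subjective value of a gain/loss x relative to a reference point;
        lam > 1 weights losses more heavily than gains."""
        return x ** alpha if x >= 0 else -lam * ((-x) ** beta)

    # A work day with a 40% chance of rain (hypothetical payoffs in dollars).
    gain_if_dry, loss_if_rain, p_rain = 1000.0, -800.0, 0.4
    expected_money = (1 - p_rain) * gain_if_dry + p_rain * loss_if_rain
    expected_value = ((1 - p_rain) * prospect_value(gain_if_dry)
                      + p_rain * prospect_value(loss_if_rain))
    print(f"expected dollars: {expected_money:+.0f}, prospect value: {expected_value:+.1f}")

With these numbers the monetary expectation is positive while the felt value is negative, which is the kind of systematic deviation the project would look for in field data.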
Based on the lack of behavioural economics investigations that are specific to construction management, field data regarding construction decisions should be analyzed in order to incorporate a behavioural perspective into construction management. This would require: (1) Identification of key types of decisions that may be modeled with behavioural economics, (2) Formulation of experiments that would scientifically validate one or more models of such decision types, (3) Modification of one or more key decision processes based on this new knowledge, (4) Scientific examination of whether the modified processes result in improved decisions, and (5) If improvements occur, recommendation of methods for widespread education and deployment.

Student should be in one of the following programs: Aerospace Engineering/Mechanical Engineering/Applied Mathematics/Computational Mathematics/Astrophysics, or similar. Required: Student should have taken courses in computational fluid dynamics, nonlinear numerical PDEs, or hyperbolic conservation laws, and should be familiar with the compressible Euler equations of gas dynamics and with finite volume methods. Student should have expert knowledge in C++ programming. Student should have interest in parallel computing and fluid dynamics modelling applied to extrasolar planet atmospheres. Student should be ranked in the top 10% of his/her class.

The student will become a member of a small research team that is composed of several graduate students and a postdoc. The first step in this project will be to become familiar with the advanced parallel simulation code (written in C++ with MPI message passing) and the papers on the 1D atmosphere models. The student will have access to parallel computing clusters provided by the SHARCNET and SciNet consortia to perform the 3D simulations, and will learn how to run the codes there and how to visualize the results, using existing test cases in the code. The next step will be to implement the atmospheric heating models from the 1D papers into the 3D code, and to perform 3D simulations to explore scientific questions on the atmospheres of extrasolar planets.

Sebastian Fischmeister
Computer System Benchmarking

Decision makers need solid data to make the right decision about technology. This is especially important when deciding whether the company should embrace a new technology or innovation. The quality of an innovation is then usually measured through benchmarking. Unfortunately, current benchmarking mechanisms for the embedded systems domain suffer from many pitfalls. A proper benchmark must (1) be reliable and exercise the system elements that are affected by the innovation, (2) be repeatable for third parties, and (3) be robust so tweaking does not change the conclusions. Recent work shows that these requirements are challenging as, for instance, subtle changes in the environment affect performance (such as the user name, hardware performance counters, hardware monitoring). This results in problems when repeating experiments, such as during regression tests to ensure that a new hardware feature, a new device driver, or a new compiler option does not negatively affect performance.

- Operating Systems - Unix - Scripting languages (Bash, Python)

The intern will assist in the further development of the benchmarking infrastructure built here at the University of Waterloo.
This can include theoretical topics, such as developing a new scheduler for distributing benchmarks on a heterogeneous cluster, or systems topics, such as identifying a hidden factor that invalidates results published at famous conferences.

Wojciech (Voytek) Golab
Software Techniques for Next-Generation Multi-Core Systems

In order to harness the power of parallel architectures, software designers must synchronize threads that concurrently access shared resources such as in-memory data structures. The project will focus specifically on solving this problem in the context of multi-core computers, which provide multiple processing elements or cores attached to a pool of shared memory. Growing core counts and disruptive innovations, such as hardware transactional memory (HTM), continue to drive research in this area, and render many classic techniques obsolete. The specific goal of this project will be to design, prototype, and evaluate novel synchronization algorithms suitable for systems that incorporate tens to hundreds of cores bound together by a cache coherence protocol. Innovations arising from this work may be used to accelerate real-world applications such as in-memory data management and analytics systems, and may be of interest to a variety of technology and Internet companies including Hewlett-Packard, IBM, Google and Amazon.

The ideal candidate will be a self-motivated, enthusiastic team player who is interested in pursuing graduate studies. Strong programming skills are required. Familiarity with C/C++, the Linux environment, and the pthread library specifically are definite assets. Experience with rigorous proofs of correctness is a plus.

The student will be involved hands-on in the design, implementation, and experimental evaluation of synchronization algorithms. The implementation and evaluation phase may involve incorporating a synchronization algorithm into an existing open-source software system. Depending on the student's interests, the student may be involved in writing and verifying inductive proofs of correctness. The student will develop competencies in parallel programming and rapid prototyping using C/C++ or Java - skills essential for success both in graduate studies and in the industry. Furthermore, the student will gain practice in careful thinking and reasoning about the correctness of algorithms.

Haas, Civil and Environmental Engineering
Behavioural Economics Applied to Construction Project Engineering and Management

The goal of this project is to simulate the supersonic escape of hydrogen from the atmospheres of extrasolar planets. The 3D simulations will be performed using an advanced parallel and adaptive simulation code for the Euler equations of gas dynamics that has recently been developed in the host group (see reference [1]). This work will be an extension to 3D of 1D models that were developed in the host group for the atmospheres of extrasolar planets and the Earth [2,3]. The simulation will be performed in a domain between two concentric spheres (outward from the planetary surface), and it will use a so-called 'cubed-sphere' grid, which is obtained by taking a Cartesian grid between two concentric cubes and 'inflating' the grid to obtain concentric shells (see [1]) between the inner and outer concentric spheres.
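As a toy version of the finite-volume machinery used in the 3D code, the sketch below advances the 1D compressible Euler equations with a first-order local Lax-Friedrichs (Rusanov) flux on a shock-tube test. It illustrates the method only; it is not the host group's parallel, adaptive cubed-sphere solver and contains no atmospheric heating terms.

    # Minimal 1D finite-volume sketch for the compressible Euler equations
    # (first-order Rusanov flux; illustrative only).
    import numpy as np

    GAMMA = 1.4
    N, DOMAIN, t_end = 200, 1.0, 0.1
    dx = DOMAIN / N
    x = np.linspace(0.0, DOMAIN, N)

    # Conserved variables U = [rho, rho*u, E]; Sod-type initial condition.
    rho = np.where(x < 0.5, 1.0, 0.125)
    u = np.zeros(N)
    p = np.where(x < 0.5, 1.0, 0.1)
    U = np.stack([rho, rho * u, p / (GAMMA - 1) + 0.5 * rho * u**2])

    def flux(U):
        rho, mom, E = U
        u = mom / rho
        p = (GAMMA - 1) * (E - 0.5 * rho * u**2)
        return np.stack([mom, mom * u + p, (E + p) * u]), np.abs(u) + np.sqrt(GAMMA * p / rho)

    t = 0.0
    while t < t_end:
        F, c = flux(U)
        dt = 0.4 * dx / c.max()                          # CFL-limited time step
        UL, UR, FL, FR = U[:, :-1], U[:, 1:], F[:, :-1], F[:, 1:]
        smax = np.maximum(c[:-1], c[1:])
        Fhat = 0.5 * (FL + FR) - 0.5 * smax * (UR - UL)  # Rusanov interface flux
        U[:, 1:-1] -= dt / dx * (Fhat[:, 1:] - Fhat[:, :-1])   # update interior cells
        t += dt

    print("final density range:", float(U[0].min()), float(U[0].max()))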
The main steps in this project will be to become familiar with the advanced parallel simulation code (written in C++) and the papers on the 1D models ([2,3]), to implement the atmospheric heating models from the 1D papers [2,3] into the 3D code, and to perform 3D simulations to explore scientific questions on the atmospheres of extrasolar planets. Our previous 1D models have shown convincingly that planetary atmospheres can be in a state of continuous supersonic expansion when they are strongly irradiated by their parent stars. We propose to extend this work to 3D, which means that we can take many important effects into account that were neglected in our 1D models, including nonuniform heating and rotation. We will investigate the influence of rotation on the region of maximal temperature in exoplanet atmospheres. We will predict where this highest-temperature region is located relative to the exoplanet substellar point, which may be tested observationally by measuring the location of the maximum in the infrared lightcurve of the exoplanet. These 3D simulations will provide estimates of escape rates that are more realistic than our crude earlier 1D results. The 3D planetary outflow results will provide much more realistic flow patterns and outflow rates than our earlier 1D results. Exoplanet atmospheres are a 'hot topic', and we anticipate that these 3D results will lead to a paper in a high-profile research journal. References (can be downloaded from Prof. De Sterck's website): [1] L. Ivan, H. De Sterck, S.A. Northrup, and C.P.T. Groth, 'Hyperbolic Conservation Laws on Three-Dimensional Cubed-Sphere Grids: A Parallel Solution-Adaptive Simulation Framework', submitted to Journal of Computational Physics, 2012. [2] F. Tian, O.B. Toon, A.A. Pavlov, and H. De Sterck, 'Transonic Hydrodynamic Escape of Hydrogen from Extrasolar Planetary Atmospheres', Astrophysical Journal 621, 1049-1060, 2005. [3] F. Tian, O.B. Toon, A.A. Pavlov, and H. De Sterck, 'A Hydrogen-Rich Early Earth Atmosphere', Science 308, 1014-1017, 2005.

* Excellent written and oral communication skills * Good background in software engineering * Familiarity with the Ruby programming language, the Rails / Hobo frameworks * Familiarity with security and privacy engineering * Familiarity with the cloud computing paradigm and mobile platforms * Good grasp of the Unified Modelling Language

Conceptual designs of the research software platform exist. The role of the student will be to develop a detailed design based on these conceptual designs, to implement a subset of the detailed design in a cloud-based prototype software system, and to help evaluate the prototype in cooperation with domain experts, technology specialists and clinical stakeholders.

Jon Willis
Galaxy populations in X-ray clusters

The X-ray X-tra Large (XXL) project is an astronomical research collaboration that aims to use a large X-ray survey of galaxy clusters to answer fundamental questions in cosmology and galaxy evolution. I am a key member of the XXL project and I specialise in using multi-wavelength observations of galaxies in X-ray emitting clusters to learn about their formation and evolution.
Previous student projects associated with the XXL survey have included: a) creating a computer algorithm to identify galaxy clusters as spatial over-densities of characteristically red galaxies, as observed in optical images, in order to compare their properties to clusters identified at X-ray wavelengths; b) performing and reducing optical spectroscopy of individual galaxy clusters in order to determine their redshift; and c) studying the population mix of red versus blue galaxies in clusters with a view to understanding the physical mechanism that causes the apparent transformation from a blue, star-forming galaxy to one which is red and passive. I have a range of similar projects suitable for a MITACS student, including a) the analysis of near-infrared images of very distant clusters to learn about the physical structure of so-called brightest cluster galaxies, and b) the comparison of cluster catalogues generated using optical observations and X-ray wavelengths with a view to understanding what makes a cluster of a given mass detectable at X-ray wavelengths or not. I anticipate that the exact nature of any project to be investigated would be discussed and refined with any interested participant.

Students with a strong physics and astronomy background will do well. Computer skills such as familiarity with Unix-based programming, numerical/scientific coding (e.g. C, Fortran, IDL) and graphical plotting packages will be a great asset. Experience with astronomical techniques (e.g. optical imaging and spectroscopy) in addition to astronomical analysis packages such as IRAF or IDL will also be very useful.

The student will be the lead member of their particular research project. They will lead the reduction (if necessary), analysis and dissemination of the project results.

Feng Chang, School of Pharmacy, University of Waterloo - Kitchener
Connecting Wealth to Health for Retiring Older Adults

The project aims to increase awareness associated with health-related financial costs in aging, and to enhance the capacity to self-manage post-retirement. In the first phase, a study was conducted assessing the health- and financial-related needs of rural residents in pre-retirement (45-70 years old). The objective was to identify gaps in knowledge related to health care costs and finances, so suitable educational tools can be developed to aid individuals in decision making. In the second phase, models of the educational tools (including technological and paper formats) will be developed and tests will be conducted to make improvements before wider distribution.

Ideally the student should have the following: - Familiarity with social determinants of health, common chronic diseases of aging and management strategies - Good command of English, enjoys communicating with people, warm and personable, able to work well independently - Technical proficiency with website building and/or smartphone application building, familiar with social media applications - An interest in health services and health policy, and working with older people. If this is not possible, having some of the above attributes will be acceptable.

Depending on the background of the student, he/she will assume varying levels of responsibility associated with the project. Opportunities exist to work with local undergraduate and graduate students in pharmacy or health sciences. There will also be interactions with local healthcare professionals and the public.
Main duties can include research for content and tool development; tool design and modeling; trial testing for usability, knowledge transfer, and behaviour modification during follow-up; community engagement; coordination; and writing and presentation.

Quality of medication use among rural dwelling older adults

This project will examine the quality of medication use among rural dwelling older adults in Ontario. Indicators such as the use of high-risk medications, the prevalence of drug-related problems, and challenges faced, including adherence and medication switches, will be examined.

Research skills working with primary literature, familiarity with medical databases such as PubMed, common diseases and drugs, and medical terminology. Familiarity with, or interest in, pharmacy practice research is an asset. Good command of English and communication skills, able to read and interpret medical literature, self-motivated as a learner, organized, and able to work well independently.

Duties can include background literature searches, proposal writing, project coordination, data collection and analysis, and writing and presentation. The student will have the opportunity to work with other graduate and/or undergraduate students also on the project, and work with local healthcare professionals in the data collection phase.

Nasser Mohieddin Abukhdeir, University of Waterloo - Waterloo
Characterization of Self-Assembled Domains via Image Processing

This project aims to address the issue of pattern quantification through cross-disciplinary research and development of software tools integrating image processing techniques, computational methods, and theories of ordered phases. This research project involves extending and enhancing an existing in-house package for characterizing self-assembled domains. The package is written in a high-level programming language, Python, which is conducive to rapid problem-focused computational software development. Currently only 2D periodic self-assembled domains are processable using this package. Thus, extensions and enhancements are required for processing domains with boundaries and 3D order. This project is an exciting cross-disciplinary opportunity which could result in a concrete contribution to the general field of self-assembly research. Additionally, it will provide a promising young researcher with a novel background in a growing field of research and technology development. More information about the project and software can be found here: http://chemeng.uwaterloo.ca/abukhdeir/research/software.html#BOORL

- Some exposure to image processing techniques - Intermediate-to-advanced programming skills (Python and C/C++) - Intermediate-to-advanced computer literacy, experience with Linux/Unix environments is desirable. No prior experience/understanding of self-assembly is required.

The successful candidate will learn the fundamentals of self-assembled phases and mathematical morphology through a guided literature review. The candidate will work under the direct supervision of the PI on extensions and enhancements of an existing in-house image processing code (written in Python and C).
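Not part of the existing package, but as an indication of the kind of analysis involved in characterizing a 2D periodic domain, the sketch below estimates the characteristic spacing of a synthetic striped pattern from its radially averaged power spectrum; only NumPy is assumed and the test image is made up.

    # Minimal sketch: characteristic period of a 2D periodic pattern from the
    # radially averaged power spectrum (synthetic image; not the in-house code).
    import numpy as np

    n, period = 256, 16                                 # image size, true stripe period (px)
    stripes = np.tile(np.sin(2 * np.pi * np.arange(n) / period), (n, 1))
    image = stripes + 0.2 * np.random.default_rng(0).normal(size=(n, n))

    power = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    ky, kx = np.mgrid[0:n, 0:n] - n // 2
    radius = np.hypot(kx, ky).astype(int)               # radial wavenumber (FFT bins)

    r_max = n // 2                                      # ignore poorly sampled corner radii
    counts = np.bincount(radius.ravel())[:r_max]
    sums = np.bincount(radius.ravel(), weights=power.ravel())[:r_max]
    radial_power = sums / counts                        # radially averaged power spectrum

    k_star = np.argmax(radial_power[2:]) + 2            # skip the DC/low-k bins
    print(f"estimated period: {n / k_star:.1f} px (true: {period} px)")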
It is expected that substantial modifications and enhancements to the in-house code will be required to complete the project, which will result in the release of the software package under a free open-source license.

Coarse-grained Molecular Simulation of Liquid Crystals

This project is part of a research program that aims to enable the development of technology, specifically nanotechnology, which utilises novel smectic liquid crystals (LCs) and their inherent nano-scale structure. This research involves performing molecular simulations of smectic LCs in order to predict their phase behaviour and nano-scale structure. The methodology used involves coarse-grained molecular models of LC interactions in conjunction with Monte Carlo simulation techniques. This approach is complementary to experimental studies, which have difficulty capturing the complex nano-scale structure of smectic LCs. This exciting area of research has a broad range of possible applications, from developing the next generation of LCD technology to novel adaptive optics devices. Without a basic understanding of the structure and dynamics of novel LC phases, development of new and innovative technologies based on LCs is infeasible. This project will provide a promising young researcher with background in an exciting area of nanotechnology research, in addition to hands-on fundamental research within this field.

- University-level mathematics - Intermediate-to-advanced computer literacy, experience with Linux/Unix environments is desirable. No prior experience/understanding of liquid crystals or Monte Carlo simulation methods is required.

The successful candidate will learn the fundamentals of LC materials and molecular models of these materials through a guided literature review. The candidate will work under the direct supervision of the PI on microscopic simulations using an extension of the SPPARKS Monte Carlo code (a minimal Monte Carlo sketch is given below). It is expected that minor modifications and enhancements to the in-house code will be required to complete the project.

Fabrication and Microscopy Studies of Liquid Crystal Optical Cells

This project is part of a research program that aims to enable the development of switchable films and glazings for light-scattering applications (e.g. 'smart glass'). The research project involves both the fabrication of liquid crystal optical cells and the characterization of the properties of these cells through various optical microscopy methods. Techniques are used which enable the fabrication of relatively large surface areas with uniform thickness, composition, and minimal defects/impurities. The main aspects of this process include environment isolation, film coating, spacing, and polymerization reaction/phase separation. Once fabricated, relationships between macroscopic film properties and the microscopic structure of liquid crystal domains will be determined through optical microscopy techniques. These relationships will be used to both optimize desired properties of the liquid crystal cell and increase the fundamental understanding of the phase separation process involved in fabrication.

- Basic laboratory skills. - Experience with optical microscopy and atomic force microscopy is desirable, but not necessary for the project. No prior experience/understanding of liquid crystals or optics is required.

The successful candidate will learn the fundamentals of LC materials and optical microscopy through guided literature review and hands-on training exercises.
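For the coarse-grained Monte Carlo project above, the sketch below shows the bare bones of a Metropolis simulation of a small 2D Lebwohl-Lasher-type lattice model of a nematic liquid crystal. It is a generic illustration, not the SPPARKS-based code used in the group, and the temperature, lattice size and move width are arbitrary.

    # Minimal Metropolis Monte Carlo sketch for a 2D Lebwohl-Lasher-type
    # lattice liquid-crystal model (illustration only).
    import numpy as np

    rng = np.random.default_rng(0)
    L, T, sweeps = 16, 0.5, 200                       # lattice size, reduced temperature, MC sweeps
    theta = rng.uniform(0, np.pi, (L, L))             # planar director angle at each site

    def pair_energy(a, b):
        """Lebwohl-Lasher-type pair energy: -P2(cos of the angle between directors)."""
        c = np.cos(a - b)
        return -(1.5 * c * c - 0.5)

    def site_energy(th, i, j, angle):
        nbrs = [th[(i + 1) % L, j], th[(i - 1) % L, j], th[i, (j + 1) % L], th[i, (j - 1) % L]]
        return sum(pair_energy(angle, nb) for nb in nbrs)

    for _ in range(sweeps):
        for _ in range(L * L):
            i, j = rng.integers(0, L, 2)
            old, new = theta[i, j], theta[i, j] + rng.normal(0, 0.5)
            dE = site_energy(theta, i, j, new) - site_energy(theta, i, j, old)
            if dE <= 0 or rng.random() < np.exp(-dE / T):     # Metropolis acceptance
                theta[i, j] = new % np.pi

    # 2D scalar nematic order parameter from the mean of cos/sin(2*theta).
    S = np.hypot(np.cos(2 * theta).mean(), np.sin(2 * theta).mean())
    print(f"nematic order parameter S = {S:.2f}")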
The candidate will work under the direct supervision of the PI and in collaboration with other undergraduate and graduate students working in similar areas on the project. Redox-enhanced Electrochemical Surface Switching of Liquid Crystals The orientation of molecules within an LC gives rise to anisotropic electro-optical properties and depends sensitively on the confinement geometry and chemical functionality at the LC/solid interface. While changes in molecular orientation of the LC far from the electrode surface can be modified by application of an electric field, LC regions in close proximity to the surface remain unaffected by this stimulus. However, functionalization of an electrode surface with electrochemically-active molecules, such as ferrocene, has been found to drive the surface orientation of LCs, enabling a new way to couple LC orientation to chemical and electrical stimuli. This project involves the fabrication and experimental verification of LC-based chemical sensors that exploit surface-driven switching of LC molecular orientation using redox-active molecules. Fabrication will involve construction of gold electrodes decorated with redox-active groups and electrochemical studies of these electrodes with various LCs. Optical characterization of the electrodes will also be necessary to determine the orientation of molecules within the LC. Although no prior experience with liquid crystals is necessary, the applicant should possess some knowledge of electrochemistry and have course-related laboratory experience in building and measuring electrochemical cells such as batteries or cells for metal deposition. The applicant should also be comfortable applying basic scientific techniques and in working with redox-active compounds. The student will have the opportunity to gain hands-on experience in building and testing LC-based systems for chemical sensing. The student will work in a laboratory environment and will engage in surface-chemistry studies involving the functionalization of electrodes with redox-active compounds. The student will also perform electrochemical characterization of LC devices and will acquire experience using measurement equipment for analytical electrochemistry. Characterization of the LC devices for use in chemical sensing applications will also be carried out by the student. Marc Aucoin Rational probing of media additives for the production of biotherapeutics This project will look at characterizing hydrolyzed protein fractions and their utility as media additives in cell culture (e.g. hybridoma, CHO or insect cells). The driving hypothesis of this work is that there are fractions that are susceptible to proteolysis/degradation in a manner dependent on the concentration of the degradation products, which causes these fractions to act as controlled release/feeding devices in culture. It will be the goal of this project over the course of the summer to isolate fractions, characterize their composition and confirm whether any of the characterized fractions behave in this manner. The student should have an ability to work in a lab and follow established protocols for the fractionation and hydrolysis of protein/peptide fractions (micro-filtration, nano-filtration, enzyme and/or acid treatments) and their characterization (NMR). The student should be able to work meticulously and have great attention to detail. 
The student should have an ability and willingness to learn new and varied laboratory techniques - including working with animal cell culture. Under the supervision of the faculty member and a PhD candidate, it will be the goal of this project over the course of the summer to isolate fractions, characterize their composition and confirm whether any of the characterized fractions behave as controlled feeding devices. Purification of dual modality nano-carriers using nanofiltration This project will look at the purification of dual-modality nano-carriers based on adeno-associated virus-like particles. In this work we are trying to develop a nano-sized (25 nm) drug delivery device that can also be imaged after being administered. We believe that an adeno-associated virus-like particle - a virus that does not contain any genetic material - can be used in such a fashion. A significant challenge remains, however: that of purification of these particles. We also believe that an efficient purification scheme can be developed based on nano-filtration, which uses both the principles of charge and size to separate species. The student should have an ability to work in a lab and follow established protocols for nano-filtration and an ability to manipulate process conditions to tailor the separation for the particles of interest. The student should be able to work meticulously and have great attention to detail. The student should have an ability and willingness to learn new and varied laboratory techniques - including working with animal cell culture, and carrying out SDS-PAGE and Western blotting analysis. Under the supervision of the faculty member and a PhD candidate, it will be the goal of this project over the course of the summer to determine how well the aforementioned particles can be recovered and the resulting purity of the particles. Influenza virus-like particle production: debottlenecking insect cell culture at the molecular level For the past several years, our group has been investigating ways to increase the flexibility of multiple protein expression in insect cells using polycistronic baculovirus vectors. We believe that by using the native temporal control on protein expression offered by the baculoviruses, we can stagger protein expression and ultimately increase overall product yield. One such system that can benefit from this approach is the production of a two (HA, M1) or three (HA, NA, M1) protein influenza virus-like particle. The student should have an ability to work in a lab and follow established protocols for operating a flow cytometer - a major tool used in our investigations. The student should be able to work meticulously and have great attention to detail. The student should have an ability and willingness to learn new and varied laboratory techniques - including working with animal cell culture, carrying out SDS-PAGE and Western blotting analysis, and potentially cloning. Experience in molecular biology techniques is an asset. Under the supervision of the faculty member and a PhD candidate, it will be the goal of this project over the course of the summer to evaluate the expression of protein from different polycistronic baculovirus vectors using flow cytometry. 
Overall levels and ultimately dynamics will be retrieved from this data. Amelia Clarke School of Environment, Enterprise and Development Sustainable Communities - Sharing Local Knowledge Globally An outcome of the 1992 United Nations Conference on Environment and Development (UNCED), "the Rio Earth Summit," was Agenda 21, a comprehensive plan for global, national, and local action on sustainable development. Twenty years later, the 2012 United Nations Conference on Sustainable Development (UNCSD), "Rio+20", focused on a "green economy" agenda. The goal of this research project is to help local governments around the world more effectively implement Local Agenda 21s (LA21s), or other community sustainability plans, and transition toward a local green economy. The student should have training in urban sustainability, strategic management, and/or green economy. The student must be fluent in either Chinese or Spanish (in addition to English). Skills in quantitative research and SPSS are also ideal. Assist the team with the international survey - both data collection and analysis. Additional qualitative data collection in either China or Spanish-speaking countries (depending on language ability) related to local sustainability strategies and their implementation. Other related tasks as needed. Hans De Sterck Applied Mathematics Advanced algorithms for social network analysis with Hadoop/MapReduce Social network companies like Facebook, LinkedIn and Twitter make extensive use of the distributed computing facilities of the Hadoop/MapReduce framework to operate their business and analyze their user data in a scalable way. In this project you will help to develop algorithms for social network analysis that are inspired by numerical linear algebra techniques, implement them into the Hadoop/MapReduce framework, and experiment with their use for social network analysis. Due to limitations in Hadoop and the MapReduce framework, the algorithms currently used are often mathematically quite simple, and the goal of this project is to explore the use of more sophisticated algorithms. The challenge is to find clever ways to fit those more sophisticated algorithms into the Hadoop/MapReduce framework (which may require enhancing the framework) and to do this in a way that maintains scalability for very large datasets. Student should be in one of the following programs: Computer Science/Applied Mathematics/Computational Mathematics/Electrical Engineering/Computer Engineering, or similar. Required: Student should have taken courses in numerical methods, in particular numerical linear algebra, and should have expert knowledge in Java or C++ programming. Student should have an interest in parallel and distributed computing and social network analysis. Student should be ranked in the top 10% of his/her class. The student will help to develop algorithms for social network analysis that are inspired by numerical linear algebra techniques, will implement them into the Hadoop/MapReduce framework, and will experiment with their use for social network analysis. The student will become a member of a small research team that is composed of several graduate students and a postdoc, and is directed by two faculty members (from the Applied Mathematics and Computer Science departments). 
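To illustrate how a linear-algebra-style iteration for network analysis decomposes into map and reduce stages, here is a small, hypothetical pure-Python sketch of a PageRank power iteration expressed as explicit map and reduce phases. It does not use Hadoop itself; the toy graph and the function names are assumptions made for illustration.

```python
from collections import defaultdict

# Toy directed graph: node -> list of out-neighbours (hypothetical data)
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
rank = {n: 1.0 / len(graph) for n in graph}
damping = 0.85

def map_phase(graph, rank):
    """Map: each node emits (neighbour, share_of_its_rank) pairs."""
    for node, outs in graph.items():
        for out in outs:
            yield out, damping * rank[node] / len(outs)

def reduce_phase(pairs, n_nodes):
    """Reduce: sum the contributions per node and add the teleport term."""
    totals = defaultdict(float)
    for node, share in pairs:
        totals[node] += share
    return {n: (1 - damping) / n_nodes + totals[n] for n in graph}

for _ in range(30):   # power iteration: repeat the map/reduce round until convergence
    rank = reduce_phase(map_phase(graph, rank), len(graph))
print(rank)
```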
The student will have access to dedicated Hadoop/MapReduce computer infrastructure, and real social network data will be used for testing the analysis algorithms. Three-dimensional fluid dynamics simulations of extrasolar planet atmospheres This project aims at the development of a scalable research platform for cost-effective management and coordination of a variety of research designs, including large-scale epidemiological cohort studies, augmentation of existing cohort studies, harmonization of existing measures, intensive measurement designs, and randomized clinical trials. The e-epidemiology we envision includes the use of Internet, cellular, wireless, and other electronic and remote data acquisition methods for observational and clinical trial research. Such data collection procedures can be used to augment existing cohort studies, providing additional information on exposures, lifestyle factors, and refinement of outcome measurements. An integrative and flexible research platform on which to obtain one-off and repeated surveys and assessments of health, cognition, and well-being would serve a number of purposes, with applications for applied, clinical, and basic epidemiological and interdisciplinary research. This research platform would meet current scientific needs for large-scale and intensive measurement studies of health, cognition, and behaviour and would support assessments across a range of devices. Innovative features of this research platform include: a) a privacy regulation-based data repository for encryption and data storage and access according to country-specific regulations; b) platform scalability to enable expansion for large-scale studies; c) a dynamic scheduler for adaptively regulating the frequency of assessments conditional on change in functioning; and d) flexibility of survey and assessment formats for PC, tablet, and wireless devices. The platform will be developed under an open source model that encourages development of shared survey and assessment modules to facilitate interdisciplinary and international comparative research. The research platform will be developed in partnership between researchers in the Department of Computer Science (Software Engineering) and the Department of Psychology at the University of Victoria, in close collaboration with researchers at the School of Medicine at Cardiff University (UK) and the Vancouver Island Health Authority (VIHA) as one of the knowledge translation partners. The project will contextualize and integrate prior research results that have been established in these separate fields. The intern will use advanced software engineering tools and techniques to help design and implement the first version of the research platform in tight collaboration with domain experts and stakeholders. The system will need to meet strict privacy and security requirements, which will have to be implemented in the design and verified for compliance. The platform will integrate with a variety of other Web services and eHealth systems, such as clinical information systems, personal health records, and social health networks. There are no prerequisite skills; it is understood that the interested student will learn all aspects of the design and implementation during the project. Of course, a student interested in such a project would be keen to learn about optics, spectroscopy, control of mechanical components, and some basic computer programming. 
No previous experience in any of these areas is assumed or required. The MITACS student will first survey the relevant literature, and learn some basic aspects of existing designs related to the instrument being constructed. The student will then assemble the optical bench, perhaps in a step-wise manner, making sure the signals are what they should be as the complexity increases with added components. Basic software will be developed to control the instrument, and perhaps some custom hardware. Finally, some initial test studies will be performed, using the newly-built instrument to investigate protein adsorption at the solid-liquid interface. Nigel Livingston CanAssist Technology to improve the quality of life of those with disabilities The project will focus on the development of computer interfaces, tools and applications that are directed at improving the quality of life and independence of persons with disabilities. The suite of tools under development includes: 1) Task management and organizational apps, and journal (record keeping and diary) applications that support individuals with cognitive challenges - including those with brain injuries and those with early stage dementia. 2) A simple and automated Skype interface (canConnect) that allows people unfamiliar with, or unable to use, computers to connect (via audio and video interaction) with family, friends or caregivers using Skype. 3) A wayfinding and navigation tool (CanGo) that allows users (for example those with cognitive challenges or those with visual impairments) to use the public transit system to travel to and from school or the workplace, or to visit friends. Students must have experience with software development. Ideally, students will have experience developing mobile applications for either the Apple (iOS), Android, Blackberry, or mobile web (HTML5 and JavaScript) platforms. It would also be beneficial if students have experience with user interface development. We hope to attract students who have had some experience in the disability field. Students will be engaged in many aspects of the research, from interviewing clients (users), to drawing up requirements documents for individual modules, to the development and testing of code. Wherever possible, students will have direct contact with users, both in determining their needs and evaluating outcomes. Students will be part of an enthusiastic and interdisciplinary team that includes mechanical and electrical engineers, psychologists, neuroscientists and computer scientists. McIndoe The mechanism of palladium-catalyzed decarboxylative coupling using mass spectrometry Design and synthesis of a charge-tagged substrate for decarboxylative coupling. Such a substrate will contain ester, alkynyl and allyl functional groups in addition to the charge tag. Preliminary results show that these reactions can be studied in great detail by our continuous monitoring methods, and accurate kinetic data obtained. Additionally, these reactions produce interesting byproducts, which with a firm understanding of the mechanism may be selected for by judicious choice of reaction conditions and catalyst. These represent new and potentially useful catalytic transformations that will add new tools to the synthetic chemist's arsenal of methods. 
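For the continuous-monitoring kinetics work in the decarboxylative coupling project above, concentration-versus-time traces are typically fitted to a candidate rate law. The sketch below shows one hypothetical way such a fit could be done: a pseudo-first-order decay fitted to made-up normalized intensity data with scipy. The data, rate law, and numbers are illustrative assumptions, not results from the project.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical normalized intensity of the starting material vs time (s)
t = np.array([0, 60, 120, 240, 480, 900, 1800], dtype=float)
s = np.array([1.00, 0.86, 0.74, 0.55, 0.31, 0.12, 0.02])

def first_order(t, k):
    """Pseudo-first-order decay of the starting material: [S]/[S]0 = exp(-k t)."""
    return np.exp(-k * t)

(k_fit,), cov = curve_fit(first_order, t, s, p0=[1e-3])
print(f"k = {k_fit:.2e} s^-1, half-life = {np.log(2) / k_fit:.0f} s")
```

In practice one would compare fits of several candidate mechanisms (and simulate the full set of coupled rate equations for intermediates and byproducts) rather than assume a single-step decay.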
The student will build the charge-tagged substrate, characterize it fully by a range of spectroscopic techniques, and examine its reactivity under standard reaction conditions (for decarboxylative coupling, this is typically in the presence of a palladium catalyst at high temperature) using our pressurized sample infusion (PSI) method in conjunction with continuous analysis by electrospray ionization mass spectrometry (ESI-MS). Concentration versus time data will be extracted and normalized for all species, including starting material, product, byproducts, and intermediates. A combination of numerical modeling and chemical intuition will help us establish a mechanism, which will be tested by altering experimental conditions in a rational fashion. New reactions will be investigated fully, and conditions altered to enhance the yield of these byproducts if they look to be of potential utility in synthesis. Basic synthetic lab skills. Good knowledge of Excel. Experience with analytical instrumentation (any type of spectroscopy is fine; training in ESI-MS will be provided). The student will spend time in the laboratory (synthesis), and collecting data using a variety of analytical tools (which they will be trained to use, and will include NMR spectrometers and especially mass spectrometers). They will be expected to process and analyze this data, and to use the results of the numerical modelling (depending on their skill set and aptitudes, they may have a hand in this work as well, but it is by no means necessary for a productive project) to pursue optimization of reaction conditions and to help establish a reasonable and useful catalytic mechanism. Chris Papadopoulos Nanostructured solar cells Semiconductor nanoparticles, or quantum dots, will be used in this research project to create nanostructured photovoltaics for large-area solar cells. The resulting nanostructured films will be characterized and studied in order to create the most efficient solar cells possible, with an eye towards potential applications in low-cost photovoltaics. Semiconductor nanoparticles will be produced and studied via microscopy and spectroscopy. The particles will then be incorporated into solar cell device structures and their photovoltaic properties examined. The resulting photogenerated currents and voltages will be measured and characterized by examining the solar cell I-V characteristic. We plan to improve the properties of the nanostructured solar cells by focusing on two related fronts: (i) nanoparticle morphology and distribution and (ii) nanoparticle surfaces and interfaces within the solar cell structure. The interfaces between the nanostructure surfaces and the cell contacts, and the nature of the built-in electric fields, will ultimately determine solar cell performance, and providing an efficient charge generation and collection mechanism is one of the prime challenges in the development of these photovoltaic materials. The shape and distribution of the nanoparticles within the film will also be optimized to increase solar cell efficiency. Experimental skills are an important asset. Some experience with materials processing (wet chemistry) and electronic characterization (current-voltage measurements) will be beneficial. Student will be involved in materials synthesis and electrical-optical characterization of nanostructured photovoltaic materials. On the synthesis side, this will include solution processing of nanostructures, e.g., semiconductor quantum dots. 
The characterization work will involve using precision source-measure units to determine the electrical and photovoltaic properties of the nanostructures in combination with nanoscale microscopy. Yang Shi Department of Mechanical Engineering Distributed Optimization and Control for Networked Complex Dynamic Systems: Application to Multiple Autonomous Vehicles A networked multi-agent system, being composed of multiple interacting dynamic agents connected via communication networks, can be used to solve problems which are difficult or impossible for an individual agent. Application examples include: a team of mobile robots for land mine search; unmanned aerial vehicles for surveillance; and underwater vehicles for undersea exploration. These are all critical areas of importance to maintaining Canada's prominence in the global knowledge-based economy. The primary objective is to make the agents behave consistently, the so-called consensus problem, by designing a networked cooperative control strategy. However, communication networks and complex agent dynamics present tremendous challenges to control engineers and designers, such as network-induced random delays and packet losses. Existing control design methodology has not kept pace with the technology - most cooperative control theory and practice ignore the aforementioned practical communication constraints, which may significantly degrade performance. The research project will contribute to the fundamental understanding and unified design framework of such networked multi-agent systems that operate in complex, unstructured environments and over unreliable communication networks. This research will integrate communication and cooperative control in ways that will both increase the cooperative control performance and the ability of involved agents to work together in complex tasks. The proposed research program will fill the gap between theory and practice; it will provide control engineers with new tools for analysis and synthesis of the cooperative control for networked multi-agent systems; it will establish a novel unified paradigm addressing practical constraints. Furthermore, the program will greatly benefit graduate students through technology- and industry-relevant research training. The potential Globalink students are expected to have the following skills/experience: (1) Have taken a fundamental control course. (2) Have some programming experience with Matlab/Simulink. (3) Research experience in control systems would be an asset. The proposed project aims to bring control theory to real-world applications for the Globalink students, and train the applicants with an in-depth understanding of advanced control theory. Specifically, the candidate will carry out the following research while closely working with the supervisor and other lab members: (1) To learn the new cooperative networked control scheme, and then apply it to the experimental multi-agent system. (2) To set up the Matlab/Simulink model of the network-based multi-agent systems consisting of helicopters and mobile robots. (3) To test the new control schemes on the developed simulation model. (4) To conduct the experimental tests. (5) To write the research report. Issa Traore Mobile User Authentication Using Behavioral Biometrics Mobile applications and devices are playing a major role in modern communications. The number, variety, and quality of devices being used around the world have increased dramatically. 
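Relating to the consensus problem in the networked multi-agent control project described above, the following is a minimal numerical sketch of a standard discrete-time consensus iteration, in which each agent repeatedly moves its state toward its neighbours' states. The graph, initial states, and step size are assumptions chosen only for illustration; the project's actual schemes additionally handle delays and packet losses.

```python
import numpy as np

# Hypothetical undirected communication graph over 4 agents (adjacency matrix)
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
x = np.array([1.0, 4.0, 2.0, 7.0])   # initial agent states
deg = A.sum(axis=1)
eps = 0.25                           # step size; eps < 1/(max degree) ensures convergence

for _ in range(100):
    # x_i <- x_i + eps * sum_j a_ij (x_j - x_i)
    x = x + eps * (A @ x - deg * x)

print(x)   # all entries approach the average of the initial states (3.5)
```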
According to Gartner, they will surpass personal computers for accessing the Web as early as 2013. Likewise, many end-users currently rely on mobile devices as their primary computing platforms. However, the level of security offered by mobile communications has been lagging behind that offered by wired devices. Although almost all devices provide a PIN to restrict access to legitimate users, PINs can be stolen, shared, or forgotten. But even worse, most users simply do not use their PINs, because they find them impractical. Our goal in this project is to develop new ways of securing mobile devices that would be transparent to the user and more reliable. Our lab has performed pioneering work in the areas of behavioral biometrics technologies using mouse and keystroke dynamics. We are currently working on extending and adapting our algorithms to the area of mobile authentication based on data collected through touch screen devices. There are various types of data that can be extracted from mobile user interactions and various sensors intrinsic to mobile environments that we believe can be used for biometric authentication. The intern will assist in developing tools to collect and analyze such data in order to extract relevant features. The proposed research will enable the development of stronger technologies to secure mobile devices and protect the identity and assets of mobile users. Programming skills in at least one of the major programming languages, such as Java, C++, or C#. The student will assist in implementing some of the data acquisition tools, and possibly in collecting and analyzing data samples. Jens Weber Heads in the Cloud (HitC) - A Cloud-based Service Platform for e-Epidemiological and Intervention Research The proposed project involves the design and construction of a real-time broadband infrared Mueller-matrix ellipsometer for the study of protein adsorption at the solid-liquid interface. By preparing an incident light field in a well-defined polarization state, and then characterizing the change in polarization that occurs upon interacting with a sample (for example, in a reflection geometry), it is possible to learn about many structural details of the molecules that interacted with the beam. This field in general is called polarimetry or ellipsometry. A recently-published technique from our group is capable of measuring all elements of the polarization transfer matrix, in the mid-infrared from 400-4000 wavenumbers. This is especially interesting since this region of the spectrum corresponds to molecular vibrations. As a result, analysis of the polarization fingerprints we observe can be directly related to sub-molecular features. If results corresponding to the vibration of a particular chemical functional group are analyzed quantitatively, it is possible to construct a distribution for the molecular bond orientation. If this is performed for multiple vibrational modes, we learn about the shape of molecules at the solid-liquid interface. Applications include investigating the biocompatibility of polymeric medical implants, thereby studying the propensity for protein denaturation upon adsorption at the polymer-aqueous interface. Although our existing experiment is powerful, and has demonstrated the proof-of-principle, it is slow, thereby limiting its application to many scientific problems. The MITACS student working on this project would develop an instrument with the same capability, but of an entirely different design. 
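As a rough illustration of the demodulation step in the modulated-polarization ellipsometer described above, the sketch below performs a software lock-in: a detector signal containing a component at a known modulation frequency is multiplied by reference waveforms and averaged to recover the amplitude of that component. The sampling rate, modulation frequency, and signal model are made-up assumptions, not the instrument's actual parameters.

```python
import numpy as np

fs, f_mod = 100_000.0, 1_000.0        # assumed sampling rate and modulation frequency (Hz)
t = np.arange(0, 0.1, 1 / fs)

# Hypothetical detector signal: DC level plus a component at the modulation
# frequency whose amplitude (0.2) carries the polarization information, plus noise
signal = 1.0 + 0.2 * np.sin(2 * np.pi * f_mod * t) + 0.05 * np.random.randn(t.size)

# Software lock-in: multiply by in-phase and quadrature references, then average
ref_i = np.sin(2 * np.pi * f_mod * t)
ref_q = np.cos(2 * np.pi * f_mod * t)
amplitude = 2 * np.hypot(np.mean(signal * ref_i), np.mean(signal * ref_q))
print(amplitude)   # recovers roughly 0.2
```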
Four photoelastic modulators (piezoelectric crystals) would be employed in order to modulate the light at various frequencies. Demodulation of the signals would then be performed in order to extract the optical, physical, and ultimately chemical properties of interest. The student should have the following skills: - health informatics and nursing informatics - web-based survey administration - conducting literature reviews - conducting focus groups The student will - conduct reviews of the literature - conduct a web-based survey - conduct focus groups and participate in the analysis of the data - be involved in tool development Reuven Gordon Optical Trapping of Nanoparticles and Optical Antennas The student can choose between two sub-projects. The first involves our recent discovery of optical trapping of nanoparticles that is 10000 times more efficient than past methods. With this improvement we can trap, detect and manipulate even single proteins. We are studying the capability of this unique trapping configuration, which will be the research task of the student. The second project involves design of antenna-like structures for the visible-infrared regime. These antennas will be of interest to single photon sources and enhanced photovoltaics (solar cells). Good physics background, especially in optics and properties of materials. Interest in nanotechnology. Capability with computers. Good with their hands (setting up experiments). Students can assist in optical trapping setup modifications, running different trapping experiments, and interpretation of data. Students can also run simulations for the optical antenna project, and potentially create devices for testing. Dennis Hore The construction of a rapid-scan mid-infrared Mueller matrix ellipsometer for the study of protein adsorption at the solid-liquid interface The proposed project provides new nurses with the knowledge, skills and competencies needed to work in clinical practice settings in British Columbia, Canada and internationally. The project will lead to the development of nurses who can not only articulate how nursing practice can be captured in health information systems (HIS) and telehealth applications, but can also use technology to enhance the safety and quality of their practice. The project will also allow for a new typical curriculum to be disseminated widely by providing faculty from BC's Schools of Nursing with curriculum components or learning activities/teaching tools that can be used to integrate informatics and information management content into nursing courses. Objectives of the Project: The objectives of the proposed project are to: (1) identify nursing informatics and information management competencies required of undergraduate baccalaureate nurses in BC, (2) identify teaching/learning activities that can be used to develop these informatics competencies within the context of a typical undergraduate nursing curriculum, (3) develop a laddered curriculum that leads to the development of these competencies, (4) pilot and evaluate the curriculum components (i.e. teaching/learning activities), and (5) use a "train the trainer" approach to disseminate the curriculum. Background: National nursing bodies have identified a critical need to integrate informatics and information management competencies into nursing curricula. 
Many undergraduate nursing informatics competencies have been identified, but they have not yet been fully identified, defined, integrated, evaluated or disseminated to Schools of Nursing internationally (Weaver et al., 2010). The development of a curriculum that supports informatics competency development is critical to today's and tomorrow's nursing practice [Institute of Medicine (IOM), 2010]. The project is unique as it has high impact and volume, having the potential to bring Schools of Nursing into the 21st century healthcare environment. Methods: The project will take place in 4 phases. Phase 1 will consist of a systematic review of the literature focusing on nursing informatics and information management competencies at the baccalaureate level. Phase 2 will involve focus groups (with key policy, professional and academic stakeholders) to identify current and future information management and informatics competencies for nurses working in BC. In Phase 3 the researchers will integrate nursing informatics and information management components into a typical nursing curriculum, and in Phase 4 they will evaluate these curriculum components for their ability to develop undergraduate nursing informatics competencies. - Objective-C or Java language knowledge - Android or iPhone development experience (junior developers) - Ability to work independently - Creative mind - Strong work ethic motivated by discovery and research The applicant will experience the full execution of a research project, including the definition of the problem, design of the solution, the development of the prototype and its evaluation and, to conclude, the production of dissemination material such as technical reports, videos and photos depicting the project's novelty. S/he will be included on a strong research team and work alongside multiple researchers, thus being exposed to a multi-disciplinary team. S/he will have direct access to the team's know-how and will be encouraged to collaborate with the team's members. S/he will receive training within the area of HCI research at one of the world's top labs. The training will include novel technologies such as depth cameras (Kinect hardware) and visualisation frameworks such as Processing or Windows Presentation Foundation. Finally, s/he will experience the production of a technical paper and companion material such as video/photos. These are valuable skills, seldom taught within the undergrad program. Further, we expect the successful co-op to conclude with a first-tier publication at ACM CHI 2014 in Toronto, Canada. Publications made at a leading venue will be fully funded for the first author. Yanqin Wu Astronomy & Astrophysics Formation of Moons around Pluto-Charon In this short project, we will carry out numerical simulations of the Pluto-Charon binary system and the moons around them. Astronomers have recently found 4 more small moons around this binary and their origins remain a puzzle. We hope to disclose their origin, and also to learn about the condition of the Kuiper belt region in the Solar system when these bodies formed. Ability to program (any language) and to run numerical simulations; a good grounding in college physics and math; strong motivation. Design and run numerical simulations, plot results, analyze results, interpret them physically, and write up a report. Shirley X.Y. 
Leslie Dan Faculty of Pharmacy Nanotechnology for biomedical applications The project concerns the preparation, characterization and in vitro evaluation of drug delivery systems. The student should have training in one of the following fields: chemistry, polymer science, biomedical engineering, nanotechnology, cell biology, pharmacology, toxicology. Depending on the background, the student may undertake synthesis and characterization of nanotechnology-based drug delivery systems or conduct in vitro evaluation of such drug delivery systems using cell culture. Elizabeth Borycki Health Information Science University of Victoria - Victoria Preparing Nurses to Work in Technology-Enabled Healthcare Environments The objective of this application is to develop a human-computer interface for mobile devices that harnesses freehand gestures and the multiple cameras already integrated in commodity devices. In particular, the focus will be on using the front and back cameras to create a complete visual picture of the surroundings of the mobile device. In turn, this visual picture will enable freehand gestures that "go around" the device, allowing users to, for example, create continuous gestures around the device. The applicant will be responsible for the exploration of the design space and the development of the framework to infer such gestures. The development will be supervised by a researcher expert in the field and the applicant will be provided with the tools and knowledge required to execute the project. The internship has a duration of 12 weeks, divided as follows: Week 1: Applicant Arrival and Setup - Introduction to the research lab. The applicant will be introduced to the lab's members and their current projects. - Applicant workspace setup. The applicant will be provided with a suitable workspace environment where most of his/her development will occur. Weeks 2 - 3: Background Overview - The applicant will be introduced to the hardware used and the respective API. - The applicant will be provided with previous related projects. - The applicant will be included in the creative process of the project and allowed to influence the project's direction in terms of technology used and how to implement the prototype. This period will finish with an applicant presentation explaining what he/she will do in the next period. Weeks 4 - 8: Development - The applicant is responsible for the hardware setup and workstations required to run the prototype. - To demonstrate the project's ideas, the applicant, in conjunction with a senior researcher, implements a prototype. This period has, as a deliverable, the final prototype, ready to be evaluated by end-users. Week 9: User Studies In order to evaluate the work developed, the applicant will assist in the execution of a user study. This includes: - Preparation of the user tests - Production of questionnaires and other material required during the execution of the user studies - Conducting the user study with end-users. This step will yield results and is one of the main objectives for a successful internship. Weeks 10-12: Media Production The applicant will also be responsible for the production of documentation and included in the production of a technical report, which will be submitted to a first-tier HCI publication. S/he will also be included in the production of other media material required for the dissemination of the work produced during the internship. 
In particular, photos depicting the technical details and ideas explored and a 10-minute video to be used as a companion to the technical report produced. (All material required for these is available at the lab.) The ideal student will have an excellent background in mathematics, operations research, optimization, and programming. A course in linear programming, integer programming or a similar optimization course is required. The student should be comfortable in both Windows and Linux environments. The student should be proficient in Matlab and/or C++. Knowledge of AMPL, CPLEX and cluster computing are strengths. Eagerness to learn new programming languages and mathematical concepts, and an excellent work ethic, are required. The student will perform the core duties associated with the two problems described above. In the first project, s/he will analyze patient data that we have from hospital collaborators and develop exhaustive search methods in Matlab to determine optimal re-optimization intervals for historical treatments. Next, the student will develop the optimization model that identifies optimal re-optimization intervals for future treatments (to be solved using CPLEX). Finally, a simple simulation model will need to be developed (in Matlab) to generate uncertainties that can be used to test the optimal adaptive treatment schedules generated by the optimization model. For the second project, the student will extend the existing optimization model to incorporate new ideas on how dose that is delivered in previous treatment days can be accounted for and used to adjust future dose targets. My group has already documented some of these ideas, so the student will be responsible for implementing them. Coding will be done in a combination of Matlab/AMPL, using CPLEX as the solver. The student will also have significant latitude to generate his/her own ideas for dose adjustments that can be tested. Large scale optimization tests may be run on a remote computer cluster, HPCVL (www.hpcvl.org). The student will work closely with me and a PhD student. Ashvin Goel Department of Electrical and Computer Engineering, Department of Computer Science Scalable and Reliable Storage Virtualization Two emerging technologies, virtualized and cloud storage, are transforming the storage infrastructure of computing systems. Both technologies enable increased resource sharing, promising more flexible, manageable, and cost-efficient use of storage resources. However, these shared storage technologies raise several challenges for storage providers. Storage systems must meet the competing and conflicting performance demands of customers. They must be designed for heterogeneous storage units with widely-varying characteristics, such as low-end and enterprise disks, and a variety of flash devices. Storage reliability and security become critical concerns because loss of data availability or data compromise can affect a large number of users, potentially damaging customers' businesses. These and other requirements, such as reducing energy consumption, result in growing complexity of shared storage systems, with significant time and effort being spent designing, customizing, and maintaining storage solutions, both at the provider and customer ends. Our key observation is that a policy-based architecture that allows flexible specification and enforcement of storage requirements is essential for managing the complexity of shared storage systems. 
We propose a storage architecture in which high-level policies allow expressing application-level requirements using a common management interface, and the system enforces these requirements by monitoring both storage requests and the dynamic characteristics of storage devices. Our policy-based storage system will adapt to changes in workload, hardware configurations, power consumption and load hot-spots, while providing reliable and secure storage to clients. It will help reduce storage provider costs, while enabling greater flexibility for their customers, thereby increasing the use of virtualization and cloud storage technologies. - Strong expertise in C - Basic knowledge of the internals of the Linux operating system - Basic knowledge of compiler design - Shell programming, Python The student will define the initial set of features for the storage policy language. The student will validate the expressiveness of the language by writing policies for storage provisioning and differentiated request handling. Time permitting, the student will refine the policy language based on experience with writing policies. The student will investigate strategies for combining multiple policies and resolving conflicts. Ben Liang Mobility management for next-generation heterogeneous wireless systems To support the ubiquitous availability of broadband applications and services, multiple network-access technologies, including the wired Internet and various wireless networks, are expected to co-exist and interoperate. This integration of heterogeneous access networks brings forth unique challenges in the design of multimedia applications and services, since no single network meets the ideal of high bandwidth, universal availability, and low cost. This project aims to provide innovative solutions to network inter-connectivity and wireless resource management, to allow efficient and transparent services to mobile users across heterogeneous networking platforms. Probability theory, linear systems, algorithms, Matlab programming To assist the faculty member and graduate students in mathematical derivation, computer simulation, and numerical experimentation. Participation in the weekly research group meeting is expected. Stochastic optimization and transmission control for multimedia streaming Multimedia streaming applications and services impose difficult constraints on network design, requiring high bandwidth, low latency, and stable data transfer for smooth playback. These challenges are compounded with the prevalence of limited-bandwidth and unstable wireless access to support free mobility. This project aims to develop new theories and algorithms for transmitting and receiving streaming data over time-varying pathways, based on a fundamental understanding of the tradeoffs among different system parameters including memory usage, playback delay, jitter tolerance, and computational complexity. To assist the faculty member and graduate students in mathematical derivation, computer simulation, and numerical experimentation. Participation in the weekly research group meeting is expected. Multihop cooperative wireless communications Collaborative network participants can jointly achieve advanced networking functions beyond simple relaying of data packets. They enable distributed network reconfiguration and autonomous tuning of software and hardware among peers, to support diverse and evolving application requirements and networking environments. 
However, the combination of mutual interference, network scale, decentralized control, and possible multihop radio instability brings new challenges to the paradigm of intelligent collaboration in future generation wireless networks. We conduct research to create new theories and technologies toward promoting intelligent collaboration among the peer devices in a wireless system. This project aims to achieve efficient provisioning of resources and services, leveraging the benefit of the joint-communication and joint-processing power of multiple wireless devices in proximity. Nades Palaniyar Laboratory Medicine and Pathobiology Reducing NETosis for improving lung health Background, knowledge gap and therapeutic value: Lung disease is the primary cause of the morbidity and mortality of patients with cystic fibrosis (CF). One of the major culprits of the CF airway diseases is chronic inflammation associated with dying neutrophils, accumulating DNA and colonizing Pseudomonas aeruginosa. However, the factors and pathways that regulate the pathological changes, particularly those relevant to neutrophil death, in CF airways are unclear. NETosis, or the death of neutrophils by forming neutrophil extracellular traps (NETs), is a recently identified form of cell death. Our data strongly suggest that dysregulation of NETosis contributes to the accumulation of DNA and neutrophilic cytotoxic by-products in CF lung disease. We aim to inhibit this pathway to treat CF lung disease and to prevent the deterioration of lung health. The overall objectives of our project are to identify the host and microbial factors that regulate NETosis in the inflamed lungs and to devise therapeutic strategies to suppress NETosis. Specific aims and experimental approaches: (i) The first aim is to determine the key factors (host and microbial components) that regulate NETosis. We will use wildtype mice and a mouse model with CF-like lung conditions to identify the host factors. To identify the bacterial factors, we will use clinical strains of bacteria and their components in NETosis assays. (ii) The second aim is to elucidate the relevance of these factors in CF lung disease and to suppress NETosis in CF-like mouse airways. We will use human (healthy, CF) blood, BAL and sputum samples and in vivo mouse models to achieve this aim. Significance: Identifying molecules that suppress NETotic neutrophil death in the CF airways could help to devise novel therapeutics for treating CF lung disease. This project is suitable for a student who has a strong background in immunology and microbiology. The student should be comfortable with basic biochemical assays and standard laboratory procedures. Previous experience in tissue culture or primary cell cultures is highly desirable. Good working knowledge of fluorescence microscopy would be helpful. The student will participate in conducting one of the specific aims, primarily using in vitro experimental setups. The student will isolate and purify host factors from lung washings and verify the purity of the factors by SDS-PAGE, Western blots, ELISAs and mass spectrometry. These purified components will be added to the NET-forming neutrophils. The effect of these factors on NETosis will be determined by monitoring fluorescence readouts. The results will be confirmed by fluorescence microscopy. 
The student may also test the effect of microbes and microbial components in inducing NETosis using similar procedures listed above. Daniel Wigdor Department of Computer Science Development for the Symphony of Devices The multitude of computing devices owned (and carried) by any given person is increasing. There is a clear need and opportunity to not simply replicate experiences across form factors, but rather to enable applications to easily span form factors. In the future, it is easy to anticipate that experiences will seamlessly grow and shrink by annexing nearby displays and input devices: the Netflix of the future is not one that can simply play the same movie on multiple devices. Rather, it is one where any device in a viewer's pocket can serve as a remote control and supplementary information display for the screen showing the film. Two people trying to find a common date for a meeting will be able to effortlessly show an overlap of their calendars on a nearby screen. A user sitting at his laptop should be able to easily slide a table-of-contents page of a document to his iPad, and use it as an index to select which pages are shown on the PC. We term this personal computing experience the Personal Symphony of Devices (PSoD). This will be an intensive software development internship. The applicant will be iterating on our existing toolkit, and designing new applications for simultaneous use on multiple devices. Development experience in native languages is essential (e.g. Objective-C, C++). Experience with mobile development is desirable. Experience with Javascript and HTML5 application development (not simply webpage creation) is also highly desirable. A track record of creative thinking, the ability to work independently, and a desire to do something truly original are key. The student will be developing an application of their own design for the Symphony of Devices project. They will research, brainstorm, and design their application, with the help of the faculty advisor, post doc, and graduate students. They will then begin development using Javascript and HTML5, utilizing our existing toolkit to build their application. In those places where the toolkit is insufficient to meet their needs, they will switch to developing the toolkit. This pattern will continue, while the toolkit and application evolve together. Ad hoc multi-device application experiences in the Symphony of Devices The research project is concerned with the design, implementation, and evaluation of post-WIMP interaction for ubiquitous computing environments with multiple co-located devices and/or users. The goal is to create user interface designs and technologies that enable the ad-hoc creation of a community of co-located devices (e.g., smart phones, tablet PCs, interactive tabletops, high-resolution screens) that serves a single user or a group of users during collaborative tasks. The scenarios for the collaborative activity can range from home entertainment to creative design or command & control. As opposed to previous approaches in this field, the project particularly focuses on finding designs and implementation strategies that adapt and flexibly react to changes in the physical environment and the configuration or number of devices or users. The intended outcome is a set of systems that do not impose a few selected pre-defined device and display configurations, working styles, or workflows on the user(s). 
Instead, the community of devices' functionality and the style of interaction between users and the devices emerges flexibly from the current configuration and local interactions between neighboring agents (i.e., devices and users) instead of being based only on a priori hard-coded functionality, workflows, and rules of context-awareness. Knowledge of human-computer interaction principles and experience with designing and implementing graphical user interfaces with object-oriented programming languages. Experience with implementing user interfaces with C# and the Windows Presentation Foundation (WPF) and experience with the design of multi-touch interaction is a big plus. Willingness to look into literature about biological, physical, or technological self-organizing systems that expose emergent behavior (e.g., swarms or flocks of animals, cellular automata) and to explore what programming languages and algorithms could simulate and exploit such behavior for the project. The student's task will be to support the prototyping of a system based on several co-located tablet PCs or smart phones for a given task based on existing user interface APIs and frameworks. The system should serve as a proof-of-concept that demonstrates that the desired functionality and behavior for an entire community of devices can emerge from and adapt to local interactions and spatial configurations of devices. Therefore the student will also help researchers search the computer science and self-organization literature for modeling techniques, programming languages, or other approaches that could be helpful to achieve this. Mobile free-touch interaction Intensity-modulated radiation therapy (IMRT) is an advanced cancer treatment technology that uses beams of high energy x-rays to deliver radiation to a tumour. In IMRT, radiation beams are divided into many small beamlets. The intensities of each radiation beamlet are computed using specialized software. In this software, the treatment planning problem is modeled as a mathematical optimization problem, and solved using mathematical algorithms. Treatments need to account for potential uncertainties that may degrade treatment quality. For example, tumours in the lung move as the patient breathes, so when solving the mathematical optimization problem to design a radiation therapy treatment, such motion must be accounted for. My research group has designed novel robust optimization methods to optimize radiation therapy treatments subject to such uncertainties. Furthermore, we have developed adaptive methods that allow the treatment to be adapted to patient changes as the treatment progresses (treatments are normally spread over multiple weeks). This Globalink project will build on existing research that is being conducted in my research group on adaptive and robust radiation therapy. In particular, there are two related problems that will be explored in this project. First, the question is: how often should a treatment be adapted? Our initial research shows that treatments that adapt to uncertainty perform better than those that don't, but the question of how often to adapt is still open. Frequent adaptation likely leads to better clinical results, but ends up being quite costly for the hospital. To address the first question, we will start with an empirical analysis using historical patient data and exhaustive search to determine optimal treatment adaptation times (retrospectively), given a budget of one adaptation, two adaptations, etc. 
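To make the retrospective exhaustive-search idea above concrete, here is a minimal, hypothetical sketch: given a (made-up) retrospective quality score for re-optimizing just before each treatment fraction, the search simply enumerates the candidate adaptation times and picks the best. In the real project the scores would come from re-solving the treatment optimization on historical patient data in Matlab/CPLEX; the numbers and function name here are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)
n_fractions = 30

# Hypothetical retrospective quality score achieved if the plan is
# re-optimized just before fraction f (budget of a single adaptation)
scores = {f: rng.uniform(0.7, 1.0) for f in range(1, n_fractions + 1)}

def best_single_adaptation(scores):
    """Exhaustive search over candidate adaptation times: return the
    fraction at which one re-optimization gives the highest score."""
    return max(scores, key=scores.get)

f_star = best_single_adaptation(scores)
print(f"Re-optimize before fraction {f_star} (score {scores[f_star]:.3f})")
```

With a budget of two or more adaptations, the same idea extends to enumerating combinations of adaptation times (e.g. via itertools.combinations), which is what makes a guiding mathematical model attractive for prospective use.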
That is, if we are allowed to re-optimize the treatment once to adapt to observations of the uncertainty, when should we re-optimize? Guided by the empirical findings, we will develop a mathematical model that can be applied to future treatment cases to determine guidelines on treatment adaptation. The second problem will focus on extending our previously developed mathematical models for adapting a treatment to consider incorporating previous dose information in a novel way. We have developed preliminary adaptive optimization models that account for previous dose information (e.g., if certain parts of the tumour are underdosed, then the model will focus on increasing dose to those regions in subsequent treatment days), but they require more extensive testing. We have hypothesized additional enhancements to the model that can more accurately measure previous dose delivered and adjust future dose requirements. We will implement these new algorithmic ideas, incorporate them into our previously developed models and test whether dose results are improved with the new model. The student should have a basic knowledge of optics and electronics. Some familiarity with condensed matter physics is desired, but not necessary. Previous experience with Raman spectroscopy, nano lithography or scanning probe microscopes is also desired, but not required. The student will begin by producing their own nano materials via mechanical exfoliation (the same process used to make graphene). They will then assist in the day-to-day operation of the TERS instrument, and eventually perform their own measurements. Finally, the student will be responsible for developing new analysis software to handle the large datasets they produce. There may also be opportunities for the student to interact with our industrial partners on this project, to provide feedback on instrument operation. Timothy Chan Optimal adaptation of radiation therapy treatments The project is aimed at exploring various physical properties on the nanoscale using optical spectroscopy. In particular, a focus on Raman and optical spectroscopy of nano materials and bulk compounds on the nanoscale to better understand the interplay between numerous physical processes. This not only allows us to explore the basic physical underpinnings of novel effects but better characterize and optimize devices. Three different materials will be the focus of this study: high temperature superconductors, topological insulators and semiconductor nanowires. All three offer unique opportunities to study the emergence of new properties as a material is tuned on the nano scale, while also offering dramatic improvements in multifunctional materials, quantum computation, lossless energy transmission and thermoelectric power generation. Particular focus will be given to the Raman spectroscopic response of these materials. Raman is a powerful technique as it can measure the lattice, magnetic, thermal and electronic properties of a material simultaneously. Furthermore, every material has a unique Raman signature and thus Raman can be used to "fingerprint" an unknown compound or map out the composition over a large area. This is all achieved through the scattering of a focused laser beam, providing spatial resolution of 1 micron. Recently our group has begun to couple a Raman system with a scanning probe microscope, enabling us to measure the Raman response with 10 nm resolution. 
This unparalleled performance will provide key insights into the role of nanoscale inhomogeneity in novel properties, as well as fully characterize single nanomaterials (quantum dots, nanowires, exfoliated material). For example, this tip-enhanced Raman spectrometer (TERS) can simultaneously map the local temperature, strain, and chemical composition of a material or device on the nanoscale. These TERS spectra will be acquired while operating the device, to better understand the real limits of its performance as well as the origins of the novel behaviour that emerges. A reasonable training in the basic concepts of machine learning, including some understanding of the theory behind it. If the student is more interested in pursuing the theoretical side of the program, then I suggest reading my course notes on foundations of statistical learning: http://aix1.uottawa.ca/~vpest283/5313/downloads.html The successful candidate will be working together with other research students under my supervision. This will make his or her integration into the project seamless, as past experience shows. Weekly meetings with the supervisor and the team members to discuss the project are expected. At the same time, I expect quite a bit of independence and initiative on the student's part. Gary W. Slater. Translocation of Partially Melted DNA through a Nanopore. Enabled by advances in nanofabrication, it is now possible to construct nanofluidic devices with enough precision to isolate and analyze single biological molecules. An exciting example is nanopore technology, in which a nanoscale hole is drilled into a membrane. The width of the resulting pore can be on the order of Angstroms, and thus if a molecule such as DNA is threaded through the pore, it can pass from one side of the membrane to the other with the base pairs passing through the pore sequentially, a process that forms the basis of emerging sequencing technologies. This project will use computer simulations to model and investigate the dynamics of the translocation of a double-stranded DNA segment that is partially melted; that is, the two strands dissociate at various locations, forming "bubbles". How does the existence of these bubbles affect the time it takes to translocate? Is the probability of successfully crossing from one side to the other affected by the melted regions? How do these effects depend on the degree of melting? Considering the translocation of an "unknown" strand, can these changes to the dynamics be used to obtain information about the DNA segment? Modeling: A physics background is preferable to facilitate the modeling aspects of this project. The student will help to implement a simulation model of partially melted DNA, and thus familiarity with interaction potentials will be beneficial. Programming: While this is a simulation study, only a basic level of programming knowledge is required. The student must be familiar with the basic principles of programming (loops, matrices, file input/output, etc.) in any language. The student should be willing to expand these skills. Analysis: The student should be comfortable with developing small programs to analyze the output of the simulations. The student will use the Espresso simulation package to perform the simulations.
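Before turning to the full coarse-grained model, the translocation question can be illustrated with a deliberately simplified toy (plain Python, not the group's Espresso setup, and all rates are arbitrary): the translocation coordinate performs a biased one-dimensional random walk, and melted "bubble" regions are treated as bases with a reduced local hopping rate, so their effect on the mean translocation time can be compared directly.

# Toy model only: translocation coordinate n performs a biased random walk from
# 0 to N bases threaded; bases inside "bubbles" attempt hops at a reduced rate.
import random

def translocation_time(n_bases, bubbles, bias=0.55, slow_factor=0.5, rng=None):
    """Return the number of attempted steps needed to thread n_bases.
    bubbles: set of base indices that are melted; slow_factor scales their hop rate."""
    rng = rng or random.Random(0)
    n, steps = 0, 0
    while n < n_bases:
        steps += 1
        rate = slow_factor if n in bubbles else 1.0
        if rng.random() > rate:              # slowed attempt fails inside a bubble
            continue
        n += 1 if rng.random() < bias else -1
        n = max(n, 0)                        # the strand is not allowed to back out here
    return steps

if __name__ == "__main__":
    N = 200
    intact = set()
    melted = set(range(80, 120))             # one melted region of 40 bases
    for label, bubbles in (("intact", intact), ("partially melted", melted)):
        times = [translocation_time(N, bubbles, rng=random.Random(i)) for i in range(200)]
        print(f"{label}: mean translocation time = {sum(times) / len(times):.0f} steps")

Even this caricature shows how a melted region changes the passage-time statistics; the Espresso simulations replace the one-dimensional walk with an explicit polymer threading a pore.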
With the assistance of other group members, the student will: i) implement a model of partially melted DNA; ii) build the model system within the simulation environment; iii) run multiple simulations on large-scale supercomputer clusters; iv) analyze the simulation output to characterize the translocation process; and v) represent the data in a meaningful manner. Motion of a particle in a solution containing "thick" and "thin" regions. When a group of particles is released in a fluid, they will disperse throughout the solution via diffusion. If we consider a solution that is "thick" (like honey) in some regions and "thin" (like water) in others, where will the particles end up? Will they become trapped in the honey, concentrated in the water where they move freely, or evenly distributed throughout? It turns out the answer depends on the details of how the interface between the thick and thin regions is treated. In this project, simulation and numerical techniques will be employed to investigate the diffusion of particles in 2D systems containing regions of different viscosities. The relative concentrations and effective diffusion coefficients of particles in "patchy" (random viscous inclusions of various sizes) or "ordered" (e.g., stripes alternating thick and thin) arrangements will be studied. Results will be generated for several treatments of the interfaces. Considering application to the release of drugs within biological systems or the dispersion of particulate matter in food preparation and aging, which approach is most physical? What do these results tell us about the efficiency of delivering drugs or mixing substances? Can we design configurations to control the mixing or even separate particles based on their size? Modeling: A physics background is preferred such that the student is familiar with concepts such as diffusion and viscosity. Programming: The student should be comfortable with basic programming. Simple lattice Monte Carlo simulations of the system will be constructed. From these simulations, an exact numerical approach will be employed. This procedure involves the generation and manipulation of large matrices (it's not as bad as it sounds). Advanced programming is not required. Analysis: The student must be able to develop small programs to analyze the output of the simulations and numerical approaches. The student will develop new code for simulation and analysis. Constructing Monte Carlo simulations will be the first step. Then, instead of directly simulating the system to produce results, an exact methodology will be employed to obtain the exact, steady-state solution (the group has employed this technique many times and much assistance will be provided to the student). For both approaches, the student will write small programs for the analysis of the results and to produce graphs and figures that illustrate the data in a meaningful way. Computer Simulation of Bacteria Swimming in Various Fluidic Systems. Bacteria and cells have different swimming abilities and strategies. It has been shown that it is possible to design funnel-shaped fences in microfluidic systems that force swimming cells to concentrate on one side of the fence. This is not unlike how lobsters are captured! In this project, we will simulate the swimming of hundreds of interacting cells in the presence of various geometrical features in order to optimize their separation.
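Returning to the variable-viscosity diffusion project above, here is a minimal sketch of the lattice Monte Carlo step it describes, under one assumed interface rule (hop acceptance set by the viscosity of the departure site); the parameters are illustrative, and the project's whole point is that other interface rules give different steady states.

# Minimal lattice Monte Carlo sketch of diffusion in a 2D medium with "thick"
# and "thin" regions; only one of the possible interface treatments is shown.
import random

L = 20                                   # periodic L x L lattice
D_THIN, D_THICK = 1.0, 0.1               # relative hop rates ("thin" like water, "thick" like honey)

def is_thick(x, y):
    return x < L // 2                    # ordered (striped) arrangement: left half is thick

def hop_probability(x, y):
    # assumed interface rule: acceptance is set by the departure site's viscosity
    return D_THICK if is_thick(x, y) else D_THIN

def thick_fraction(n_particles=400, n_sweeps=5000, seed=0):
    rng = random.Random(seed)
    particles = [[rng.randrange(L), rng.randrange(L)] for _ in range(n_particles)]
    for _ in range(n_sweeps):
        for p in particles:
            if rng.random() < hop_probability(p[0], p[1]):
                dx, dy = rng.choice(((1, 0), (-1, 0), (0, 1), (0, -1)))
                p[0], p[1] = (p[0] + dx) % L, (p[1] + dy) % L
    return sum(is_thick(x, y) for x, y in particles) / n_particles

if __name__ == "__main__":
    # with this rule particles pile up in the slow ("thick") half; other rules do not
    print("steady-state fraction of particles in the thick region:", thick_fraction())

The exact steady-state approach mentioned above would instead assemble the corresponding hop-rate matrix and solve directly for its stationary distribution.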
Time permitting, we will also examine the role of chemotaxis (cells tend to swim towards food sources) and cell-cell interactions (cells tend to synchronize their swimming). Modeling: The student should be familiar with concepts such as diffusion and viscosity. Programming: The student should be comfortable with basic programming. With the assistance of other group members, the student will run multiple simulations on large-scale supercomputer clusters and analyze the simulation output. The student will be invited to be creative and to design new microfluidic devices that may improve the separation of swimming bacteria. The student will then test these new designs using computer simulations. Arnaud Weck. Ultrafast laser-matter interactions in metals and dielectrics. Ultrafast lasers, and especially femtosecond lasers, offer new possibilities in the field of micromachining, as they allow precise machining with almost no collateral damage around the machined features. Furthermore, the pulse duration (<500 fs) is smaller than the atomic vibration period of virtually all materials, which results in unique laser/matter interactions. The research project will look at better understanding the surface features obtained after femtosecond laser irradiation of metals and dielectrics. This includes ripple formation, microvoid formation, nano- and microstructure modification, etc. In particular, two-beam interference will be used to better control the morphologies obtained after irradiation. The student should have a strong knowledge of the operating principles of lasers (ultrafast lasers would be preferred). Excellent knowledge of optics is also required. The student needs to be comfortable with experimental work as well as analytical problem solving. The student will need to select the appropriate laser parameters (pulse length, pulse energy, angle between laser beams, etc.) to obtain a given set of surface morphologies. He/she will carry out the experiments with the laser and observe the surface morphologies in a scanning electron microscope. Analytical tools will be developed to predict the obtained features based on the initial set of parameters. Extracting the mechanical properties of materials at small scales using ultrafast lasers. In order to accurately predict the mechanical properties of materials, it is necessary to obtain the true stress-strain curve of the material until failure. If the material is heterogeneous, more parameters are required, including the true stress-strain curve of all phases in the material and the strength of the interface between the various phases. This information is difficult to obtain, though it is crucial for modelling efforts. In this project, local mechanical properties will be extracted from various materials using ultrafast laser machining combined with state-of-the-art characterization techniques. The student should have a strong knowledge of materials science and engineering, especially the mechanical properties of materials. Knowledge of the finite element method is required. Knowledge of laser machining would be an asset but is not required. The student should be comfortable with carrying out experiments and simulations. The student will design an operating procedure to extract local mechanical properties in a variety of materials, mainly metals. Materials will be tested in tension, in situ, and local mechanical properties will be obtained.
Finite element simulations will be carried out to predict the material properties and to validate the approach taken. Xuhua Xia, Biology. Comparative genomics to identify function associations. Many microbial genomes have been completely sequenced. The presence/absence data for thousands of genes across a diverse array of species allow one to quickly identify genes that are functionally associated. Because of shared ancestry among biological species, such functional association needs to be phylogenetically controlled. The computation requires constructing a reliable species tree, mapping the presence/absence data onto the tree, and testing two Markov chain models, one assuming no functional association (with four parameters) and the other assuming association (with eight parameters), by a likelihood ratio test. In the past, this has been done only for small data sets in a semi-automated way. This project will create a computer program to automate the entire process. Students in this project will learn the general conceptual framework as well as the relevant knowledge of stochastic processes and statistical estimation methods to facilitate their future participation as graduate students. Statistical estimation (maximum likelihood and least-squares methods), Markov chains, calculus, scientific methods/hypothesis testing. Test various functions as they are implemented in computer programs. Manipulation and organization of microbial and viral genomic data; optionally, participation in computer programming. Mathew Wells, Physical and Environmental Sciences, University of Toronto - Scarborough. Quantifying stratified turbulence in gravity currents. The student working on this project will determine how the internal mixing dynamics influence the degree to which gravity currents entrain fluid. Gravity currents are important flows for oceanography that form when dense water cascades down the continental slope. There are also many important engineering applications, such as dense waste-water disposal from a desalination plant, or the fate of underwater turbidity currents. The student will conduct laboratory experiments using particle image velocimetry, an acoustic Doppler velocimeter, and high-resolution density measurements to measure the small-scale density and velocity fields, to test our previous predictions on how the entrainment rate and the internal mixing dynamics depend upon the bulk entrainment rate of gravity currents. It is anticipated that within the 12-week program enough experimental data will be collected to result in a subsequent peer-reviewed journal publication. The student should have a quantitative background in either physics or engineering. The project will involve hands-on laboratory work as well as mathematical analysis of turbulence data. It would be helpful if the student had taken fluid dynamics courses, but this is not essential. The student will be primarily responsible for conducting a series of laboratory experiments and for analyzing the turbulence data, under my supervision. A research paper would be written on these results under my guidance. Kenneth Burch, University of Toronto - Toronto. Optical Spectroscopy of Nanomaterials. Traditionally, proofs of universal consistency of particular machine learning algorithms, including local learning algorithms such as the k-NN classifier, are given under the assumption of data inputs being independent, identically distributed random variables. This assumption is often too strong, for instance, when modelling learning and generalization of time series.
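For the comparative-genomics project above, once both Markov chain models have been fitted, the likelihood ratio test it describes reduces to a chi-squared comparison of nested models. A hypothetical sketch (the log-likelihood values are placeholders for what the fitted four- and eight-parameter models would return for a given gene pair):

# Hypothetical sketch: likelihood ratio test between the independent-evolution
# model (4 parameters) and the associated-evolution model (8 parameters).
from scipy.stats import chi2

def likelihood_ratio_test(loglik_null, loglik_alt, extra_params):
    """Return the LRT statistic and its p-value for nested models."""
    stat = 2.0 * (loglik_alt - loglik_null)
    return stat, chi2.sf(stat, df=extra_params)

if __name__ == "__main__":
    # placeholder values standing in for the maximized log-likelihoods of the two models
    ll_independent, ll_associated = -1234.5, -1226.1
    stat, p = likelihood_ratio_test(ll_independent, ll_associated, extra_params=8 - 4)
    print(f"LRT statistic = {stat:.2f}, p = {p:.4f}")
    if p < 0.05:
        print("Reject independence: this gene pair shows evidence of functional association.")

The automated pipeline would wrap this comparison around the tree construction and the fitting of both Markov chain models for every gene pair.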
A sequence of random variables X_1, X_2, ... is called exchangeable if the joint law of any finite subsequence is invariant under permutations of members of the subsequence. For instance, if we have two biased coins, A and B, with different probabilities of getting tails, then an exchangeable sequence of random variables is obtained by tossing a third coin, C, once to decide which of A or B will be used, and then tossing the chosen coin repeatedly. A famous theorem of de Finetti states that, in essence, every exchangeable sequence of random variables is obtained like this, as a mixture of a family of i.i.d. sequences. The notion of learnability has to be modified if the data inputs are assumed exchangeable, and in the earlier work of the project supervisor [V. Pestov, Predictive PAC learnability: a paradigm for learning from exchangeable input data. In: Proc. 2010 IEEE Int. Conference on Granular Computing (San Jose, CA, 14-16 Aug. 2010), pp. 387-391, Symposium on Foundations and Practice of Data Mining. doi: 10.1109/GrC.2010.102] it was shown that a "conditional", or "predictive", version of a probably approximately correct (PAC) learnable concept class behaves well for such input variables and satisfies a result analogous to its classical i.i.d. counterpart. The next interesting question is to reformulate the notion of universal consistency, and to give a proof that a "conditional", or "predictive", universal consistency of the k-NN classifier holds under exchangeable inputs. Moreover, as exchangeable inputs can be easily simulated, it would be very interesting to stage a large-scale experiment using the High Performance Computing Virtual Laboratory (HPCVL) in order to test the asymptotic performance of the learning algorithm under such inputs and compare it to the performance under traditional i.i.d. inputs. Thus, the research problem has both a theoretical and a practical dimension, which can be pursued in parallel or even independently of each other. This is a rich project, sufficient not only for a summer MITACS project, but for a good Ph.D. thesis as well! The potential candidate must have the following: excellent academic standing, good analytical problem-solving skills, knowledge of wireless sensor networks, programming background, and good communication skills. The student will be required to perform the following tasks: study and analyze related papers on this topic, design new algorithms for the research project with help from the supervisor, simulate the algorithms in a simulator, collect and analyze the data, and help in writing a technical paper documenting the results. Vladimir Pestov, Department of Mathematics and Statistics. Consistency of local classifiers under exchangeable data inputs. Wireless Sensor Networks (WSNs) contain a large number of tiny, low-cost sensors. Many sensor networks have mission-critical tasks that involve data collection in remote, inaccessible, or hostile environments, such as battlefields, deserts, mountains, etc. These sensors are normally monitored and managed by a trusted authority commonly known as a sink or collector. In certain special classes of WSNs, this sink may not be online all the time. It visits and collects information from the nodes at certain intervals. Such WSNs are known as unattended wireless sensor networks (UWSNs) [1]. Since the sink visits and collects information at intervals, every node has to secure the data until the next visit of the sink.
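The two-coin construction just described is easy to simulate, which is the starting point for the proposed HPCVL experiments. A minimal sketch (the coin biases are arbitrary) showing that such a sequence has identical marginals at every position yet positively correlated entries, so it is exchangeable but not i.i.d.:

# Minimal sketch: exchangeable-but-not-i.i.d. binary sequences from the two-coin
# construction (coin C is tossed once to pick coin A or B for the whole sequence).
import numpy as np

def exchangeable_sequences(n_sequences, length, p_a=0.2, p_b=0.8, p_c=0.5, seed=0):
    rng = np.random.default_rng(seed)
    chosen_p = np.where(rng.random(n_sequences) < p_c, p_a, p_b)   # coin C, tossed once per sequence
    return (rng.random((n_sequences, length)) < chosen_p[:, None]).astype(int)

if __name__ == "__main__":
    X = exchangeable_sequences(n_sequences=100_000, length=5)
    print("marginal mean at each position:", X.mean(axis=0))        # about 0.5 everywhere
    print("corr(X_1, X_2):", np.corrcoef(X[:, 0], X[:, 1])[0, 1])   # clearly positive, so not i.i.d.

Feeding such sequences, with labels attached by a fixed concept, into a k-NN classifier and comparing its error curve with the i.i.d. case is the kind of experiment the project proposes to scale up.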
Security needs should be taken into account to ensure data protection (also called data survivability) and authentication of sensors. Existing distributed security mechanisms for WSNs [2] are not suitable for UWSNs due to the infrequent visits of the sink. Cryptographic key management techniques provide data authenticity [3, 4] and integrity but do not ensure data survivability. Self-healing in UWSNs has also been widely studied [5]. In self-healing techniques, nodes can regenerate keys and continue functioning normally after being compromised. Most of the existing schemes assume that the sensors are static between successive visits from the sink. Efficient data survivability in mobile UWSNs has not been well studied. In a large field, it might not be efficient to visit all nodes. Secure data aggregation is an important problem in this regard. Hence, we will design efficient algorithms which will ensure secure data aggregation by special sensor nodes called aggregators. The sink can then visit only the aggregator nodes, instead of all nodes. In this project, we will study the following problems: 1. efficient data survivability in mobile UWSNs, and 2. efficient data aggregation with self-healing in UWSNs. We will not only design algorithms, but also simulate these algorithms using simulators like NS2. References: [1] R. D. Pietro, L. V. Mancini, C. Soriente, A. Spognardi, and G. Tsudik, "Data security in unattended wireless sensor networks", IEEE Transactions on Computers, vol. 58, pp. 1500-1511, 2009. [2] X. Chen, K. Makki, K. Yen, and N. Pissinou, "Sensor network security: a survey", IEEE Communications Surveys & Tutorials, vol. 11, no. 2, pp. 52-73, 2009. [3] R. D. Pietro, C. Soriente, A. Spognardi, and G. Tsudik, "Collaborative authentication in unattended WSNs", in ACM WISEC, pp. 237-244, 2009. [4] T. Dimitriou and A. Sabouri, "Pollination: A data authentication scheme for unattended wireless sensor networks", in IEEE 10th International Conference on Trust, Security and Privacy in Computing and Communications (TrustCom), pp. 409-416, Nov. 2011. [5] R. D. Pietro, D. Ma, C. Soriente, and G. Tsudik, "Posh: Proactive cooperative self-healing in unattended wireless sensor networks", in IEEE SRDS, pp. 185-194, 2008. The student must be a good programmer with a strong interest in the field of image and video processing. Programming will be done in C++. Knowledge of computer vision concepts would be an asset, as would familiarity with the OpenCV library. In collaboration with the postdoctoral student working on this project, the student will participate in the implementation of the intelligent video monitoring algorithm. He will also build a test database of events in order to test the validity of the algorithm. He will produce the required results for the production of a scientific paper. He will also participate in the formal testing phase of the approach on real-world data. This research project will be an opportunity for the student to work on a state-of-the-art intelligent surveillance algorithm with important applications for industry. Amiya Nayak, School of Electrical Engineering & Computer Science. Secure data aggregation in unattended wireless sensor networks. Many video surveillance solutions demand a description of events of interest from the user in order to begin the process of identifying the potentially anomalous ones.
This approach suffers from two intrinsic limitations: 1) it is difficult for a user to describe events of interest when they depend on a multitude of conditions, and 2) the system cannot learn from classification mistakes or changing user preferences over time. These limitations can be addressed by incorporating learning algorithms into the monitoring systems. Detections and classifications that exploit knowledge learned about the user's habits can significantly reduce false alarms and false rejects, improving the usability and reliability of the system. This research project aims at performing intelligent discrimination of visual events, learning from the habits of its users. The approach offers a strong complement to existing video monitoring tools by reducing the number of false positives reported to the user over time. The problem is twofold. First, the software system must learn which patterns of events and activities are within a normal range for the current user. Second, the software system must take into consideration the feedback from the user associated with any reported false alarm when assessing the risk level of future events. The proposed system will integrate sensor information provided by an object tracking algorithm and learn the normal range of values (for each scene) over an initial training period. This requires the segmentation of sensor data into recognizable event sequences associated with regular user activities, a problem that has received considerable attention in the literature. Once the system is trained, it is able to segment out sequences of sensor values that correspond to previously seen patterns, and classify them according to their similarity to the attributes of known activities. For example, a mail-carrier activity is described in terms of spatial and motion events along with date and time attributes. Later, the detection of a similar sequence of events but with different attributes is classified as anomalous. The student should have completed introductory science courses in Mathematics, Physics, and Chemistry, and several geology courses, such as Mineralogy, Optical Mineralogy, and Igneous Petrology. The student should be interested in Geochemistry. The student should be familiar with the use of a petrographic microscope for identification of minerals and observation of the optical properties of minerals. It is essential for the student to have basic oral and written communication skills in English, to receive instruction in the operation of instruments and to comprehend the safety regulations in analytical laboratories. I already collected mantle peridotite samples in the area two years ago with the help of the Vice-director of the Changbaishan volcano observatory. The Changbaishan volcano is one of the most picturesque volcanoes in the area and is still active, with the latest unrest in 2002-2006. There is abundant literature on the volcano in China. Before arrival in Ottawa, the student will review the nature of the volcanoes and the volcanic activity that brought up the xenoliths. The study in Ottawa has three components. 1. Petrographic examination: The student will describe the texture and mineralogy of hand specimens and polished thin sections using a petrographic microscope with dual light sources. The student will examine the sections using a scanning electron microscope to identify sulphide and oxide phases and document the microtextures in back-scattered electron images.
2. Mineral chemistry study: The student will select representative samples and grains for mineral chemistry analysis of olivine and spinel using an electron microprobe. 3. Calculation of oxidation conditions: The student will calculate fO2 values using the mineral chemistry data. Jacob Krich. Efficiency calculations for advanced solar cells. Intermediate band photovoltaics (IBPV) have the potential to radically improve the efficiency with which sunlight is converted to electricity. The idea of IBPV is to find (or create) a semiconductor with a band of levels contained entirely inside the band gap between the valence and conduction bands. These levels will allow the material to absorb lower-energy photons while still producing a large voltage, exceeding the efficiency of all simple semiconductor designs. The search for ideal materials to realize the intermediate-band concept is still in its infancy. This project will use analytical methods to explore the ideal parameters for an intermediate band material. It will begin by considering the so-called detailed-balance limit, in which all nonradiative processes are neglected, and find the ideal properties of IB materials as a function of solar concentration. It will then consider reasonably attainable levels of nonradiative processes and determine how stringently materials must be engineered to make viable IBPV devices. Interest and experience with theoretical methods of physics, in particular differential equations and linear algebra. A course in quantum mechanics and/or solid state physics would be helpful. The student will adapt the differential equations describing standard ideal solar cells to include an intermediate band. These will be further modified to include the effects of nonradiative processes. Using analytical and numerical techniques, the student will explore what efficiency a real device can obtain and compare it to the ideal limit. These results will inform the work of other theoretical studies and of experimental collaborators. Robert Laganiere, School of Electrical Engineering and Computer Science. Identification of Regularities in Time, Day and Space Events for Video Surveillance. Redox conditions of the mantle are important in controlling the fertility of magmas and the types of mineral deposits in the crust. For example, carbon occurs as diamond or graphite under reduced conditions, and as CO2 or carbonate under oxidizing conditions. The formation of large porphyry-type copper deposits, a major source of copper and gold in the world, requires an oxidized mantle where sulphur and metals can be released to magmas. It is generally considered that the mantle underlying old cratons is reduced, based on the abundance of diamondiferous kimberlites in ancient cratons, but what caused this reduction is not fully understood. Furthermore, cratons commonly experience later tectonic events, such as subduction along their margins and collisions with other continents. Again, it is poorly understood how these geological events affect the redox conditions of the mantle. The ultimate objective of the proposed project is to evaluate the changes in mantle redox conditions and the abundance of base and noble metals during these tectonic events. The findings have global implications for targeting favourable areas for mineral exploration. For this study, I selected the eastern Sino-Korean craton because of the abundance of mantle samples (xenoliths) in kimberlites and explosive volcanic rocks in a relatively small area.
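For the intermediate-band photovoltaics project above, the detailed-balance limit it refers to is usually set up as a photon-flux bookkeeping of the kind introduced by Luque and Martí. A sketch of those balance equations, under the standard assumptions of full concentration, purely radiative recombination, and non-overlapping absorption ranges (the notation below is assumed, not taken from the project description):

% Photon flux emitted or absorbed in the energy window [E_a, E_b] by a body at
% temperature T with chemical potential mu (generalized Planck law):
\dot N(E_a,E_b,T,\mu) = \frac{2\pi}{h^3 c^2}\int_{E_a}^{E_b}
    \frac{E^2\,\mathrm{d}E}{\exp\!\left[(E-\mu)/kT\right]-1}

% With sub-gaps E_{CI} < E_{IV} and E_{CI}+E_{IV}=E_G, the extracted current density is
J/q = \left[\dot N(E_G,\infty,T_S,0)-\dot N(E_G,\infty,T_C,\mu_{CV})\right]
    + \left[\dot N(E_{CI},E_{IV},T_S,0)-\dot N(E_{CI},E_{IV},T_C,\mu_{CI})\right]

% Steady state of the intermediate band: electrons pumped in from the valence band
% must equal electrons promoted out to the conduction band,
\dot N(E_{IV},E_G,T_S,0)-\dot N(E_{IV},E_G,T_C,\mu_{IV})
    = \dot N(E_{CI},E_{IV},T_S,0)-\dot N(E_{CI},E_{IV},T_C,\mu_{CI})

% and the output voltage is set by the split quasi-Fermi levels:
qV = \mu_{CV} = \mu_{CI}+\mu_{IV}

Maximizing the output power over the sub-gap energies and the voltage, and then relaxing the purely radiative assumption, is essentially the calculation the project describes.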
The xenoliths from these localities provide the data needed to evaluate these changes during specific geological events. Thus, compiling the data reveals the temporal change in response to geological activity in the crust. The Sino-Korean craton is one of the major cratons in the world, extending from the southern part of Mongolia to northeastern China and most of the Korean Peninsula. It was established as a stable continent by the late Archean, similar to the Canadian Shield. As with many ancient cratons, the mantle root underlying the Sino-Korean craton was old and cold, similar to the overlying crustal rocks, but it changed about 250 million years ago. The craton, especially its eastern part, became an active margin involving the subduction of oceanic plates (the Tethyan and old Pacific plates). Later it was invaded by young and hot mantle, leaving relict old roots. These activities are well constrained. Thus, the data can be interpreted in relation to geological events. For this proposed study supported by MITACS, I will select a suite of mantle xenoliths from the Changbaishan volcano on the border between China and North Korea, in the eastern part of the Sino-Korean continent (the summit in North Korea, the crater lake in China). The student will identify mantle peridotites among the xenoliths, examine the mineralogy using a petrographic microscope and a scanning electron microscope, determine the compositions of minerals using an electron microprobe, and calculate oxidation conditions using the mineral chemistry data. If time is available, the student will participate in the determination of metals (precious and base metals) and sulphur. The candidate is required to have some basic knowledge of optics and fiber optics, with an understanding of the physical phenomena related to light propagation, reflection, refraction, lenses, and related topics. Interest in testing and working in labs is essential. It will be an asset to have some hands-on experience using basic testing equipment, such as a voltmeter, ammeter, and spectrum analyzer. For the period of the scholarship, the student is expected to attend our research group meetings related to the radio-over-fiber topic on a weekly basis. According to the University of Ottawa safety regulations, the candidate will be required to attend and successfully complete two certificate workshops (laser safety and lab safety), each lasting 3 hours. The candidate will benefit from learning state-of-the-art experiments, in addition to the understanding, use, and manipulation of advanced instrumentation systems. Examples of such instruments are semiconductor laser sources, arbitrary waveform signal generators, optical spectrum analyzers, high-speed optical oscilloscopes, and optical coherent receivers, in addition to advanced high-precision nano-positioning alignment stages and a LUNA vector analyzer testing instrument. The candidate will be instructed to use specific equipment and to perform specific related testing that will be running within our research agenda at the time of the candidate's arrival. The candidate will be closely trained and supervised by a postdoctoral fellow leading the lab in our research group. Keiko Hattori, Earth Sciences. Redox evolution of the mantle underlying the eastern part of the Sino-Korean craton. Radio over Fibre (ROF) is emerging as a key solution for sustainable, ubiquitous broadband wireless access and for reduced energy consumption in cluttered environments.
This research falls within the targeted area of advanced communications and management. Its specific topic is the application of green radio-over-fiber (ROF). It includes testing of our designed digital radio-over-fiber communication systems, which are based on coherent detection. The next generation of wireless networks should be able to deliver high-speed data to mobile terminals in densely populated areas, including office buildings, stadiums, shopping malls, tunnels, subways, and airports. Wireless-over-fibre is a key enabling technology for massive wireless/mobile network deployment, given that it advantageously combines the strengths of both optical and radio-frequency technologies to provide low-cost and future-proof infrastructure for wired and wireless broadband networks. Whilst a wireless network connection frees the end-user from the constraints of fixed physical links, the optical network infrastructure provides a high-capacity, low-attenuation, interference-free, and transparent medium for distributing wireless signals to a large set of remote antenna units, while overcoming the bandwidth limitations and handover issues of standalone wireless systems. Optical telecommunication systems appear to be the only available technology able to cope with very high transmission rates (Tb/s) in core and long-haul networks, and they are becoming an alternative for access networks and radio-over-fibre systems. The key point is that information in optical systems has traditionally been transported using intensity modulation at the transmitter and direct detection at the receiver. At the present time, the capacity of optical systems is reaching the limits of the available bandwidth of the optical fiber. At the Center for Research in Photonics, the design, fabrication, and testing of fundamental functionalities within the scope of coherent optical telecommunications are of utmost importance. Some of the projects in progress related to this topic deal with the development of single-mode and dual-mode ultra-narrow-linewidth lasers fabricated with InP technology, the development of silicon-based components for implementing fundamental functionalities at the coherent detector front end, and the investigation of new approaches to be applied in coherent radio-over-fiber systems. Within the scope of the Mitacs program, an undergraduate student is expected to get directly involved in the implementation and data acquisition of related experimental investigations. The student will benefit from learning state-of-the-art experiments, in addition to the understanding, use, and manipulation of advanced instrumentation systems. Examples of such instruments are semiconductor laser sources, arbitrary waveform signal generators, optical spectrum analyzers, high-speed optical oscilloscopes, and optical coherent receivers, in addition to advanced high-precision nano-positioning alignment stages and a LUNA vector analyser testing instrument. Courses in physical chemistry and quantum chemistry are preferred but not essential. Interest in physical chemistry, computer work, and mathematics is important. The student will be involved in the synthesis or co-crystallization of novel compounds. The student will then assist with X-ray diffraction analyses of their samples. The student will run solid-state NMR experiments and analyze the results. Density functional theory calculations will also be carried out and analyzed by the student.
Summary of duties: reading relevant background materials on halogen bonding and NMR spectroscopy; synthesis and setting up crystallization experiments; analyzing X-ray crystallographic data; solid-state NMR training; solid-state NMR experiments; analysis of solid-state NMR data; participation in group meetings; summarizing results and writing a short report. Trevor Hall, Center for Research in Photonics / School of Electrical Engineering & Computer Science. Photonic nanostructures for radio-over-fiber coherent systems / testing. 1) Research question and objectives. Halogen bonding, RX-B, is the result of a non-covalent interaction between a halogen X and a negative site B (e.g., a Lewis base or π electrons). The halogen, X, is part of an RX molecule where R can be another halogen, or an organic or inorganic electron-donating group. Halogen bonds are of particular interest since they are widely used in crystal network engineering for the creation of new materials and are implicated in multiple biological processes such as molecular folding and recognition. The research question concerns the evaluation of the preference of a halogen bond electron density acceptor, such as p-diiodotetrafluorobenzene (1), for two possible donors: selenium vs. nitrogen functional groups. The objectives are to: (i) crystallize new complexes of 1 with selenourea, selenocystine, and other seleno-amino acids; (ii) study the structures of the complexes with various analytical methods to reach conclusions regarding the nature of halogen bonding in these systems. 2) Approaches and methods. The student will co-crystallize mixtures of 1 and various selenium-nitrogen compounds. Single crystals will be grown using slow-diffusion approaches. The student will submit single-crystal samples for assessment by X-ray crystallography. The student will then apply 77Se solid-state nuclear magnetic resonance (NMR) spectroscopy to compare the selenium spectral signatures of the starting materials with those of the halogen-bonded complexes. This will involve hands-on spectrometer time and spectral simulation. 3) Learning activities. The student will learn about the role of non-covalent interactions through initial readings. The student will gain hands-on experience in the crystallization of compounds and in growing single crystals. The student will gain a basic understanding of X-ray crystal structures. The student will gain hands-on experience in solid-state NMR spectroscopy and will develop analytical skills through the use of software to simulate the spectral data. The student should have a strong background in chemistry. A course in quantitative analysis is recommended, and familiarity with instrumental analysis is an asset. Some knowledge of environmental biology and ecology, or wastewater engineering, is helpful but not required. The student should be comfortable working outdoors and in a team. An international driver's license is NOT required, as other team members will drive field vehicles. Good communication (oral and written) and interpersonal skills are highly desirable, as are basic computer skills, e.g., familiarity with word processing and spreadsheet software. We provide on-site training. The student will have an opportunity to work in a multi-disciplinary team tackling the issue of characterizing and remediating traditional and emerging wastewater pollutants (i.e., pharmaceuticals and personal care products).
The student will play a crucial role in performing extensive field sampling of wastewaters and of surface waters (rivers, lakes) receiving wastewater inputs, both locally within Winnipeg and in the rural and lake areas of Manitoba (e.g., deployment and retrieval of passive samplers, in situ characterization of common water quality parameters such as pH, temperature, and chlorophyll-a content, and collection of water samples for ancillary water quality measurements such as ammonia and phosphorus concentrations). The student will also help process and analyze samples using appropriate wet chemistry and instrumental analysis techniques in our laboratory. He or she will be exposed to and gain familiarity with cutting-edge technology for such analysis, housed in our state-of-the-art Thomas Sill Analytical Laboratory for Water Research Technology (STALWART), opened last year. The student will work directly alongside research associates, postdoctoral fellows, graduate students, and undergraduate students in chemistry, biology, and environmental sciences as an integrated team working towards this project's goals. He/she will also learn to perform risk assessments to ascertain the toxicological significance of the data from this project. Thus, the Globalink student will gain a wide variety of highly valuable skills and expertise across several scientific disciplines that will be extremely useful in future career endeavors. Stephane Aris-Brosou, Biology / Mathematics and Statistics, University of Ottawa - Ottawa. GLOBAL DYNAMICS OF INFLUENZA SUBTYPES. Although influenza viruses kill an estimated 40,000 people every year in the US alone, the global pattern of circulation of these viruses is only beginning to emerge. However, it is still unclear whether this pattern is the same for different subtypes (e.g., H1N1, H3N2) or how this pattern changes in time. Here we propose to use the most extensive data set to date, containing more than 20 thousand sequences of the HA gene, to address these questions. A novel algorithm and statistical design will be developed to analyze this extremely large data set efficiently and accurately. This project is expected to improve our understanding of the co-evolution of the two main influenza subtypes that affect human populations, and to provide us with a means to predict their impact on public health through the discovery of their circulation pattern. The candidate is expected to have a good command of Perl, R, and unix. A good understanding of the concepts in molecular evolution is also required. Knowledge of basic virology is an asset. The student will assemble the genomic data from NCBI, participate in the design of the study, perform all the analyses on a computer cluster, and participate in interpreting the results. The draft of a manuscript will be expected by the end of the contract. INFERRING TRANSMISSION NETWORKS OF HIV IN CANADA. Over recent years, a number of public health policies have been implemented throughout Canada to monitor and mitigate the spread of the HIV epidemic. However, these policies have been implemented across provinces in a piecewise manner, which makes their success heterogeneous. This heterogeneity in success is expected to be reflected in the heterogeneity of the transmission networks of HIV among provinces. Here, to test this hypothesis, we propose to develop a novel method to reconstruct transmission networks of HIV throughout the different provinces of Canada (no methods currently exist to reconstruct these networks from sequence data).
The method will take inspiration from Approximate Bayesian Computation, developed in population genetics but never applied in the context of phylogenetic data. Model selection will also be implemented to compare transmission networks among provinces. This project is expected to improve our understanding of the public health policies that are effective in curbing the HIV epidemic. The candidate is expected to have a good command of R and unix. A good understanding of Bayesian computing is also required. Knowledge of basic virology is an asset. The student will assemble the genomic data from the Public Health Agency of Canada, participate in the design of the study, perform all the analyses on a computer cluster, and participate in interpreting the results. The draft of a manuscript will be expected by the end of the contract. Bryce. Halogen Bonding studied by Solid-State Nuclear Magnetic Resonance. We will determine use levels and patterns of selected pharmaceuticals in the Manitoba population and selected subpopulations, and investigate the magnitude of such use during episodic events (e.g., resorts occupied only part of the year, with likely different drug use patterns). Our hypothesis is that an appropriate analysis of surrogate materials, namely wastewaters, provides a realistic measure of the amount and types of many drugs used by the public and, through tracer analysis, of the extent to which these drugs may contaminate surface waters receiving wastewater discharges. Characterizing drug usage helps ascertain overall public health status, as it estimates how much medication is needed for ailments. However, typical means of assessing this information, such as sales and prescription records, have several major disadvantages, e.g., they do not indicate actual use. Questionnaires depend on accurate reporting, which may not be forthcoming for various reasons (e.g., privacy concerns). Because drugs are incompletely metabolized, they are excreted into sewage systems and thus go into wastewater treatment facilities. Indeed, some household chemicals, such as the artificial sweetener sucralose, are poorly metabolized and appear to be environmentally stable, so their presence in the environment can indicate contamination by wastewaters. Sewage measurement would provide an objective, anonymous (i.e., no personal information needed, therefore protecting privacy), and aggregate means to assess drug consumption, if residues found in wastewaters could be correlated to use. However, current measurement efforts can be quite inaccurate. For instance, drug use can vary by season or even time of day. Most monitoring studies simply grab water samples, and thus do not accurately capture common short-term concentration changes. We will estimate overall levels of Manitoba drug usage through analysis of wastewater influents in selected communities representing a cross-section of the province. Urban areas are represented by Winnipeg's treatment plants. Less populated rural communities are represented by the sewage lagoons of Morden and Winkler in southern Manitoba, the site of previous work by a MITACS student. Areas with significant seasonal variations in occupancy, and therefore likely different drug usage, such as recreational and resort regions, are represented by Grand Marais near the shores of Lake Winnipeg. Unlike previous efforts, we will take advantage of passive sampling technology to obtain continuous, time-weighted-average concentrations of drugs, using appropriate calibration rates we helped develop.
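For the HIV transmission network project above, the core of Approximate Bayesian Computation can be illustrated with a plain rejection sampler on a toy transmission model; the branching-process simulator, the prior, and the summary statistic below are hypothetical stand-ins for the phylogenetic machinery the project would actually use.

# Hypothetical ABC rejection sketch: infer a transmission parameter R from an
# observed outbreak size, using simulation in place of an explicit likelihood.
import numpy as np

def simulate_outbreak_size(R, rng, max_cases=500):
    """Toy branching process: each case infects Poisson(R) new cases."""
    cases, active = 1, 1
    while active > 0 and cases < max_cases:
        offspring = int(rng.poisson(R, size=active).sum())
        cases += offspring
        active = offspring
    return cases

def abc_rejection(observed_size, n_draws=5000, tolerance=5, seed=0):
    rng = np.random.default_rng(seed)
    accepted = []
    for _ in range(n_draws):
        R = rng.uniform(0.1, 2.0)                   # prior on the transmission parameter
        size = simulate_outbreak_size(R, rng)
        if abs(size - observed_size) <= tolerance:  # keep draws that reproduce the summary
            accepted.append(R)
    return np.array(accepted)

if __name__ == "__main__":
    posterior = abc_rejection(observed_size=40)
    if posterior.size:
        print(f"accepted {posterior.size} draws; approximate posterior mean R = {posterior.mean():.2f}")
    else:
        print("no draws accepted; widen the tolerance")

In the actual project, the simulator would generate sequence-informed transmission networks per province, and the summary statistics and distance would be chosen to discriminate among candidate network structures.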
The passive samplers will be deployed sequentially in wastewater effluents and influents to determine spatial and temporal trends in concentrations, measured by ultrahigh-performance liquid chromatography-tandem mass spectrometry. With treatment flow rates, we calculate drug inputs and outputs, which we believe from earlier work to be constant per capita given ubiquitous use. Sucralose sewage loadings will determine how much "dilution" of sewage is present in surface waters that contain this chemical, which is helpful for mapping the extent of wastewater impacts in Manitoba. This work will provide objective and overall data on drug use rates and patterns in the Manitoba population, which are currently lacking. The student must have a basic knowledge of mathematical statistics and a willingness to learn new statistical methods. The student should also be interested in learning computational methods. This project will provide an opportunity to learn concepts of Bayesian methods and their implementation. There are several ways to create training opportunities for undergraduate students through this project. These opportunities will integrate both theoretical and computational aspects. As an example, the student will learn basic item response models and then the Bayesian versions of these models and their implementation. The implementation steps will give them an opportunity to learn MCMC methods. The student can also get involved in conducting various simulation studies under different parameter settings to evaluate the performance of the proposed methodology. Bayesian Methods for Meta-Analysis with Applications to Multi-Arm Trials with Binary Outcomes. Recently, there has been growing interest in meta-analysis in many areas, including medicine, education, psychology, and the social sciences. In the literature, there are two main approaches used in meta-analysis: the fixed effect model and the random effects model. Under the fixed effect model, we assume that there is one true effect size which is shared by all the included studies. The combined effect is our estimate of this common effect size. If there is no statistical heterogeneity among studies, differences across studies may be due to random variation and the fixed effect model may be appropriate. In the random effects model, we assume that the true effect could vary from study to study. Muthukumarana and Tiwari (2012) developed a random-effects model using Dirichlet process priors to account for heterogeneity among studies. This project will focus on enhancing the methodology developed in this paper. More specifically, the methodology will be extended to a multivariate version of random effects meta-analysis with binary outcomes. This extension is important when several zeros are observed in some studies. This will require introducing zero-inflated binomial (ZIB) models for meta-analysis. The student should have knowledge of statistical inference and be willing to learn new methods and computation as well. The student will learn basic meta-analysis techniques and their implementation at the first stage. Next, he/she will be involved in implementing new developments, conducting simulation studies, and analyzing data arising from real-life problems, depending on their abilities. Derek Oliver, Electrical and Computer Engineering. Submicron resolution dielectric loss spectroscopy for a scanning probe microscope. The current focus of this research group is developing a scanning-probe-microscope-based approach that will enable us to perform dielectric loss spectroscopy (i.e.,
loss tangent measurements) on thin-film samples with submicron spatial resolution. Our goal is to develop this technique, based on a dynamic form of electrostatic force microscopy, into a useful addition to the suite of probing approaches available. Our efforts focus on composite polymer membranes that find use in a range of devices, including an artificial photosynthesis system, and on other conducting polymer membranes such as those used in fuel cells. Another key area of research is studying the potential for developing new materials for electrical insulation. In recent years, much interest has been generated by the use of nanoparticles/nanofillers distributed through insulating polymers as a technique for modifying the dielectric constant of the insulator. Our interest is to look at whether these approaches lead to the development of local weak points due to the abrupt variation of dielectric character at submicron length scales. As the spatial resolution required is not attainable via conventional measurement approaches, our new approach is well placed to assist our collaborators in these studies. Electrical engineering students with a reasonable grounding in electronics are required. It would help if the student has a solid background in physical electronics (i.e., the materials physics that underpins the electronic properties of materials used in devices). In this context, some physics/materials science/chemistry students may also be suited to the project. It should be emphasized that the Globalink Scholar will be involved in hands-on experimental work. This Globalink Scholar will work closely with a PhD student whose project is in this area. With assistance from this colleague, the Globalink Scholar will become familiar with the operation of the scanning probe microscope and with sample preparation for this instrument. Another key experimental role will be to utilize the impedance analyzer and ellipsometer in the lab to perform bulk measurements, which provide the benchmark/control for the more detailed data obtained using the scanning probe microscope. The Globalink Scholar will also participate in meetings and discussions with our collaborators at Manitoba Hydro. These discussions will be geared towards the development and design of new samples and avenues for experimental investigation and, potentially, the development of designs for conductor insulation that have a pre-defined gradation in dielectric constant. The Globalink Scholar will be expected to discuss and evaluate their data in regular meetings with their advisor and other members of the team. Charles Wong. Looking for drugs in all the right places: drug sewage "epidemiology". The modeling of item response data is governed by item response theory. In psychometrics, item response theory (IRT) is also known as latent trait theory. IRT is a paradigm for the design, analysis, and scoring of tests, questionnaires, and similar instruments measuring abilities, attitudes, or other variables. IRT is widely used in education, marketing research, and psychological studies. In the literature, there is a broad class of statistical models for analyzing item response data. This project will develop nonparametric Bayesian mixture models using the Dirichlet process for modeling item response data.
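For the multi-arm meta-analysis project above, one common way to write the zero-inflated binomial layer it calls for, with y_i events out of n_i patients in study arm i, is the following (the random-effects and Dirichlet-process pieces are sketched under assumed notation, not taken from Muthukumarana and Tiwari):

P(Y_i = 0) = \pi_i + (1-\pi_i)\,(1-p_i)^{n_i}
P(Y_i = y) = (1-\pi_i)\binom{n_i}{y} p_i^{\,y} (1-p_i)^{n_i - y}, \qquad y = 1,\dots,n_i
\operatorname{logit}(p_i) = \theta_i, \qquad \theta_i \sim G, \qquad G \sim \mathrm{DP}(\alpha, G_0)

Here \pi_i is the probability of a structural zero in arm i, and the Dirichlet process prior on G both accounts for heterogeneity among studies and induces a clustering of studies with similar effects.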
In standard IRT models, such as the Rasch model and the two- and three-parameter models, the ability level of the i-th individual and the difficulty of the j-th item play an important role in estimating the item characteristic curve (ICC), which can be used to predict the probability of a correct response. Persons with lower ability have less of a chance, while persons with high ability are very likely to answer correctly. Given the same ability level, an item is said to be easier if the probability of success is higher. In the ICCs of these models, an increase in ability level leads to the same increase in the probability of success, since the ICCs are parallel to each other. However, this property is violated when there is sizable heterogeneity among the ability parameters and difficulty parameters. This project will develop an approach to estimate non-parallel ICCs for heterogeneous data. In the Bayesian model development, the likelihood of the data is derived and priors are defined on the parameters. Next, the full conditionals of the parameters are derived. We then obtain posterior summary statistics which describe key features of the model. In particular, posterior expectations are approximated through Markov chain Monte Carlo (MCMC). We then compare the proposed model with the models available in the literature. The project will also develop a clustering mechanism which allows the user to identify similar/dissimilar items and subjects in item response data. In general, there are many choices for cluster analysis. Clustering methods can be either hard or soft. In hard clustering, a subject or item is assigned to just one cluster. In soft clustering, also known as fuzzy clustering, each subject or item can belong to multiple clusters with different strengths in a single iteration. The complete linkage method for hierarchical clustering can also be used. This approach defines the distance between two clusters to be the maximum distance between their individual components. At every stage of the clustering process, the two nearest clusters are merged into a new cluster. The process is repeated until the whole data set is agglomerated into one single cluster. Nonparametric Bayesian mixture models using the Dirichlet process also provide a natural clustering mechanism which can be used to cluster subjects and items. This project will investigate the performance of these clustering methods through simulation studies. While the very specific details of the Globalink 2013 project will be further determined by the supervisor and the student, we expect the student to complete a small concrete component (one that is suitable and manageable for an undergraduate student) during the internship. In general, the international student participating in this research project will be asked to implement a data mining component in C/C++ and/or a visualization component in C# or .NET. The Database and Data Mining Lab in the Department of Computer Science at the University of Manitoba is a multi-cultural environment. Current and previous lab members include students from Asia (e.g., China, Korea, India, Bangladesh, Malaysia) and Central and South America (e.g., Paraguay). The international undergraduate student participating in the MITACS Globalink 2013 project will work under the academic supervision of Dr. Carson Leung for an approximately 12-week research internship in the summer of 2013.
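For reference, the item characteristic curves discussed in the item response project above are usually written in the following standard three-parameter logistic form, with the Rasch model as the special case a_j = 1 and c_j = 0 (the notation is the conventional one, assumed here rather than taken from the project):

P(Y_{ij} = 1 \mid \theta_i) = c_j + (1 - c_j)\,\frac{1}{1 + \exp\!\left[-a_j(\theta_i - b_j)\right]}

where \theta_i is the ability of person i, and b_j, a_j, c_j are the difficulty, discrimination, and guessing parameters of item j. When all items share the same discrimination, the ICCs are parallel on the logit scale, which is exactly the property the project aims to relax for heterogeneous data.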
In addition, the student will also work closely with a senior lab member (who will serve as a mentor) and will contribute a small concrete component of the project suitable for the undergraduate level. Francis, Physics and Astronomy. Developing microfluidic systems for biological applications. The proposed project will develop new microfluidic systems for studying the migratory responses of different cell types such as immune cells, stem cells, and cancer cells. The new system will be able to produce highly controlled chemical and electrical environments in a high-throughput format, allowing quantitative and efficient cell-migration analysis. Previous experience in microfabrication, cell culture, and optical microscopy will be helpful but not required. A keen interest in interdisciplinary research and a strong commitment to academic excellence are essential. Diverse backgrounds are welcomed. The student will be trained in microfluidics, microfabrication, cell culture, cell imaging, and data analysis. He/she will initially assist a senior lab member in conducting relevant research projects and then proceed to develop his/her own project under the supervision of the faculty and other senior lab members. Saman Muthukumarana, Statistics. Non-Parametric Bayesian Methods for Item Response Data. Research within the Database and Data Mining Laboratory in the Department of Computer Science at the University of Manitoba mainly focuses on databases and data mining, which includes the efficient and effective management of, knowledge discovery from, and analysis of large amounts of data (such as transactional, uncertain, social media, Web, and/or streaming data). Current research programs focus on data mining, data warehousing and OLAP (on-line analytical processing), data visualization and visual analytics, as well as applications of database and data mining technologies to areas such as social computing and social network mining. Over the past few years, members of my lab (both undergraduate and graduate, international and local students, including USRA award recipients) have actively designed algorithms that efficiently and effectively find frequently occurring patterns (say, merchandise items frequently purchased together by customers) as well as detect exceptional or abnormal items (say, malfunctioning devices). The resulting algorithms have been applied to various real-life applications. The research project for Globalink 2013 falls within the scope of the aforementioned research programs. Specifically, we focus on social network mining in this project. Over the past few years, the rapid growth and exponential use of social digital media have led to an increase in the popularity of social networks and social computing. In general, social networks are structures made of social entities (e.g., individuals) that are linked by some specific type of interdependency relationship such as friendship, kinship, or partnership. The emergence of Web-based communities and hosted services such as social networking sites (e.g., Facebook, Google+, LinkedIn, Renren, Sina Weibo, Tencent Weibo, Twitter) has facilitated collaboration and knowledge sharing between users. Most users of social media have many linkages in terms of friends, connections, and/or followers. Among all these linkages, some are more important than others.
For instance, some friends of a user may be casual ones who acquaintances met him at some points in time, whereas some others may be friends that care about him in such a way that they frequently post on his wall, view his updated profile, send him messages, invite him for events, and/or follow his tweets. In this project, students will apply the knowledge and skills they acquired in their undergraduate database courses to build a database for capturing relevant data from a social network. They will then develop a business intelligence solution that applies data mining techniques to social networks so as to help users of the social digital media to discover implicit, previously unknown, and useful knowledge from the social networks. Basic laboratory skills including wet chemistry are mandatory. The student will be expected to have knowledge on use of pH meters, UV-VIS spectrophotometers and liquid chromatographs. The ability to work in a team environment, ensuring strict adherence to laboratory safety procedures so as not to endanger other members of the laboratory. Student is required to be an enthusiastic quick learner willing to learn how to operate advanced instrumentation that is not f< amiliar in undergraduate laboratories such as the quadrupole time of flight mass spectrometer. A good command of English is mandatory as the language of use during training.xThe student will undergo a laboratory safety orientation course prior to conducting any experiments towards the above research project. The student will initially work on the isolation of the aleurone layers and aleurone bodies from whole grains of barley, corn and wheat. The student will then view the isolated aleurone layers and aleurone bodies using bright field and fluorescence microscopy to verify the components and purity of the samples and to identify the different components in the cereal grains based on their excitation spectra and or emission wavelengths if they fluorescence or use fluorescence probes. Thus the student will learn to use bright field and fluorescence microscopes and fluorescence intensity profiles to confirm purity of the isolated aleurone. An experienced PhD candidate will provide the training needed for the student to be able to accomplish these initial tasks. The student will adopt these techniques to understand the anatomical structure in terms of the major botanical fractions that make up the grain. The student will put together a summary report on the differences in aleurone content and microstructure among the genotypes of barley, corn and wheat studied. The student will also be involved in the chemical analysis and identification of some of the aleurone components using liquid chromatographs (LC) coupled to uv-visible and photodiode array detectors and mass spectrometer (MS). An experienced technician from my department and the aforementioned PhD candidate will provide training to the student on LC-MS. Considering time limitations, priority will be given to analysis of phenolic acids and proteins of the aleurone. This breadth of experience gained during this period will allow the student to ponder seriously about graduate studies in the area of cereal chemistry and functional foods, specifically whole grains and their health-promoting properties.YingBiosystems Engineering#Design of a micro cone penetrometer#Design of a micro cone penetrometer Cone penetrometer is an instrument used to measure the soil resistance to penetration, expressed as force per unit cross-sectional area of the cone-base. 
This measurement can be used to evaluate soil density, compacted layers, trafficability, and foundation of buildings, bridges and roads. The cone penetrometer assembly contains the cone, friction sleeve, sensors and recording system. A micro-cone penetrometer is a lightweight, low powered device that consists of sampling probes driven into the ground by low reaction force means. Micro-penetrometer is one of the most widely used methods of estimating resistance to root growth in soil, and for detecting layers of different soil strength. In the project, a micro cone penetrometer will be designed. There are no standard procedures or patents found for micro cone penetrometer design. In this project, the micro cone penetrometer will be designed according to Chi and Tessier (1994). The micro cone penetrometer to be designed may consist of i) Probes ii) Sensor unit (load cell and potentiometer) iii) Hydraulic system and control unit iv) Data acquisition system References: Chi, L. and Tessier, S. 1994. A portable micro-peneterometer for measuring seed row compaction. Soil & tillage research. 34, 27-39. The project requires students from engineering programs, such as electrical engineering, agricultural engineering, biosystems engineering, civil engineering, or mechanical engineering. Students are required to have good communication skills in both oral and written English.Student will be working at the Soil Dynamic and Biomachinery Lab at the University of Manitoba. Student will perform the design outlined above with the help of graduate students in the lab and the department technician, if needed.5Wet process for extraction of high quality hemp fibreNatural fibres, such as hemp fibre, are renewable resources, and they are proving their uses in creating lightweight and durable materials. Demand for hemp fibre from composite and other fibre industries is increasing globally. Hemp fibre processing is the key issue to meet the demand in both quality and quantity. There was little success with the commonly used mechanical processing of hemp in meeting fibre quality and quantity requirement. The proposed project will develop a wet process (using water and chemicals) to extract hemp fibre to improve fibre purity. The objectives of the proposed project are to investigate: 1) a wet process to extract hemp fibre bundles (raw fibre or coarse fibre) and 2) properties of the end fibres for making bio-composites.No specific requirements.The student will be working at the Biomachinery Lab at the University of Manitoba. The student will set up tests using water or chemical solution to extract hemp fibre. The objective is to examine effectiveness of fibre extraction as affected by different parameters: retention time, water temperature, and material loading rate (dry mass of hemp stalk over mass of water). Trials with different parameters will be evaluated by yield, purity, fineness of the fibre, and total energy input. 'Performance of tillage or seeding toolsTillage and seeding operations consume high energy as the operations need to break the soil. In a field operation, significant amount of the tractor power is required for overcoming the draft force (the force requires to pull tillage or seeding tools). High draft forces of existing tillage tools (e.g. sweeps, chisels, and rolling tines) and seeding tools (hoes, discs, and press wheels) have limited the field capacities and increased the cost of field operations. Therefore, minimising draft requirement of those tools is very important. 
In this project, selected tillage tools or seeding tools will be tested in an indoor soil testing facility available at the Soil Dynamic and Biomachinery Lab, University of Manitoba. In the tests, draft forces and soil disturbance of the tools will be measured at different working depths and tool travel speeds. The proposed project will provide engineers useful information for developing new tools and improve existing tools.The project requires a student from agricultural programs, such as soil and plant sciences, or engineering programs such as agricultural engineering and biosystems engineering. Student will be working at the Soil Dynamic and Biomachinery Lab at the University of Manitoba. Student will need to perform laboratory tests using the indoor soil bin at the lab. CarsonLeung.Mining useful information from social networks The health benefits of consumption of whole grain cereals and its products are partly attributed to the antioxidant properties of phytochemical components such as phenolic acids and carotenoids. These phytochemicals and nutrients are unevenly distributed across the grain which is comprised of the bran outer layers (pericarp, testa and aleurone), germ and endosperm. The latter is the dominant botanical fraction in refined flours and their concomitant products. Phenolic acids are the most abundant class of phytochemicals in whole grains. However, the phenolics are concentrated in grain outer layers particularly in the aleurone. The aleurone is a living tissue that has been described either as the outer layer of the < endosperm or part of the bran. The number of cell layers forming the aleurone is different in various cereals. Wheat, oats, maize, rice, and sorghum have a single cell layer of the aleurone while barley has 2 to 3- cell layers of the aleurone. Microscopic examinations of grains have shown morphological differences, such as differences in thickness of cell wall and the morphology of aleurone cell walls and aleurone grains and its composition. The aleurone layer is generally 50 Dy thick and its contribution to whole grain in terms of size depends on the type of cereal. Wheat aleurone layer contributed 6-7% of the total bran content (10-15% of the grain). In our study, the size of the aleurone layer ranged from 34.1 to 50.9 Dy, 39.3 to 41.54 Dy and 48.8 to 96.5 Dy in corn, wheat and barley. The cell walls of wheat and barley aleurone layers comprise ferulic acid-carbohydrates complexes, proteins, (1-3),(1-4)-?-glucan and xylan. Using an electron microscope, Jones (1969) observed two electron dense areas and one electron transparent areas within the aleurone grain in barley aleurone. He identified the first electron dense area as a globoid (type I) and the second dense area, found in the periphery, as proteinaceous (type II) after staining with Aniline blue black. The electron transparent areas were described as an internal cavity. However the exact composition of these aleurone bodies or grains is unknown. It is inevitable to characterize and compare the phytochemical constituents of the aleurone layer so as to understand their contribution to the health benefits associated with whole grains. This scientific knowledge may lead the milling industry to re-examine the classification of the aleurone, not as bran but as part of the endosperm to be included in the refined flours. Whole grains will inlcude barley, corn and wheat genotypes. 
The aleurone layers will be isolated from each grain type using manual and mechanical separation followed by confirmation of their purity. The isolated aleurone layers and aleurone bodies will be used to determine chemical composition and concentration of phenolic acids, proteins, niacin, and phytin using liquid chromatographs coupled to uv-visible and photodiode array detectors and mass spectrometer, and tandem mass spectrometer (MS/MS) to identify and confirm the structures present.These definitions are all elementary so a student needs only a facility with abstract algebra to understand the proofs and the problems. For example, no homological algebra is required. But the student will have to be able to read notes and papers on the subject (the proofs tend to be elementary but intricate). So a course in abstract algebra is pretty much essential, and something on rings (for example modules over a PID) would be useful. I would expect the student to be able to read the relevant literature (including my notes) on the subject, and to attempt to extend existing proofs to the new situation he/she is looking at. Of course I will be there to help. If a result is achieved I expect the student to write it up in a proper format. (I have a lot of interest in the subject and so will be even more interested in what the student discovers.) I do not exclude the possibility that a student spends all his/her time on only one of the tasks in the description. TheresaBurg$University of Lethbridge ?Lethbridge Evolution of High Latitude Birds4Current research projects focus on resident and short distance migrants. Using molecular markers (e.g., DNA sequence data), we can look at historical biogeography and how glaciations have shaped current populations. We have a number of species that can be worked on to look at population genetic structure.fBiology background, some molecular training (PCR, sequencing) is ideal. Must pay attention to detail.QStudents can chose from several projects based on samples and primers in the lab.TRUST BETALFood Science and the Richardson Centre for Functional Foods & Nutraceuticals!University of Manitoba - WinnipegManitoba]Identification of Bioactive Chemical Components Isolated from Aleurone Layers of Whole GrainsFA ring R is called 'clean' if, for each a in R, a = e + u where e is an idempotent (e^2 = e) and u is a unit (uv = 1 = vu for some v in R). I introduced these rings in 1977 [1] and showed, among other things, that clean rings are all exchange rings--and so are important in functional analysis. A ring R is called 'unit-clean' if, for each a in R, ua is clean for some unit u in R. I already know that unit-clean rings are "unit-exchange", suitable defined. In 1977 [2], Han and I proved that the property of being clean passes to matrix rings, and I can prove that the same is true of unit-clean rings. Whether or not being clean is inherited by corners eRe (e an idempotent) was an open problem for years, but just recently J. Ster gave a negative example. However he left open the important question whether this is true for any full idempotent e (that is ReR = R). I want to look at these problems for unit clean rings, and investigate the 'unit' version of Ster's example. In 1999 [3], I introduced the 'strongly clean' rings, wherein each a = e + u where e is an idempotent, u is a unit, and eu = ue. These rings turn out to be a natural generalization of the 'strongly pi-regular rings' and so admit a general version of the classical Fitting's Lemma. 
Strongly clean rings have been intensely studied, particularly the problem of when a matrix ring over a strongly clean ring is again strongly clean. I want to define 'unit strongly clean' rings, prove an analogue of Fitting's lemma for these rings, and investigate when the property passes to matrix rings. The problem whether strongly clean rings are directly finite (uv = 1 implies < vu = 1) has been open since 1999, and I want to investigate this for unit-strongly clean rings. [1] W.K. Nicholson, Lifting idempotents and exchange rings, Trans. American M. S. 229 (1977), 269-278. [2] J. Han and W.K. Nicholson, Extensions of clean rings, Communications in Alg. 29 (2001), 2589-2595. [2] W.K. Nicholson, Strongly clean rings and Fitting's lemma, Communications in Alg. 27 (1999), 3583-3592. BGraduate students or undergraduate students with basic knowledge of statistical methods in medical research are qualified to this project. Specifically, the students should 1. be familiar with large database and how to extract data using Microsoft Access. 2. know how to clean up the data, e.g., identifying missing values, checking variable characteristics; 3. understand the linear and logistic regression model as well as their applications; 4. know the assumptions of the model; 5. know how to fit each model using STATA or R; 6. know how to interpret results of each model. In the first 3 weeks, the student will be introduced to this project and familiar with the CHIRPP database. The student will work with the Research Assistant on retrieving dataset from CHIRPP for this project. Specifically, records from children under age 20 reported to the emergency department at Alberta Childrenb Hospital (ACH) from 1999 to 2009 will be extracted. On weekly bases, I will meet with the student to follow the progress of this project and instruct the student to clean and summarize dataset. By the end of the first three weeks, the student should learn how to retrieve the dataset from CHIRPP and perform data cleaning and summary. In the following 6 weeks, the student will perform the analysis described in the Research Plan. On weekly bases, I will meet with the student to examine the analysis, interpret the results, and discuss the subsequent step. In the last 2 weeks of this project, results and conclusions will be summarized and presented to the research team at Sport Injury Prevention Research Centre (SIPRC) and manuscripts will be written for publication. Under my supervision, the student will learn to apply statistical methods, logistic regression and related techniques in particular, to construct a predictive model for each injury-producing sport in children under age 20. Furthermore, the student will learn to validate the model constructed. This is not typically taught in medical statistics courses. Participation in this project is a practical experience for the student to apply these techniques. W. Keith NicholsonMathematics & StatisticsUnit clean ringsd The objective of this project is to construct a predictive model of epidemiologic factors and seasonal effects on injuries related to sport for children under age 20. In Calgary, the injury records for children under age 20 are collected at Alberta Childrenb Hospital, the Calgary centre of the Canadian Hospitals Injury Reporting and Program (CHIRPP). To date, injury records from CHIRPP are available from 1999 to 2009. 
Two primary sports (activities) will be examined: bicycling and ice hockey, which are leading causes of sports and recreational injuries in children under age 20. Bicycling accounts for the largest proportion of injuries in children under age 20, followed by ice hockey. Three secondary sports (activities) will also be examined: playground equipment, soccer, and snowboarding, corresponding to the third to the fifth leading reasons for sport and recreational injuries in this population, respectively. The primary outcome of interest is whether or not the injury related to specific sport. The secondary outcome is whether or not the severe injury related to specific sport, where severity is approximated by admission to hospital. The proposed hypotheses are: 1. Are any of the following factors predictive to each injury-producing sport? Potential risk factors include: age, gender, organized or informal sports, context of the injury, breakdown events, safety device, severity of injury, body part injured, nature of the injury, seasons (fall, winter, spring, summer), and calendar year 1999-2009. Location and area of injury will be examined for bicycling, playground equipment, and soccer due to the nature of these activities. 2. Is there any effect modification by season to each predictor, each predictor with age and gender jointly, or season with age and gender jointly? If no effect modification is detected, is season a confounder to the predictors? 3. Is the model with all significant factors and effect modifications predictive on each injury-producing sport? The statistical approach to< answer above research questions is the logistic regression model. The research plan includes following steps. Firstly, injury records and potential predictors will be extracted from CHIRPP. Secondly, the retrieved dataset is split into two, one for model building and the other for model validation. Thirdly, logistic regression model is fitted to identify significant predictors and effect modifications for each sport using the model-building dataset. Significant predictors and effect modifications are then incorporated in a multivariate model for each sport. Goodness-of-fit of all logistic regression models will be assessed by the c-statistic and receiver operating characteristic (ROC) curve. Finally, the predictability of each multivariate model will be assessed by the validation dataset. Sensitivity and specificity of prediction will be derived with 95% confidence intervals.lA basic understanding of the principles of bacterial genetics, protein expression and purification, and the use of X-ray crystallography for protein structural determination are important. Any experience working in microbiology, biochemistry, or molecular biology laboratories would be a helpful asset. Direct experience with X-ray crystallography is not required.Under the guidance of a post-doctoral fellow, the student will be involved in the structural study of at least one protein involved in the synthesis of mycobactin. Experiments will include overexpression and purification of recombinant protein, identification and optimization of appropriate crystallization conditions, diffraction data collection at our local X-ray source and/or remotely using a synchrotron radiation source, and determination of the three dimensional protein structure from the collected diffraction data. 
Ideally one protein will be used for all of these steps; however, due to the nature of X-ray crystallographic studies it may not be possible to complete all of these experiments for a single protein in a three month timeframe. It should still be possible for the student to participate in all of these types of experiments by working with the post-doctoral fellow on multiple proteins from the mycobactin synthesis pathway simultaneously. RyozoNagamuneFImplementation of an Engine Control Unit for Automotive Fuel InjectionmThe objective of the research project is to implement an engine control unit (ECU) that we acquired recently to an automotive engine test bench for the fuel injection control purpose. The engine system to be used is a Sunbird engine (port-fuel-injection gasoline engine) that UBC Mechanical Engineering has. The ECU is a MotoHawk microcontroller which determines the amount of fuel to be injected based on sensor measurements. To measure the air-to-fuel ratio in real time, a wide-band oxygen sensor will be installed before the three-way catalytic converter in the exhaust pipe. Both a simple PID controller and an advanced robust controller will be tested, and the closed-loop system performances will be compared over various operating conditions (engine speed and air flow). These controllers will be realized in Matlab/Simulink, and transferred to the ECU microcontroller.The student who conducts this project is required to have taken at least one course in automatic control with high grade, to have strong skills in Matlab/Simulink, to have background in mechatronics (sensors, actuators, microcontrollers), and to be familiar with automotive engines. A good written and oral communication skill in English is necessary. She/He must be hard-working and responsible.The roles of the student in the research project are to understand the sensors and the actuators in the automotive engine, to design and simulate feedback controllers for fuel injection based on an oxygen sensor, and to implement the controllers using the MotoHawk ECU system. The student is required to report the project progress to the supervisor in weekly meetings, and to write a final report.Ollivier-Gooch>Improved Simulation of Shockwaves in Compressible Fluid Flows The reliable simulation of shockwaves is critical in the prediction and study of many physical phenomena, where abrupt changes in material properties due to shockwaves can greatly affect regions of interest and activate physical mechanisms. The predominant method for simulating flows with shockwaves, shock-capturing has been around for more than sixty years.and have been successfully applied to a wide range of problems. However, often, in compressible fluid flow simulations, large errors appear as a result of the presence of shockwaves. These errors do not disappear with grid refinement or higher-order schemes and cast doubt on this otherwise robust and reliable class of methods. A summer intern is sought to analyze and improve performance of a new class of schemes which do not suffer from these problems. Currently, this new class of schemes is not reliable and new analysis and numerical testing is required to better understand these new developments. The intern will use existing codes to test schemes on simple test problems before utilizing mathematical and numerical analysis. 
The student will gain an understanding of shockwaves and shock-capturing methods as well as learning basic analysis techniques and improved understanding of partial differential equations. Essential: Knowledge of calculus and differential equations. Matlab programming experience. Ability to work independently. Good critical thinking and communication skills. Very helpful: Knowledge of aerodynamics and/or fluid mechanics. Analysis and numerical testing of newly developed methods for shock problems. The intern will use existing codes to test schemes on simple test problems before utilizing mathematical and numerical analysis.SUJATHARAMDORAI MathematicsALGEBRA AND NUMBER THEORYNumerical computations provide insights into some of the arithmetic phenomena. Number theory abounds in problems where one needs extensive numberical evidence to provide clues towards unravelling the larger picture. The student would be introduced to some problems in these areas.Interest in Pure mathematics. Willingness to learn Abstract concepts in Algebra and Number theory. Knowledge of computer programming.The student will be introduced to computational aspects of number theory and arithmetic geometry and will be expected to work with software packages towards producing extensive numerical evidence for some dep problems arising in arithmetic geometry.UsmanAlimComputer ScienceUniversity of Calgary - Calgary4Divergence Preserving Approximation of Vector FieldsFluids are often simulated on regular grids. At each grid point, the fluid velocity is stored as a vector. In order to analyze the flow, most flow-visualization algorithms interpolate the vector field at non-grid points in a component-wise fashion. This straightforward approach is not conservative i.e., the interpolated vector-field is no longer divergence-free. The aim of this research project is two fold: 1) Quantify the error introduced due to this component-wise treatment. 2) Investigate efficient strategies that attempt to lower the error with little or no overhead. Such strategies include: using alternate sampling lattices, prefiltering the vector data so as to minimize the divergence, and using alternate flow-field representations that are inherently divergence-free. This research project required a sound background in mathematics and strong programming skills. The ideal student should be well-versed in: - Multivariate Calculus and Linear Algebra - Programming languages such as C/C++, Matlab and preferably knowledge of Unix shell scripting. - Familiarity with introductory computer graphics. Although not required, it would be helpful if the student is familiar with Fourier analysis.3The student will work on the first objective. The approach will be both qualitative and quantitative. This will involve conducting numerical experiments with synthetic and real flow data to ascertain the effect of different lattice types and interpolation methods on reconstruction error< and visual quality.JianKang KinesiologyePredictive models of epidemiological factors and seasonal effects on injury producing sports in youth}Despite the development of both a tuberculosis vaccine and antituberculosis drugs, Mycobacterium tuberculosis, remains one of the most important bacterial pathogens in the world and tuberculosis kills more people than any other infectious disease. India and China, two countries that participate in the MITACS Globalink program, have the highest number of cases in the world, though rates of infection are higher in many countries in Sub-Saharan Africa. 
The tuberculosis vaccine is not effective enough and tuberculosis treatment regimens are long and carry a significant risk of toxic side effects. Individuals with HIV are at increased risk of M. tuberculosis infections and the simultaneous treatment of HIV/AIDS and tuberculosis is even more difficult. There is great need for new antituberculosis drugs. In order to establish infections, bacterial pathogens require high-affinity iron uptake systems in order to acquire this essential nutrient from a human host, where iron is tightly sequestered by proteins and free iron is at a concentration that is many orders of magnitude below that required for sustaining bacterial life. M. tuberculosis is no exception to this and has at least two pathways to acquire iron from the host. One of these pathways is the synthesis of the mycobactin siderophores, a collection of small molecule with high affinity for iron that are either exposed at the surface of or secreted by M. tuberculosis. These molecules take iron from the human iron-binding and storage proteins transferrin, lactoferrin, and ferritin and transport it back into the cell where it can be used for bacterial metabolism. Pathways used by bacteria to acquire and store iron are potentially targets for the development of new antimicrobial agents, which could disrupt bacterial growth by disrupting bacterial access to iron. The Murphy Laboratory has recently begun a new project to study the mycobactin siderophores and there are two main goals of the project. First is to characterize the structure of proteins required for the synthesis of the mycobactins. Second is to identify new proteins required for the secretion and reuptake of the mycobactins. - web-based programming - database management systems - interest in Artificial Intelligence (Fuzzy Logic/ Artificial Neutral Networks) - user interface design - knowledge in Earth Sciences would be useful but not mandatorypThe student would be the primary researcher in this work under the supervision of John Meech. He or she would report and interact on a daily basis with Professor Meech with respect to the progress of the work. Interaction with other graduate students and professors in the department would also take place through seminars and shared laboratory and computer lab space.MichaelMurphyMicrobiology and Immunology>X-ray crystallographic study of a mycobactin synthesis protein@ This project involves the evolution of an existing computer software system designed to teach mineralogy and assist in the identification of a mineral specimen. The current system operates in a proprietary hypertext-based software environment that needs to be upgraded to run in a Windows-based browser such as Firefox or MS-Explorer. The system uses a fuzzy expert system to present information on over 250 minerals and to assist a student in learning how to identify an unknown specimen. It uses observations made by the user to increase or decrease the degree of belief in a particular mineral name that is initially chosen by the user. In this way the user acquires the ability to make observations about the properties of rocks and minerals such as colour, streak, luster, S.G., hardness, crystal structure and habit, cleavage, twinning, and numerous unique characteristics such as twinning and fluorescence; and apply these facts to identify minerals. The research involves evolving the existing rules and fuzzy logic structure into a form compatible with web-based programming. 
Some of the existing rules can be made to operate in a more efficient manner using an Agent-based approach to reuse rules and apply them across the different properties rather than being applied as separate entities. The system uses a number of Fuzzy Associative Memory maps to take input information and determine the degree of belief in an output. Input and output is typically of a linguistic nature except where measurements have been made such as S.G. and hardness. The method of Defuzzification uses a Weighted Average approach although some rules use Weighted Inferencing similar to an Artificial Neural Network. Examination of ways to streamline these two alternatives will occur during the research. The ability to easily add new minerals into the system will also be examined. The research plan for this work is as follows: Week 01 - Familiarization with the existing system Week 02 - Learning the AI elements in the system Week 03 - Transferring the hypertext documents into HTML/XML Week 04 - Continuing with hypertext documents and database Week 05 - Creating rules for on-line application/manipulation of data Week 06 - Continuing to create rules Week 07 - Continuing to create rules Week 08 - Testing the system Week 09 - Testing the system Week 10 - Developing the User Interface Week 11 - Final clean-up and write-up of a brief report Week 12 - Presentation of the new system to our department The ideal candidate for this project would possess knowledge in web-based programming, data management systems, and would wish to learn more about AI techniques such as fuzzy logic, expert systems, and artificial neural networks. An interest in minerals and earth sciences would be an asset, but expertise in this area is not required to do this research. BThe student is expected to have taken some courses in biophysics and thermodynamics, and possess some fundamental knowledge on molecular interactions. He/she is also expected to have had some training in numerical calculations and computation. Past experience with MD simulation is not necessarily required but is a plus. The student is expected to perform the following tasks: 1. Getting familiarized with the tools to be used to conduct the MD simulations. We will use software package NAMD for the MD simulation and VMD for visualization and data analysis. CHARMM, a well recognized force field for biological systems, will be applied to describe the atomic interactions of the PEI and the nucleic acid molecules. The computations will be done through parallel computing facilities on campus and on Compute Canada (a national-wide computational network). 2. Building molecular models for the cell membrane and for the PEI-nucleic acid aggregate. This mainly involves making use of existing models in the literature or developed previously in my group and performing modification and equilibration of these models. 3. Performing MD simulation on the interaction of the PEI-nucleic acid aggregate with the cell membrane. 4. Data analysis: this involves both structural analysis and energy analysis. MeechMining Engineering*University of British Columbia - Vancouver/A Fuzzy Expert System on Qualitative Mineralogyj Gene therapy is an arising therapeutic technique that involves delivering genetic materials into cells to treat diseases including cancers, hereditary diseases and viral infections. Gene delivery normally uses carrier molecules such as viruses, liposomes and cationic polymers to transfer nucleic acids into cells. 
An effective delivery carrier must be able to form stable aggregates with the nucleic acids at the beginning, carry them through the cell membrane, and release them at the end of the delivery process. Viruses are yet the most common and efficient delivery carriers, but they are greatly limited on their general use due to safety concerns. Synthetic polymers, as alternatives to viral carriers, can condense and form nanoparticles with nucleic acids to facilitate the delivery, with the advantages of less toxicity, lower cost, easiness < to produce and versatility for different applications. Polyethylenimine (PEI), as one of the most promising gene-delivery polymers, has been intensely studied experimentally and has been identified to bear the potential of becoming practical carriers in future clinical usage. Despite the great potential of PEI and PEI-based gene carriers, many issues related to their mechanisms of action remain to be clarified. The influence of PEI architecture, the role of lipids, the stability of PEI-nucleic acid aggregates and the interaction between the formed aggregates and cell membrane are fundamental issues that require clarification for functional delivery using PEI-based carriers. Ultimately, elucidation of these features will provide guidelines for designing optimal delivery systems for gene-based therapies. In the past few years, we have used molecular dynamics (MD) simulation as a tool to study the role of PEI as a gene delivery carrier. In particular, we have investigated the effect of PEIb molecular architecture and protonation state on the binding of PEI to DNA, calculated the free energy of binding between PEI and DNA, and simulated the aggregation of DNA molecules mediated by PEIs. Results from our work have led to the publication of several papers in high quality international journals such as the Biophysical Journal and Biomacromolecules. The proposed project aims at understanding a particular aspect of PEI-based carriers, namely the interaction of PEI-nucleic acid aggregates with cell membrane. The passage of PEI-nucleic acid aggregates through the cell membrane is a necessary step in the delivery process and we expect that using MD simulations, important information can be obtained at the molecular level on how the cellular uptake of the aggregates can be affected and improved. The student should have a strong interdisciplinary background in computer science and human-computer interaction. First, the student should have strong computer programming background and be able to program for the mobile web and mobile devices. This includes knowing programming languages such as Java, Cocoa, and JavaScript (and jQuery), and being able to create databases using mySQL. Second, the student should have a background in designing and conducting qualitative or quantitative studies with end users. This includes planning and conducting interviews, observing users performing tasks with systems, and analyzing studying results.wFirst, the student will conduct an initial abbreviated literature review of research in the areas of location-based games, augmented-reality games, and pervasive games with a particular emphasis on scalable game design and natural disaster games. This will provide a background of research in this area, an understanding of the important research questions, and exposure to the techniques others have used to develop scalable pervasive games. 
Second, the student will prototype a pervasive game based on design requirements derived form existing studies of pervasive games. The student will test play this game and refine it through its design and implementation using an iterative design process. Third, the student will evaluate the game by planning and conducting a study with game players that will focus on assessing which game elements help promote the scalability of the game.GaryWangEngineering ScienceTDevelopment of Large-Scale Optimization Tools for Flexible Assembly Process PlanningThis research project, currently funded by NSERC and General Motors (GM) Canada, aims at developing the state-of-the-art optimization tools for assembly process planning. An auto body consists of many flexible sheet metal parts. The assembly process great affects the dimensional quality of the final automobile. Such an effect could be modelled and analyzed using the Monte Carlo simulation with commercial software tools. The challenge lies in the lack of an effective optimization method to search for the best assembly process and its configurations. Carefully selected by GM, our group is specialized in sophisticated optimization methods that can solve so-called high-dimensional, expensive, black-box (HEB) problems. This research will adopt a number of approaches to tackle the assembly process planning problem and the student will participate in the research with a team of post-doctor fellows, Ph.D. students, and Master's students.- Computer programming skills and experiences - Java programming language - GUI development experiences - Software design and implementation - Mathematics and algorithms - Matlab IWhile participating in the research project with other teammates, the student will mostly be involved in programming and implementing optimization algorithms, graphic user interface, and performing other software development related tasks. The student will be able to work with GM engineers, understand their needs, learn about software and algorithm design, and implement advanced algorithms into tools for practitioners. The student will also be getting training with the state-of-the-art assembly variation analysis tool, 3DCS, and how to integrate optimization with the analysis.Tian TangMechanical Engineering University of Alberta - EdmontonAlbertafMolecular Dynamics Simulations on the Delivery of Polyethylenimine Aggregated Nucleic Acids into Cells~  Pervasive games are a new genre of computer game played on mobile devices in the everyday locations we inhabit. With them come a new way for people to socialize, interact, understand places and locations, and engage in various aspects of community. The challenge, however, is we still do not know how to best design pervasive games to fully engage game players such that they can find new ways of exploring and understanding aspects of their culture and geographical situation. One specific challenge<  that exists with many pervasive games is the creation and orchestration of game content and activities. By creation, we mean the construction and placement of game content. By orchestration, we are referring to the monitoring of players?activities in the game, and the quality and continued availability of game content, to ensure that play proceeds smoothly for players. Orchestration involves actively monitoring players to ensure they stay out of harmb way and continue to participate in the game. 
Here the orchestration needs are highly dynamic as the physical constraints of the game appear unbounded to the player, as do the rules (i.e., who to interact with). In general, game creation and orchestration in pervasive games is crucial to a gameb success. If it is done poorly, players may not enjoy the game or may be at risk; perhaps, even worse, non-players who do not realize they are part of a game as bystanders may be at risk. Pragmatically, challenges with creation and orchestration also mean that pervasive games are often dne offs?and only available in a single location or conducted over a short period of time, never to be run again. As a result, participation is limited or game players cannot continue to play like they might computer or online games. Our interest is in understanding how pervasive games can be designed to be scalable. By scalable, we are referring to a gameb ability to be: 1) played and orchestrated over long periods of time in a variety of locations, and 2) played by a large number of players (e.g., hundreds or thousands of people). This project explores how scalable pervasive games can be designed for players to learn about and share knowledge between family, friends, and community members related to impending and potential natural disasters. For example, it looks at how Vancouver residents can learn about emergency preparedness for a potential major earthquake and how they can share their knowledge with family and friends to help them prepare as well. We will design a prototype of a pervasive game that utilizes design elements that will potentially make the game scalable and then evaluate the game through actual play by end users. The outcome will be a broader understanding of what game elements promote scalability (e.g., narrative, game mechanics, group play) and how these aspects affect game play experiences in the setting of natural disaster preparedness.lThe student should have a strong interdisciplinary background in computer science and human-computer interaction. First, the student should have strong computer programming background and be able to program in both Java and .Net languages. Particularly important skills include the knowledge and application of computer networking principles through programming. Second, the student should have a background in designing and conducting qualitative or quantitative studies with end users. This includes planning and conducting interviews, observing users performing tasks with systems, and analyzing studying results.>The student will initially conduct an abbreviated literature review of research in the areas of video-mediated communication, media spaces, and remote video-based embodiments. This will provide a background of research in this area and an understanding of the important research questions, the ways in which video communication systems have been created in the past, and the open areas for research exploration in this topic area. Next, the student will prototype one or more video communication systems where s/he will follow a typical design process. This includes sketching potential design ideas, ideating within small research teams, and then rapidly prototyping potential solutions. The student will also conduct informal critiques and studies with others to explore potential options. 
Finally, the student will evaluate the final prototype of the system by planning and conducting a study that will answer several research questions aimed at understanding how well the system supports connecting family members over distance and creates a sense of presence and connectedness.9Scalable Pervasive Games for Natural Disaster Preparation Video chat is a technology that has rapidly proliferated in usage in the home for connecting family and friends over distance. The availability of inexpensive webcams and free software such as Skype has made this possible for everyday families. Most research to date has explored the use of small, portal-sized views (e.g., tablet displays, laptop screens) for video communication systems connecting family and friends over distance. Yet these views are often challenging to use in a real home context and often require people to come close to the display. This makes it difficult to use the video communication system while far away (e.g., across the room). The goal of this research project is to explore the possibility of using large displays for creating dife-sized?video connectio< ns between typical home environments. By life-sized, we are referring to video connections that are displayed on large surfaces or displays that exceed the size of typical computer monitors. For example, displays such as large televisions (> 50? or large projected displays that encompass an entire wall. We want to understand the challenges in designing such systems using low cost hardware, networking, and sensing capabilities (typical to most homes) and what value family members would find in using such life-sized video connections for sharing activities over distance and creating a sense of remote presence. In particular, we are interested in supporting activities that occur over a long period of time, as opposed to shorter conversations. Such activities may include group get-togethers (e.g., birthday parties), shared dinners, group games, or simply hanging out. As part of this project, we will design one or more versions of life-sized video communication prototype systems that utilize different types of large displays for viewing video connections. Through iterative design, we will prototype and explore different options and try them out autobiographically in a home environment. The focus will be on providing feelings of presence over distance and peripheral viewing of remote activities or shared activities. The project will then involve evaluating the above system(s) by conducting formal studies with families who will use the systems for one or more activities. Studies will focus directly on ascertaining user reactions to the systems and answers to our research questions. Overall, this research will provide a better understanding of how life-sized video communication systems should be best designed to support the everyday needs of families at home. It will allow us to understand in what situations life-sized displays best support the viewing and participation in family activities over video links such that future product design can focus on supporting these instances. It will also allow us to understand the benefits and shortfalls of life-sized video communication systems so we understand in which cases designs should continue to focus on smaller, portal-sized displays as a viable medium. The student should have either a chemistry or biochemistry degree, preferentially has research experience in surface chemistry or analytical chemistry laboratories. 
The student will be responsible for the entire experimental work under the supervision of the principle investigator (PI): (1) Surface preparation and characterization of plastic substrate materials (2) Surface immobilization of DNA oligos and optimize the reaction conditions (3) Develop signal reading methods; (a) reproduce the silver staining method and improve the sensitivity, and (b) develop new methods of reading by either enzymatic reactions or chemilumnescent labeling approach. At the end of the term, the student will be also asked to prepare a written report, and make an oral presentation to the research group of the PI. Carman NeustaedterInteractive Arts and Technology Simon Fraser University - SurreyCLarge Display Video Communication Systems for Domestic Environments The technology of microarrays revolutionized molecular biology by allowing high-throughput screening of biomolecular interactions and it makes the detection of specific biomedical analyte faster and more efficiently. Some of its important applications include gene profiling, clinic diagnostics, and drug discovery. However, due to the requirement for expensive equipment, the microarray technology is limited to a certain number of facilities, e.g., biomedical laboratories and hospitals. In order to make the technology more accessible to everybody, it is essential to develop a less expensive microarray platform and signal-read-out protocol. We have discovered that UV/ozone treatment of the polycarbonate (PC) plates produces a hydrophilic surface with a high density of reactive carboxylic acid groups. Covalent immobilization of DNA probes via both passive (reagent-less photo-patterning and coupling in bulk solution phase) and flow-through (creation of microarrays with microfluidic channel plates) procedures has been demonstrated. Subsequent hybridization shows uniform and strong fluorescent signals for complementary target DNA and allows clear discrimination between fully complementary targets and strands with a single base-pair mismatch. We have recently developed a protocol to cnhance?the assay signals by autometallography. In particular, biotin-labeled target DNA will be hybridized with the probe DNA immobilized on the PC substrates and subsequently treated with gold nanoparticle-streptavidin conjugates. Silver was then deposited on the gold deed?to increase the particle size from a few to several hundreds nanometers. The binding sites will be observed with a standard flatbed scanner (or a digital camera). We will test the enzyme-based colorimetric approach that has been popularly used in ELISA assays. First, an enzyme-linked antibody will be used label the target DNA strands. In the presence of enzyme substrate, the enzymatic reaction will then take place to change the color of the surface. Specifically, the commercially available horseradish peroxidase (HRP)-linked anti-biotin antibody can be utilized to reduce colorless TMB (3, 3? 5, 5?tetramethylbenzidine) to produce a deep blue product. Such a color change can be either monitored by a CMOS imager or even a standard flatbed scanner. The other alternative would be use of chemilumnescent molecules to label the binding site [2]; these molecules (e.g., luminal and acridinium ester) will emit l< ights upon reacting with H2O2 in the presence of an enzyme (e.g. HRP). This approach will be more appropriate if the imaging device is sensitive to low level of emitted photons. [1] Li, Y.; Wang, Z.; Ou, L. M. L.; Yu, H.-Z. 
(2007) DNA detection on plastic: Surface activation protocol to convert polycarbonate substrates to biochip platforms, Analytical Chemistry, 79: 426-433. [2] Woodhead, J. S.; Weeks, I. (1985) Chemiluminescence immunoassay, Pure & Appl. Chem. 57: 523-529. .Students with a wide range of skills can be considered for this project. A student with a focus on molecular genetics may be suitable for the analysis of gene expression by RT-qPCR, in situ RNA hybridization or enzymatic assays of recombinant proteins. On the other hand, students with a background in computing sciences may find bioinformatics analysis of the identified set of genes (currently > 33,000) or the statistical analysis of associations interesting applications, especially as specialists in these two fields face a very favorable labor market. A student with a focus on molecular genetics will work with a PhD student on the project on the characterization of identified candidate genes using a wide-range of recombinant DNA technologies. Most likely, the student will work in parallel with the PhD candidate using the same technologies, but have responsibility for the analysis of a limited set of genes. A student with a focus on computing sciences or population genetics will work with a postdoctoral scientist to extract DNA from individuals in a large test population with known tropolone contents, identify genetic variation in candidate genes in individuals in this population, primarily single nucleotide polymorphisms, and test the potential associations using the TASSELL software package. BingyunSun ChemistryXHigh throughput functional proteomics for surface proteins on mouse embryonic stem cellsProteomics is a high throughput analysis of the structure and function of proteins in biological systems. Our glycoprotein-targeted proteomic analysis employs liquid chromatography (LC) as a separation tool in line with mass spectrometry (MS) as an analyser to identify and quantify glycoproteins and their site of glycosylation. In a typical analysis, proteins will be extracted from biological samples, digested into peptides enzymatically, and glycosylated peptides will be chemically enriched. The N-linked glycopeptides will be selectively analyzed through LC-MS. Thousands of peptide MS spectra can be acquired in a short duration and processed by bioinformatics tools to infer protein structure and function. This project will focus on improving the sensitivity of glycoproteomics method that we have developed previously (Sun, et. al., MCP, 2007). Successful remedies will be deployed to study membrane proteins of mES cells. Our lab has extensive experiences with these cells, and we have both host and gene knockout mES cells in culture. The new knowledge gained from these newly developed sensitive analyses will be evaluated easily through comparison to our curated databases in identifying stem-cell specific protein markers. Connections drawn between surface molecular topology and intracellular protein networks could also provide us novel insights into stem cell biology and physiology. Exicuting such project will provide students hands on experiences on all aspects of proteomics technology, including protein chemistry, liquid chromatography separation techniques, tandem mass spectrometry, and data-analysis skills. Our lab also harbours various molecular and cellular techniques, such as mammalian cell culture and assays, immunoassays, fluorescence imaging, and single-cell assays, which can be acquired during training. 
Successfully trained students will have opportunity to continue the project as a graduate student.Students with good analytical skills, including hands on experience with HPLC and Mass Spectrometry, basic statistic and bioinformatic skills, and solid training in biochemistry such as protein extraction, protein purification, and chemical modification are highly recommended. Self-learning and team work are both encouraged in the lab. Highly motivated individuals with good ethics, integrity, and an eager to enrol in proteomics field will be further considered.The participants will have opportunity to work closely with a multidiscipline research team composed of undergraduate and graduate students from Departments of Chemistry, Molecular and Cellular Biology, and Health Science. All the necessary protocols encountered in this internship have been developed previously, the prospective students need to execute the experiments under guidance, analyze the results, and discuss with senior students or the principle investigator of the outcome and potential improvements. Through this process, students will be well trained on critical thinking and troubleshooting skills, and be pampered by team work spirit and good ethics. Hogan Yu#Department of Chemistry and 4D LabsDNA microarrays on plastics Based on lumber prices, Western redcedar (Thuja plicata) is at times the highest prized forest tree species in Canada and with a yearly export of more than $ 770 million in 2006. T. plicata lumber is regarded as durable and rot resistant and therefore highly priced for special outdoor applications such as decking, furniture, roof and wall shingles. The rot resistance of trees and the related durability of generated lumber are primarily attributed to high levels of tropolones, antimicrobial compounds that accumulate in the heartwood. Tropolones purified from heartwood are also used in the medical/natural medicine/cosmetic market primarily in Asia because of their strong antimicrobial properties. Although rot resistance is a key trademark for T. plicata lumber, there is considerable variation for this trait in second-growth trees, and a significant portion of cut trees are culled because of considerable internal rot. There is also a similar variation in lumber durability. The genetic stock used for T. plicata production in Canada is produced by the British Columbia Ministry of Forests and range, our major partner in this project. These genetic stocks are in turn used for seed production primarily by our industrial partner Western Forest Products Inc. Currently, generational breeding for heartwood rot resistance in T. plicata is not practical as tropolones accumulate in heartwood at quantifiable levels after > 15 years. The intention of this project is to develop genetic markers for early prediction of tropolone content with the aim of enabling breeding for rot resistance in Western redcedar. To this effect, we have mapped the secretion of oleoresin/tropolone to ray cells, more specifically to ray cells in the two most recent layers of heartwood. We are currently sequencing the RNA from these tissues using next generation sequencing technology. Candidate genes, in particular those encoding monoterpene synthases, cytochrome P450, reductases and dehydrogenases, will be assessed by in situ RNA hybridization for expression in ray cells followed by ability to carry out specific steps in the synthesis of the most abundant tropolone ?beta-thujaplicin. 
Finally, sets of candidate genes will be used for genetic association studies in which genetic variants in these genes will be assessed for the extent to which they correlate with high levels of tropolones in a large population of unrelated T. plicata trees. This approach, currently used extensively to identify genetic variation associated with disease in humans, is likely to provide us with variants/markers for prediction of high tropolone content, which can be used as markers to predict rot resistance in T. plicata breeding populations.

The student must have a solid academic record with a sound understanding of physics, mathematics and programming. Experience with Linux/Unix, the MATLAB environment or another programming language is beneficial. Experience with COMSOL Multiphysics or finite element analysis in optics or electromagnetics would be ideal. The ability to work independently, be analytical and creative, and to communicate clearly is required.

The student will extend the currently developed finite-element models of the optical behaviour of a single GNR to models of ensembles of GNRs, which take inter-GNR interactions into account. Since ensemble models will require vastly larger computational resources than the current models, the student will first set up the model on the SHARCNET high performance computing (HPC) system, which has thousands of computing cores and thousands of gigabytes of RAM, allowing for very large-scale computational models. While supervision of the student will be ongoing, regular weekly meetings with Dr. Kumaradas will be set up to discuss progress and address the difficulties they will inevitably encounter. The student will also work closely with several graduate students in Dr. Kumaradas' lab. The student will have the opportunity to perform independent research and gain experience with the development and use of large-scale simulations and the development of nanoscale materials in a research environment. The student will be required to participate in a weekly seminar series held among research groups. The student will be invited to present their results at this series towards the end of their project. It is anticipated that the results from the project will be submitted for presentation at a regional scientific conference. The student may be asked to prepare a short manuscript for submission to a peer-reviewed scientific journal.

Der Chyan Lin. Nonlinear dynamics and energy transfer in physical systems.

Energy preservation and its efficient usage have been critical to the long-term sustainability and, ultimately, the survival of our society. It is also an essential component in guaranteeing a smooth transition of the current growth-based economic system to a more Earth-friendly, environmentally conscious economic system. The proposed research project is derived from a research program aimed at developing a micro energy preservation system. The objective of the project is to achieve efficient energy transfer between subsystems. The success of such a transfer allows what would otherwise be considered waste energy to be properly captured and used at a later time. The novelty of the current approach is to use a newly proposed method that extracts general synchronization between subsystems to accomplish such a task. It is anticipated that this technology will apply in a wide variety of contexts, from biological to environmental. The student should be familiar with programming and have taken courses related to differential equations, general physics, and dynamics.
Students with experience in MATLAB and basic knowledge of building electronic circuitry are strongly encouraged to apply.

The student will work closely with the graduate students currently supervised by the applicant. The student will assist in numerical and hardware experiments: (a) implement new energy transfer models, (b) conduct dynamical-system time series analysis, (c) fine-tune the existing energy transfer hardware unit, and (d) record and analyze experimental data to assess the efficiency of energy transfer.

Filippo Salustri. Design by Analogy Catalog.

The hypothesis of this project is that creativity is largely an act of forming analogies. By studying and eventually codifying how humans find and use analogies, we should be able to improve designers' innovation skills. The first step in this process is to construct a catalog of known and presumed cases where analogical reasoning led to a design insight or an innovative (and successful) product.

Reasonable facility with English; ability to analyze the literature, under guidance; facility with web searching and tools such as Google Docs or MS Office; preferred background in engineering, industrial design, or architecture. Optional experience in biomimetics would be beneficial.

Under supervision, the student will search the literature and the web for instances of analogical reasoning leading to designed products. The student will construct diagrammatic representations of the analogies and prepare documentation summarizing those analogies. CmapTools (available free for academia) will be the diagramming tool used. The student will also analyze the catalog thus developed for trends and patterns within the analogies.

Design of a Design Observatory.

A "design observatory" is a facility in which participants execute design activities while being observed and recorded. The observations and recordings are then analyzed to study the cognitive and other acts that designers carry out during design. Ryerson has no such facility. The goal of the project is to design such a facility, including defining how much space will be needed, enumerating equipment and other resources necessary for such a facility, and estimating both initial and operational costs of the facility. Extensive research on other similar facilities elsewhere in the world will have to be conducted.

Reasonable facility in English; ability to search the literature for pertinent works; ability to analyze data for trends and patterns; familiarity with standard document preparation tools; some experience with preparing feasibility reports and budgets is desirable. Preferred background in engineering, industrial design, or architecture.

The student will conduct the entire project, as described above, under the applicant's supervision. The deliverable will be a report proposing a location, equipment, and initial and operating cost estimates of a design observatory to be located within the applicant's Department.

Morphological charts for concept ideation and generation.

Morphological charts are a well-known tool to assist in design concept generation; using morphological charts, it is often possible to generate thousands of design concepts. However, there are a number of variations of the tool which have never been integrated. There are also shortcomings in the method associated with the tool, in particular the "culling" of the solution space by identifying and discarding "poor" design concepts. If done manually, this culling process can be very tedious and time-consuming.
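As a purely illustrative aside (not the project's tool), the Python sketch below enumerates a made-up morphological chart and culls combinations that violate made-up pairwise compatibility rules, to give a flavour of the step that becomes tedious when done by hand.

```python
# Illustrative sketch only: enumerate a small, hypothetical morphological chart
# and cull combinations that violate simple pairwise compatibility rules.
from itertools import product

# Hypothetical chart: each design function maps to its candidate solution means.
chart = {
    "power":   ["battery", "mains", "solar"],
    "housing": ["plastic", "aluminum"],
    "control": ["knob", "touchscreen"],
}

# Hypothetical culling rules: pairs of means considered incompatible.
incompatible = {("solar", "touchscreen"), ("battery", "aluminum")}

def is_viable(concept):
    """A concept is viable if no pair of its chosen means is flagged incompatible."""
    means = list(concept.values())
    return not any(
        (a, b) in incompatible or (b, a) in incompatible
        for i, a in enumerate(means) for b in means[i + 1:]
    )

functions = list(chart)
all_concepts = [dict(zip(functions, combo)) for combo in product(*chart.values())]
viable = [c for c in all_concepts if is_viable(c)]
print(f"{len(viable)} of {len(all_concepts)} concepts survive culling")
```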
The goal of this project is to develop a systematic method to cull the design concepts generated by a morphological chart, and to develop a computer-based tool (possibly, but not necessarily, in MS Excel or Google Spreadsheets) to assist in the culling process.

Reasonable facility in English; advanced experience with MS Excel or Google Spreadsheets; ability to research the existing literature in areas relating to the project; web programming skills may be beneficial but are not necessary.

The student will learn about morphological charts and their associated methods. The student will then devise an algorithmic, systematic approach to culling the design space that results from the use of a morphological chart. The student will then implement that approach in a tool like MS Excel or Google Spreadsheets. A final report / user manual for the method is expected. All this work will be done under the supervision of the applicant.

Evaluating techniques to improve the efficiency of clothes dryers.

There is an "urban legend" that adding a dry terry-cloth towel to wet clothes in a clothes dryer will cause the wet clothes to dry faster, thus saving energy for virtually no expense. These kinds of interventions, involving minor changes to human behaviour rather than extensive technological changes, are a key means of achieving a sustainable society. However, there is no "proof" of the urban legend. The goal of the project is to conduct experiments (with existing equipment at Ryerson University) to determine whether the urban legend is true and, if it is, how much energy may be saved this way.

Reasonable facility in English; some experience with instrumentation for measuring temperature, humidity, and air flow; experience in heat & mass transfer or HVAC; general facility with spreadsheet systems like MS Excel or Google Spreadsheets.

The student will be instructed on how to conduct experiments using available equipment at Ryerson University. The student will then assist in designing the experiments to determine whether the "urban legend" noted above is true. The student will execute the experiments, compile and analyze the resulting data, and write a report on the subject.

Sharareh Taghipour. Risk-based maintenance of gas pipelines.

Poor safety and quality performance in the oil and gas infrastructure can be catastrophic. Asset integrity management ensures that oil and gas facilities perform their required functions efficiently and effectively while protecting life and the environment. Harsh operating conditions on Canada's east coast and uncertainty in the degradation process of assets are some of the challenges of integrity management. We will develop risk-based methodologies for prioritizing pipeline segments and will construct optimization models for their inspection and corrosion maintenance activities. The approach will be based on pre-posterior Bayesian and condition-monitoring maintenance. The novelty of this work consists of modeling the degradation process of pipelines and determining optimal inspection and maintenance plans under uncertain deterioration and safety constraints. We are looking for an industrial engineering student with an adequate background in statistics and mathematics. The student should be able to work with industrial data and interpret the statistical results. Over the 12-week period, the student will either be involved in the statistical analysis of industrial data or conduct a literature review of the research subject matter.
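As a purely illustrative aside on the inspection-planning theme above, the sketch below uses a hypothetical linear corrosion model and Monte Carlo sampling to estimate the chance that wall loss exceeds a critical depth before the next inspection; it is not the project's pre-posterior Bayesian methodology, and every number is a placeholder.

```python
# Illustrative sketch only: Monte Carlo estimate of the probability that pipeline
# wall loss exceeds a critical depth before the next inspection, for several
# candidate inspection intervals. The linear corrosion model and all numbers
# are hypothetical placeholders.
import random

WALL_THICKNESS_MM = 10.0
CRITICAL_FRACTION = 0.8          # deem failure when 80% of the wall is lost
N_TRIALS = 100_000

def prob_failure_before(interval_years, mean_rate=0.25, sd_rate=0.10):
    """Probability that corrosion depth exceeds the critical depth within one interval."""
    critical_depth = CRITICAL_FRACTION * WALL_THICKNESS_MM
    failures = 0
    for _ in range(N_TRIALS):
        rate = max(0.0, random.gauss(mean_rate, sd_rate))   # mm lost per year
        if rate * interval_years > critical_depth:
            failures += 1
    return failures / N_TRIALS

for interval in (5, 10, 20, 30):
    print(f"inspection every {interval:2d} years -> P(failure) ~ {prob_failure_before(interval):.4f}")
```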
The student will be asked to write a journal paper if significant results are obtained.

Reliability assessment of electric power systems.

The importance of power system reliability is recognized when the electricity supply is interrupted. A challenge in the reliability estimation of electrical systems is to consider the time-varying factors influencing the failure rate of the system. These factors include system characteristics, such as age; operating conditions, such as load level; and environmental and weather conditions, such as humidity and freezing rain. This novel work will develop a new approach for incorporating these factors in reliability estimations of large-scale power systems. We will study the effects of operational, environmental, and utilization variability on the reliability of power generating units under conditions of intermittent generation and varying load. A combination of graph theory, physical network modeling and generic programming will be used for reliability prediction of electric systems. The changes in reliability for different system configurations will be evaluated using sequential Monte Carlo simulation. We are looking for an electrical or industrial engineering student with an adequate background in statistics and mathematics. The student should be able to work with industrial data and interpret the statistical results.

Sustainable asset management.

Sustainability deals with managing natural resources to meet the needs of the present generation while permitting future generations to meet theirs. Life cycle costing (LCC) is a traditional technique for assessing the economic aspects of a system, including acquisition, operations, maintenance, and replacement costs. Life cycle assessment (LCA) is a method for measuring the environmental impacts of a system or product over its life cycle. In this project we will integrate LCA and LCC in order to determine the optimal decisions through the life cycle of an asset. We will propose new life cycle optimization models for sustainable management of energy-consuming assets such as boilers and pumps. A multi-objective, multi-constraint optimization method will be used to incorporate the environmental, economic, and social aspects of asset management. The life cycle of an asset will be optimized using dynamic programming to minimize costs, environmental impacts, and energy consumption, and to maximize asset performance and benefits. We are looking for an industrial engineering student with an adequate background in mathematics and statistics. The student should be able to work with industrial data and interpret the statistical results.

Optimization model for breast cancer screening.

Breast cancer is the most common cancer among Canadian women. One in 9 women is expected to develop breast cancer during her lifetime, and of these, one in 29 will die. Although the causes have not yet been completely identified, a woman can minimize the chances of developing breast cancer and possibly dying from it, as some factors are known to increase the risk of formation and development of a cancer cell. Age, race, environmental risk factors, and family history all influence breast cancer incidence, development and mortality rates. Mammography screening is performed periodically in the hope that, with early detection, breast cancer can be treated effectively.
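As a preview of the stochastic natural-history modelling described below, the toy sketch that follows simulates a made-up discrete-time Markov progression model under different screening intervals; the states, transition probabilities and screening sensitivity are all invented for illustration and carry no clinical meaning.

```python
# Illustrative sketch only: a toy discrete-time Markov model of disease progression
# used to compare screening intervals. States and probabilities are hypothetical.
import random

STATES = ["healthy", "preclinical", "clinical"]
# P[s][t]: yearly probability of moving from state s to state t (hypothetical).
P = {
    "healthy":     {"healthy": 0.98, "preclinical": 0.02, "clinical": 0.00},
    "preclinical": {"healthy": 0.00, "preclinical": 0.80, "clinical": 0.20},
    "clinical":    {"healthy": 0.00, "preclinical": 0.00, "clinical": 1.00},
}
SCREEN_SENSITIVITY = 0.85   # hypothetical chance a screen detects preclinical disease

def simulate(screen_every, years=30, n=50_000):
    """Fraction of simulated women whose disease is caught while still preclinical."""
    early = 0
    for _ in range(n):
        state = "healthy"
        for year in range(1, years + 1):
            state = random.choices(STATES, weights=[P[state][t] for t in STATES])[0]
            screened = year % screen_every == 0
            if state == "preclinical" and screened and random.random() < SCREEN_SENSITIVITY:
                early += 1
                break
            if state == "clinical":
                break
    return early / n

for interval in (1, 2, 3):
    print(f"screen every {interval} yr -> early detection fraction ~ {simulate(interval):.3f}")
```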
Because mammography screening brings forward the time of diagnosis, it potentially decreases mortality, yet too-frequent screening is both unnecessary and costly, and it also subjects patients to high amounts of radiation. In this project we model the natural history of breast cancer using an appropriate stochastic process. We incorporate age and other related risk factors in the model by considering multivariate transition probabilities of progression states and sojourn times. We use the model to determine the ages at which to start and terminate mammography screening, as well as optimal screening schedules for different age-based risk-group subpopulations. We are looking for an epidemiology or industrial engineering student with an adequate background in statistics and mathematics. The student should be able to work with clinical data and interpret the statistical results. Over the 12-week period, the student will either be involved in the statistical analysis of clinical data or conduct a literature review of the research subject matter. The student will be asked to write a journal paper if significant results are obtained.

Scott Tsai. Lab-on-a-Chip Particle Coating Technology for Biomedicine.

This research project will involve technique development and device engineering. Specifically, to fully establish our new microfluidic co-flow coating technique, we will develop a protocol to polymerize the coating layer, demonstrate proof-of-concept capture of the conformity of coatings on non-spherical particles, engineer biocompatible and multiple-layer coatings, and test the coating technique with living cells. The outputs of the proposed research project will be in the form of fundamental fluid mechanics (e.g. coating flows) and applications (e.g. the immunoisolation of cells for biomedicine). There will be many opportunities to learn about optical experimental techniques, microfabrication technology, and viscosity-dominated fluid mechanics. We welcome enthusiastic students who have diverse interests in engineering, chemistry, biology, physics, and applied mathematics to join the LFFI in the research and development of microfluidic coating technologies for biomedical applications.

Required background: registration in an undergraduate program in one of mechanical engineering, chemical engineering, biomedical engineering, materials engineering, chemistry, biology, physics, or applied mathematics. Required skills: ability to conduct experiments; able to work independently; working knowledge of one of Matlab, OriginPro, or Microsoft Excel.

Experimental work in the development of a protocol to polymerize the coating layer, demonstration of proof-of-concept capture of the conformity of coatings on non-spherical particles, engineering of biocompatible and multiple-layer coatings, and testing of the coating technique with living cells.

Mirza Faisal Beg, School of Engineering Science, Simon Fraser University - Burnaby, British Columbia. Segmented brain MRI atlas for smartphone/tablet computers.

A brain MRI atlas is a vital tool for learning in neuroanatomy. Such atlases in an integrated, handy format are not available to the user community. The overlay of segmented anatomical structures on the same subject's MRI image of the brain provides an integrated visual input that correlates anatomical location, structural morphology and spatial perception. This augments the conventional methods of learning with quick reference from real-world data in a handy format.
The enhanced learning experience with anatomical annotations and colored representations with multiple views is expected to provide a holistic perspective. The thrust on user experience and user accessibility is key to acceptance in clinical environments. This project plays a vital role in the dissemination of information to clinicians, patients and researchers alike.

Image processing fundamentals. C, C++ / Java proficiency. Sophomore-year mathematics. Software development experience is a plus. Mobile computing device application development experience is a plus. Computer graphics fundamentals.

The student will be responsible for creating a complete, functional atlas application to be used in the mobile computing device environment. This involves some crucial steps such as conversion of medical imaging data to a user-viewable format, compliance with the medical imaging standards of the regulatory bodies, design of workflows for individual data types, etc. Implementation and testing of the script/code in a simulation environment and testing on a live device will be part of the process. Learnings: experience of the software development cycle for medical image analysis; experience with designing user-accessible software for mobile computing devices; potential for submission of a manuscript for publication review.

Jim Mattsson, Biological Sciences. Enabling selection for heartwood rot resistance and durability of Western redcedar wood.

The overall research project aims to develop a method for the combined detection, treatment and treatment monitoring of tumours in a way that would revolutionize the detection and treatment of several types of cancer. There has been a lot of interest in utilizing the targeted delivery of gold nanoparticles (GNPs) to provide increased contrast between tumour and normal tissue for both thermal therapy (which refers to the use of heat to coagulate and destroy tumours) and photoacoustic imaging (the combined use of photons and sound for imaging). While individual GNPs exhibit interesting optical behaviour due to their size- and shape-dependent properties, assemblies of GNPs can produce more interesting optical behaviour caused by the effect of coupling on their plasmonic bands. The formation of gold nanorods (GNRs) into assemblies can be accomplished by coating different parts of their surface with polymers that are phobic or philic to different solvents. The ability to form self-assemblies of GNR chains as a function of temperature has recently been demonstrated. This resulted in the measured peak absorbance shifting from near 750 nm at room temperature to 850 nm when heated to 45 °C. It is expected that the coupling between GNPs formed in chain-like structures will affect the photoacoustic behaviour. Since the chain formation is temperature dependent, it is hypothesized that the photoacoustic signal will be temperature dependent as well. The project will focus on the development and utilization of computational models based on the finite element method, using the commercial software COMSOL Multiphysics. The chains will be modelled using the coupled photoacoustic model (being developed) to determine the role of temperature on the strength of the photoacoustic signal generated. This understanding will aid in the development of combined detection, treatment and treatment monitoring of cancer. The experience obtained in this research project is expected to benefit the student in many ways.
The student will obtain hands-on experience with HPC and complex multidisciplinary finite element analysis. The direct involvement with the medical physics group at Ryerson and with the collaborators in Lyon, France and at McMaster University will provide the student with enhanced opportunities to pursue future work terms or graduate studies. The student will also gain an understanding of the scientific process of inquiry, critical examination of results, and analytical interpretation of data.

The student needs to be interested in working outside (max. 8 hours/day) in a beautiful location under a variety of weather conditions, with low levels of biting/stinging insects, in unmown pastures and forested areas. Students must be familiar and comfortable with matrix algebra, and it would be helpful if students were familiar with Matlab (www.mathworks.com/products/matlab/). It would be helpful (but not absolutely necessary) for the student to know how to drive a vehicle.

In close collaboration with others (myself or a field assistant), the student will plant seeds of known genotypes in freshly tilled ground. On a weekly basis, the student will census each demography plot to determine if seeds have germinated, grown to become plants, flowered and/or died. At the end of the summer, the student will learn to estimate the seed production of each plant. When not in the field, the student will be using Matlab to estimate population growth rates from the previously collected data, or writing up the report (including introduction, methods, results and discussion) on this project. At the end of the summer, the student will give a presentation to our lab group about their findings. Timetable for proposed research: May 1 - May 15: set-up of experiment; May 15 - August 15: weekly census of each demography plot; May 15 - July 15: data analysis of previously collected data; July 15 - August 15: write up the introduction, methods, results and discussion of the paper; August 15 - September 1: estimate fecundity of demography plants; August 15: present results to the lab group.

Alexandre Douplik, Physics. Photosensitizer-Doped Dental Composite Resins.

The primary aim is to develop dental composites with controllable release of antimicrobial agents to provide antimicrobial therapy or maintenance without replacing the dental filling. Dental composites will be doped with photosensitizers in the form of nanoparticles. The doped photosensitizers will be activated by light to expose photosensitizer nanoparticles into the tooth cavity; the interaction of the photosensitizer with light produces oxygen radicals, which in turn will kill the pathogenic flora within the vicinity of the photosensitizer exuding from the dental filling. The photodynamic reaction involves three components - (1) photocatalysts (photosensitizers), (2) light and (3) molecular oxygen - to generate oxygen radicals such as singlet oxygen. The photodynamic reaction will be activated by non-UV light, as compared to the UV light used for photo-polymerisation of the composite resin, to avoid interference between the two processes. The dental composite will be developed to provide molecular oxygen and oxygen radical diffusion as well as light penetration through the material. The dental composite properties will be adjusted to resist photodynamic destruction inside the dental filling itself. The photosensitizer doping will be developed to avoid or minimise photodegradation due to sunlight exposure.
This biotechnology study will be translated to clinics and industry.

The student should be familiar with dental technologies, in particular with dental composites, photosensitizers, the principles of photodynamic therapy, and optical radiation propagation in matter.

The student will perform (1) preparation of nanoparticle forms of various photosensitizers (phthalocyanines and metal-based), (2) doping of the photosensitizers into the dental resins, and (3) spectroscopic study of the photodynamic reaction on the surface of the photosensitizer-doped dental composites in water solutions.

Seth Dworkin, Mechanical and Industrial Engineering. Computer Simulation of Microgravity Combustion and Soot Formation.

Soot, also known as black carbon particulate, is a major environmental nuisance and a health hazard. Smaller particles can be inhaled into the lungs and absorbed into the blood stream, leading to a variety of health disorders and, in some cases, death. The formation of soot within a flame is based on a complex set of physical and chemical processes that interact in a variety of ways. As such, it is difficult to make predictions of soot formation for different fuels or combustion devices without large, detailed computer simulation. One question that arises is: what is the dependence of soot formation on buoyancy? This question is relevant to space applications, where microgravity reduces or removes buoyancy effects, and where accidental fires which lead to soot formation can be detrimental in a limited-air breathing environment. The intern will perform detailed computer simulations of microgravity combustion and soot formation using our in-house codes. Predictions of soot formation by the computations will be compared to experimental data where available and to 1g simulations.

The student should be comfortable with computer programming in a base-level language such as C, C++, or Fortran (Fortran preferred). The student should be majoring in Mechanical Engineering and have taken courses in fluid dynamics and thermodynamics. If the student has also studied Numerical Analysis and Combustion, that is a bonus.

The student will perform detailed computer simulations of microgravity combustion and soot formation using our in-house codes. Predictions of soot formation by the computations will be compared to experimental data where available and to 1g simulations. The student will participate in weekly group meetings and discuss and interpret research results.

Determining Optimum Design Specifications for a Hybrid Heat Absorption Cooling System.

Heat-activated adsorption and absorption chillers offer an interesting potential alternative to traditional electrical compressor air conditioners. Rather than being powered by electricity, they are powered by heat that can be supplied either by natural gas burners, solar panels, or fuel cells. While they are more expensive to build than compressor-based systems, they are less expensive to operate on an annual basis because the cost per kWh of heat is often much less than that of electricity. Some important questions relating to the economic viability of these systems have arisen, such as: how many years of operation will it take for the higher initial investment to pay off? Is it best to combine these systems with a traditional system and try to benefit from somewhat reduced initial costs and somewhat reduced operating costs? If so, what is the optimal capacity of each system?
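As a purely illustrative sketch of the kind of economic comparison involved, the snippet below computes annual energy costs and a simple payback period for a heat-driven versus an electric system; all prices, efficiencies, capital costs and loads are hypothetical placeholders for the detailed calculations the student will perform.

```python
# Illustrative sketch only: simple payback comparison between a heat-driven chiller
# and a conventional electric compressor system. All numbers are hypothetical.
def annual_cost(cooling_kwh, cop, energy_price_per_kwh):
    """Annual energy cost given a cooling load, coefficient of performance, and energy price."""
    return cooling_kwh / cop * energy_price_per_kwh

cooling_load_kwh = 200_000                                  # hypothetical annual cooling demand
electric = {"capex": 20_000, "cop": 3.5, "price": 0.20}     # electricity, $/kWh
thermal  = {"capex": 45_000, "cop": 0.7, "price": 0.03}     # heat (e.g. gas or solar), $/kWh

cost_e = annual_cost(cooling_load_kwh, electric["cop"], electric["price"])
cost_t = annual_cost(cooling_load_kwh, thermal["cop"], thermal["price"])
savings = cost_e - cost_t

print(f"electric system:   ${cost_e:,.0f}/yr")
print(f"absorption system: ${cost_t:,.0f}/yr")
if savings > 0:
    payback = (thermal["capex"] - electric["capex"]) / savings
    print(f"simple payback on the extra capital: {payback:.1f} years")
```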
The student will perform detailed calculations to understand and begin to answer these questions.

The student should be comfortable with computer programming and numerical analysis and preferably have experience with Matlab programming. The student should be majoring in Mechanical Engineering and have taken courses in fluid dynamics and thermodynamics. If the student has also studied Numerical Analysis and Engineering Economics, that is a bonus.

The student will begin by adapting Matlab optimization algorithms that our group has previously developed for the problem at hand. Then the student will perform detailed calculations to determine optimum hybrid system specifications and how those may vary with regard to changes in building needs, system performance, and commodity pricing.

Tony Hernandez, Centre for the Study of Commercial Activity. Web-based GeoSpatial Analytics for the Consumer Service Sector.

The project will involve designing and building a prototype web-based data analytics portal to access geospatial data and analysis from the Centre for the Study of Commercial Activity (CSCA), Ryerson University. The site will need to provide a number of basic querying, reporting and analytical functions. The core functionality will link to existing web-based mapping capabilities, such as Google Maps, and thus provide a "geographic" window into the underlying data. A series of spatial querying functions will also need to be developed to provide flexibility in the way in which end-users can interact with and/or extract data from the site. A number of case-study geospatial data analytics applications will be built within the site in order to showcase the potential for web-enabled interaction with the CSCA geospatial data warehouse. These example applications will be developed based on input and feedback from the industry partners. The project will also look into the potential to develop parallel interfaces that are mobile-device enabled. The project deliverables will include a working prototype of the geospatial analytics web interface, a documented project design workflow and a series of recommendations for future development of the data analytics portal.

The primary skills required are web programming and web-based data analytics. Experience within an Oracle (or similar relational) database environment, along with associated business intelligence and analytics software, would be beneficial. Knowledge of web-based mapping, geographic information systems, spatial database structures and spatial functions is also preferred. Additional training on the business applications of geospatial analytics will be provided.

The student will be working with a group of full-time geospatial analysts and database programmers in a dedicated research lab environment. The primary task will be the development of a prototype data analytics site. This will require interaction with the team of CSCA analysts and programmers, under the direct supervision of the Director of the CSCA. It is envisaged that, following the initial design phase, the bulk of the time will be spent using web development tools to code the required basic data analytics functionality (query, retrieval, display, visualization, mapping, extraction). The student will also be expected to seek feedback from CSCA analysts and, with the support of the Director, from industry partners. The project will provide substantial scope to develop spatial analytical skills and to learn from the existing team of geospatial experts.
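To illustrate what one of the spatial querying functions mentioned above might look like, here is a minimal sketch of a bounding-box query endpoint; Flask and the in-memory sample data are assumptions chosen for illustration, and the real portal would query the CSCA data warehouse and link to web mapping services such as Google Maps.

```python
# Illustrative sketch only: a minimal bounding-box spatial query endpoint.
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical sample of retail locations (name, longitude, latitude).
STORES = [
    {"name": "Store A", "lon": -79.38, "lat": 43.65},
    {"name": "Store B", "lon": -79.40, "lat": 43.70},
    {"name": "Store C", "lon": -79.50, "lat": 43.60},
]

@app.route("/stores")
def stores_in_bbox():
    """Return stores inside the box given by ?min_lon=&min_lat=&max_lon=&max_lat=."""
    min_lon = request.args.get("min_lon", default=-180.0, type=float)
    min_lat = request.args.get("min_lat", default=-90.0, type=float)
    max_lon = request.args.get("max_lon", default=180.0, type=float)
    max_lat = request.args.get("max_lat", default=90.0, type=float)
    hits = [s for s in STORES
            if min_lon <= s["lon"] <= max_lon and min_lat <= s["lat"] <= max_lat]
    return jsonify({"count": len(hits), "stores": hits})

if __name__ == "__main__":
    app.run(debug=True)
```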
The student will be expected to provide "update" presentations and demonstrations to the team on a weekly basis. This will provide substantial opportunity for group interaction and collaborative learning, along with a supportive feedback loop.

Dae Kun Hwang. Optimization and enhancement of microfluidic-based synthesis of functional microparticles.

Microfluidic-based synthesis offers not only distinctive features in which particles are highly monodisperse, non-spherical, and multi-functional, but also precise control over their material properties (size, shape, chemical anisotropy, and functionality). For instance, microfluidic-based synthesis using stop-flow lithography (SFL) offers versatility for the direct fabrication of such microparticles with custom shape and multiple functions/compositions, and also allows for the encapsulation of cells and substances of interest. More importantly, the additional ability to tune the shape of highly homogeneous microhydrogels provides considerable advantages for tailoring drug-release kinetics for drug delivery applications and for assembling them to build complex cellular matrices based on a bottom-up approach for tissue engineering applications. Furthermore, this method allows us to pattern multiple degradable sites in a single hydrogel microparticle, which can be exploited to create ideal degradation and drug-release profiles. In spite of great successes in the microfluidic-based synthesis of functional microparticles, this area of research is still in the early stages of development and the full potential of these techniques has not yet been realized. Furthermore, the resulting functional microparticles have only seen limited use in their potential fields of application, including photonics, diagnostics, drug delivery, and tissue engineering, where highly controlled and enriched properties of individual particles are required. The objective of this project is to generate microparticles with desired features and properties, such as (i) hydrogel microparticles with controlled porous nanostructured patterns (micro-sponges) and (ii) multi-functional biodegradable hydrogel microparticles.

A student candidate should have the following qualifications: 1) third or fourth year in the Department of Chemical Engineering; 2) desired courses taken: Fluid Mechanics, Thermodynamics, Heat and Mass Transfer and Transport Phenomena.

The selected student will be involved in some of the research activities for the proposed project. He/she will be a part of the project team and will work directly with a postdoctoral fellow. The detailed research activities that the student will be involved in are as follows: 1) design of microfluidic channels for particle generation, including (a) the determination of relevant parameters such as channel geometry and channel dimensions and (b) fluid dynamics computation using Comsol Multiphysics; 2) fabrication of microfluidic channels using soft lithography; 3) quantitative characterization of the resulting particles, including the examination of network swelling, degradation, and mass-loss profiles of biodegradable hydrogels using an optical microscope in conjunction with a fluorescent probe. Through these hands-on experiences, the student will learn general technical skills in optical microscopy, microfluidic channel fabrication, computation, and instrumental analysis, which are relevant skills for many scientific disciplines. In addition, the student will have a regular weekly meeting with the supervisor, and will join in regular group meetings and seminars.
The student will also prepare a technical report/paper based on his/her research experience. This will offer the student a chance to develop effective communication skills.

Carl Kumaradas. The Development of Finite Element Analysis Based Models for the Use of Gold Nanoparticles in Cancer Detection and Treatment.

Colonizing populations face novel selective pressures, which may cause rapid shifts in life history. Such shifts may be pronounced when colonists are hybrids, which represent a suite of new phenotypes. This rapid evolution may be most obvious when species with divergent life histories hybridize. In this scenario, selection could increase invasiveness. Hybridization thus may play a major role in rapid weed evolution. In collaboration, the student and I will quantify the impact of hybridization and life history on the population dynamics of a problematic weed. The demographic consequences (e.g., increased weediness) of a shift in life-history traits due to hybridization may not be inferable from the effects of hybridization on seed production, which I have studied previously. Rather, a complete life-cycle analysis may be more appropriate to understand how hybridization and shifts in life history impact weediness. By including seed germination and seedling establishment, we will gain a more complete understanding of the fitness and demographic potential of Raphanus lineages that have experienced natural selection or artificial selection for a particular life history strategy (i.e., either early reproduction or large size at reproduction). We will grow plants with different genetic backgrounds in two common garden experiments and measure the demography and seed production of the plants. Raphanus seeds germinate in the spring and develop into rosettes within three weeks. Rosettes increase in size over the summer until bolting begins, and once flowering is initiated, vegetative growth ceases (leaves fall off). Flowering and seed production continue throughout the fall until a hard frost kills the plants, and the seeds over-winter. I have already collected one year's worth of data on this project, so over the course of the summer the student and I will analyse the previously collected data at the same time as collecting new data, allowing them to experience all portions of this type of project in an accelerated fashion. To do this, we will use a periodic matrix model for annuals and a Life Table Response Experiment approach to describe the demography of wild and hybrid radish harvested from the field plots and the artificial selection lineages in a common garden setting. Elasticity analyses will be used to determine which life history stages are the major determinants of population growth for each biotype and each experimental site. The broader impact of our research will be to improve our understanding of weed and life history evolution and to provide important insights for invasion and conservation biology. Timetable for proposed research: May 1 - May 15: set-up of experiment; May 15 - August 15: weekly census of each demography plot; August 15 - September 1: estimate fecundity of demography plants; May 15 - July 15: data analysis of previously collected data; July 15 - August 15: write up results; August 15: present results to the lab group.

The student needs to be interested in working outside (8 hours per day, maximum) under a variety of weather conditions (not above 37 °C), with low levels of biting/stinging insects, in unmown pastures and forested areas.
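For the matrix-model analysis described in the project above, the sketch below shows the core computation in NumPy: the population growth rate as the dominant eigenvalue of a stage-structured projection matrix, with elasticities derived from its eigenvectors. The 3x3 matrix and its vital rates are invented for illustration and are not project data.

```python
# Illustrative sketch only: population growth rate and elasticities from a
# stage-structured projection matrix (seed -> rosette -> flowering plant).
import numpy as np

A = np.array([
    [0.10, 0.00, 25.0],   # seeds: seed-bank survival and per-plant seed production
    [0.05, 0.20, 0.00],   # rosettes: germination/establishment and stasis
    [0.00, 0.30, 0.00],   # flowering plants: rosettes that bolt and flower
])

# Right eigenvector of the dominant eigenvalue: stable stage distribution.
eigvals, eigvecs = np.linalg.eig(A)
lead = np.argmax(eigvals.real)
lam = eigvals.real[lead]                      # asymptotic population growth rate
w = np.abs(eigvecs[:, lead].real)

# Left eigenvector (from the transpose): reproductive values.
eigvals_T, eigvecs_T = np.linalg.eig(A.T)
lead_T = np.argmax(eigvals_T.real)
v = np.abs(eigvecs_T[:, lead_T].real)

sensitivity = np.outer(v, w) / (v @ w)        # d(lambda) / d(a_ij)
elasticity = sensitivity * A / lam            # proportional contributions; sums to 1

print(f"lambda = {lam:.3f}")
print("elasticity matrix:\n", np.round(elasticity, 3))
```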
Students must be familiar with the taxonomy of major plant families and with identifying plants using taxonomic keys. It would be helpful if students were familiar with ecological sampling techniques. Students will also work with native bees and must be comfortable with the rare event of being stung. It would be helpful for the student to know how to drive a vehicle.

As part of a team, the student will map the geographic boundaries of all known populations of the study plants. The student will assist in assessing the population size of each known population. On a weekly basis, the student will monitor flowering of these populations through the season and erect pollen traps at various distances from the populations of interest. The student will also catch pollinators with nets and wash pollen off their bodies. Although this project is largely field-based, the lab group also runs weekly meetings to discuss current scientific literature, as well as projects in progress in the lab. The student will be expected to participate in these discussions by reading suggested publications, debating experimental approaches, and commenting on manuscripts in progress. By the end of the semester, the student will also have produced a written document, complete with background information, methods, and results. This, too, will be presented in our lab group meeting.

How can hybridization shift life history toward increased weediness?

Fundamental to conservation science is the ability to locate rare organisms and estimate their abundance, as well as to follow and predict invasions of weedy species. Yet quantitative data on geographic range, number of populations and population size often don't exist for many uncommon species, as this is a labour-intensive job. Further, species invasions are notoriously difficult to track at their leading edge because of their capacity for long-distance dispersal. We will be developing a field sampling method that exploits pollen dispersal combined with environmental barcoding of pollen to identify local species of known local geographic range, known number of populations and known population size. We will create a method that allows us to determine the geographic range, number and size of known plant populations. We will map plant populations on an 800-acre property (Koffler Scientific Reserve) and assess population size and number of populations for a series of study species. Several study species will be selected based on pollination syndrome (e.g., wind, bees, butterflies), life history (perennial, annual), habitat (prairie, forest) and conservation status (locally rare, locally invading). Flowering phenology of these populations will be tracked through the season. We will then erect pollen traps at various distances from these populations and sample pollen on a weekly basis. At the end of the field season, we will use genome barcoding methods to identify the species of origin of pollen collected on the pollen traps, as well as their relative abundance on the trap. Finally, we will create spatially explicit analytical models to "find" the populations censused throughout the summer to determine the utility of pollen traps as remote sensors of small populations of either rare or invading plants.
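One simple form such a spatially explicit model might take, shown purely as an illustration, is a distance-decay fit of trap counts against distance from a known source population; the exponential kernel and all data below are invented assumptions, not the project's chosen model.

```python
# Illustrative sketch only: fit an exponential distance-decay curve to hypothetical
# pollen-trap counts and use it to predict capture at an unsampled distance.
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical data: pollen grains of the focal species counted on traps
# placed at increasing distances (m) from a known source population.
distance = np.array([5, 10, 25, 50, 100, 200, 400], dtype=float)
counts = np.array([180, 120, 60, 25, 10, 4, 1], dtype=float)

def decay(d, n0, k):
    """Expected trap count at distance d, for source strength n0 and decay rate k."""
    return n0 * np.exp(-k * d)

(n0, k), _ = curve_fit(decay, distance, counts, p0=(200.0, 0.02), bounds=(0, np.inf))
print(f"estimated source strength ~ {n0:.0f} grains, decay rate k ~ {k:.3f} per m")
print(f"predicted count at 300 m ~ {decay(300.0, n0, k):.1f}")
```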
We expect that the development of this method and the use of cutting-edge technology will be of interest both to for-profit companies that perform environmental assessments in Canada and abroad, and to non-profit organizations with large land trusts and several sectors of government with mandates to manage large land areas.

We are looking for someone with a background in psychology or biology. Other than that, we want someone who is curious and willing to learn.

The main duties include locating pertinent studies, converting their statistical findings into a common metric, then amalgamating all findings into a single "effect size" that represents the overall strength of association. Much of this will be accomplished with a meta-analysis programme called Comprehensive Meta-Analysis. In addition, the RA would code theoretically important aspects of the studies to determine whether they moderate the influence of early parenting on HPA function. Thorough, ongoing training and interaction with the principal investigator, two graduate students, and at least one other undergraduate student would be involved. Reviewing the existing literature provides an overview of the substantive and methodological features of research. It does not provide a tangible sense of what is actually being studied. However, we have several ongoing studies in the lab which involve observing and coding mother-child interaction in the home and in the lab, observing child behavioural response to challenges presented in the home and in the lab, collecting saliva samples from child and mother, and storing, preparing, and assaying those samples for cortisol. We will invite the RA to participate in some of the home and lab visits (to serve as an assistant to the coordinator) and to observe all other procedures, so that they get an experiential sense of what is involved in designing the studies they are coding, running them, and collecting the data. We hope to impart a sense of the excitement involved in the full process of research.

Lesley Campbell, Chemistry and Biology. Using e-Barcoding to remotely locate rare and invading species.

The hypothalamic-pituitary-adrenal (HPA) axis, through its end product cortisol ("the stress hormone"), is linked to most diseases of humankind, physical (e.g., diabetes, obesity) and psychological (e.g., anxiety, depression). Early environmental factors set the HPA axis on a trajectory that affects its later function and influences development across the lifespan. Given its pervasive and enduring implications, the axis is a primary focus of developmental research. Nevertheless, our understanding of the system remains rudimentary. One potent influence on HPA development that has been widely studied involves parenting. Recent research, including our own, shows that even normal variation in parenting has large effects on HPA function. Although research links parenting and HPA function, basic questions remain. How strong is the association? What aspects of parenting affect HPA function? What aspects of HPA function does parenting affect (circadian rhythm, stress reactivity, cortisol trajectory)? What factors modify the influence of parenting on cortisol secretion? Until we answer such questions, we will not understand the basics of HPA function, nor can we intervene to prevent atypical HPA development and its adverse effects. We will address such issues via meta-analysis. Meta-analysis involves quantitative review of the literature.
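As a minimal sketch of the pooling step described here and elaborated below, the snippet computes an inverse-variance (fixed-effect) summary of made-up study correlations via Fisher's z transform; a real analysis would rely on a dedicated package such as Comprehensive Meta-Analysis and would also handle random-effects models and moderator coding.

```python
# Illustrative sketch only: fixed-effect, inverse-variance pooling of correlations.
import math

# Hypothetical studies: (correlation between a parenting measure and a cortisol outcome, sample size)
studies = [(0.25, 80), (0.10, 150), (0.32, 60), (0.18, 200)]

# Convert each r to Fisher's z; its sampling variance is 1/(n - 3), so weight = n - 3.
z_and_w = [(0.5 * math.log((1 + r) / (1 - r)), n - 3) for r, n in studies]

pooled_z = sum(z * w for z, w in z_and_w) / sum(w for _, w in z_and_w)
se = math.sqrt(1 / sum(w for _, w in z_and_w))
pooled_r = math.tanh(pooled_z)               # back-transform to the correlation metric

print(f"pooled effect size r ~ {pooled_r:.3f} "
      f"(95% CI in z units: {pooled_z - 1.96 * se:.3f} to {pooled_z + 1.96 * se:.3f})")
```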
It involves locating pertinent studies, converting their statistical findings into a common metric, then amalgamating all findings into a single "effect size" that represents the overall strength of association. In addition, theoretically important aspects of the studies are coded to determine whether they moderate the influence of early parenting on HPA function. For example, early parenting may have a stronger impact on HPA function than later parenting, so we would code for the age at which parenting was assessed. Females are considered less vulnerable to early stressors than males, so we would code for the percentage of females in a sample. Essentially, we propose a meta-analysis to determine the strength of association between parenting and cortisol, and the factors that moderate that association. These basics are necessary for understanding HPA function and intervening effectively. The HPA axis is itself an important target of investigation because it is linked not just to one or two adverse outcomes, but to almost all disease processes, physical and psychological.

Applicants for the project should have a background in distributed systems and have experience programming with Java in Windows and Linux environments. Experience with service-oriented architecture/Web services and cloud computing platforms such as Amazon EC2 is an advantage but is not required.

The student will be involved in the initial phases of the project. The project plans to use the Topology and Orchestration Specification for Cloud Applications (TOSCA), which is a platform-independent specification method from OASIS, to describe applications. The student will help with the design and development of software tools to automatically deploy an instance of an application specified in TOSCA onto EC2.

Leslie Atkinson, Psychology, Ryerson University - Toronto. Early Influences on the HPA axis: Parenting.

Cloud computing is growing in popularity as a paradigm for providing on-demand resources. Organizations are beginning to see clouds as an economical way of augmenting, or even replacing, their existing IT infrastructure. As a result, many organizations are considering moving their applications and data to a cloud environment in order to take advantage of its flexibility and potential cost savings. There are, however, numerous challenges associated with making this move. The first challenge is that the current cloud computing landscape is both complex and in a constant state of flux. For example, organizations have to decide on the most appropriate type of cloud, that is, private, public or hybrid, and in the case of public clouds they must also choose from a variety of cloud providers with different quality guarantees and pricing models. There are also numerous technical challenges associated with migrating applications and data to clouds, including deploying and provisioning an application in a cloud to meet required Quality-of-Service (QoS) levels; monitoring application performance and dynamically re-provisioning as demand fluctuates in order to maintain QoS commitments and minimize costs; and deploying and managing applications across multiple clouds, including both hybrid clouds and federated clouds. Cloud application management is therefore a complex process, and this complexity must be substantially reduced if organizations are going to be comfortable with moving to a cloud.
One approach to reducing the complexity of managing cloud applications is to incorporate autonomic computing techniques and principles so that management tasks are performed automatically according to high-level policies from the users. From an organization's point of view, the Quality-of-Service (QoS) requirements of an application and the costs of the cloud resources are key factors in cloud application management. Organizations will want to run their applications on a cloud provider that can satisfy the QoS requirements in the most cost-effective manner. We hypothesize that a QoS-aware application management service can use these QoS requirements to guide management decisions and actions. The overall objective of the proposed research project is to provide a comprehensive framework for QoS-aware management of cloud applications and an application management service that implements the framework. The framework will encompass a set of novel models and methods to meet the challenges outlined above. The application management service will take advantage of relevant open-source products and proposed industry standards wherever possible in order to allow rapid transfer to industry.

Completion of the first two years of an undergraduate computer science, computer engineering or related discipline. The student should be able to program in at least one of the major programming languages like C/C++/C#/Java in order to implement early research prototypes. The student should display an interest in and a willingness to learn about how research is performed and disseminated. Industrial experience related to software development would be considered an asset. Knowledge of scripting languages like Perl or Python and statistical packages like R would also be considered an asset.

Dr. Hassan, Dr. Shihab (Postdoctoral Fellow), and Mr. Syer (PhD Student) will work with the student at SAIL. The student will also have the opportunity to interact with the other researchers (16-20 at any given time) in the lab who have expertise in other areas of Software Engineering. The student will learn to perform a literature search by cataloging related research. They will attend research meetings and provide input, along with other SAIL members, in order to come up with possible solutions to measure the quality of mobile software systems. The student will formalize the theory as hypotheses and then collect and analyze data from existing mobile software systems in order to validate the various hypotheses. At the end of the project, the student, along with the other researchers, will consolidate the results in an academic paper that will be submitted for publication. The student will learn the various steps of conducting a research project in the software engineering domain: 1. literature search; 2. hypothesis building; 3. data collection; and 4. hypothesis validation. They will build and apply software tools for extracting and analyzing data from various mobile open source projects. They will learn how to mine software repositories, such as source code and bug repositories, and how to build statistical models that help us better understand the information retrieved from these repositories. The student will also learn how to prepare weekly reports and monthly presentations for the research group.
The skills learned are transferable and applicable to any software-related job the student may undertake in the future.

Software Quality Models.

Much software engineering research is focused on the creation of models to predict effort requirements and defect probabilities. Such models are important means for practitioners to judge their current project situation, optimize the allocation of their resources, and make informed future decisions. However, software engineering data contains a large amount of variability and has a high number of dimensions (many different metrics to measure the quality of software). Recent research demonstrates that such variability leads to poor fits of machine learning models to the underlying data. Recent results suggest splitting datasets into more fine-grained subsets with similar properties (i.e. local models). In this project we intend to study how to build such local models. One question we hope to answer is: which clustering algorithms would be the best to split the dataset into smaller groups, such that the software quality models are more accurate in capturing and describing the behaviour of software? We will also experiment with non-linear statistical models and determine whether they would be better at modelling software engineering data.

Dr. Hassan, Dr. Nagappan (Postdoctoral Fellow), and Mr. Bettenburg (PhD Student) will work with the student at SAIL. The student will also have the opportunity to interact with the other researchers (16-20 at any given time) in the lab who have expertise in other areas of Software Engineering. The student will learn to perform a literature search by cataloging related research. They will attend research meetings and provide input, along with other SAIL members, in order to come up with possible solutions to measure the quality of software systems. The student will formalize the theory as hypotheses and then collect and analyze data from existing software systems in order to validate the various hypotheses. At the end of the project, the student, along with the other researchers, will consolidate the results in an academic paper that will be submitted for publication. The student will learn the various steps of conducting a research project in the software engineering domain: 1. literature search; 2. hypothesis building; 3. data collection; and 4. hypothesis validation. They will build and apply software tools for extracting and analyzing data from various open source projects. They will learn how to mine software repositories, such as source code and bug repositories, and how to build statistical models that help us better understand the information retrieved from these repositories. The student will also learn how to prepare weekly reports and monthly presentations for the research group. The skills learned are transferable and applicable to any software-related job the student may undertake in the future.

Software Evolution Dashboards.

Contemporary software development projects consist of hundreds of developers working together on a very large scale source code base, making many rapid changes to the source code at the same time. Trying to understand and monitor which parts of the code are being worked on by which developers is a constant challenge for both managers and developers. Currently, the lack of knowledge on the allocation of resources to source code increases the likelihood of faults, known as bugs, and decreases the quality of the software product.
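Returning to the local-models question posed under Software Quality Models above, the sketch below is a purely illustrative example of the idea: cluster software modules on their metrics, then fit one simple model per cluster. The synthetic data and the choice of KMeans plus logistic regression are assumptions made for illustration, not the project's final design.

```python
# Illustrative sketch only: "local" defect models built by clustering modules on
# their metrics and fitting one simple model per cluster (synthetic data).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))                      # synthetic module metrics (e.g. size, churn)
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=300) > 0).astype(int)  # synthetic defect flag

# Split the dataset into fine-grained subsets with similar metric profiles...
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# ...then fit a separate (local) model within each subset.
for c in range(3):
    mask = clusters == c
    if len(np.unique(y[mask])) < 2:                # guard: a cluster may contain one class only
        print(f"cluster {c}: single class, skipped")
        continue
    model = LogisticRegression().fit(X[mask], y[mask])
    acc = model.score(X[mask], y[mask])
    print(f"cluster {c}: {mask.sum():3d} modules, in-sample accuracy {acc:.2f}")
```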
Recently, we at the Software Analysis and Intelligence Laboratory (SAIL, http://sail.cs.queensu.ca/) have proposed a solution to this problem that automatically monitors which parts of the source code are under revision as the source code evolves over time. Our solution automatically extracts statistics about different topics (e.g., database connection, menu click, save file, print file) in the source code, and determines how those topics are changing over time. Examples would be: the save file topic is under revision by [some employee], or the print file topic has remained dormant for [time period]. Our vision is to incorporate this knowledge into software evolution dashboards, which can be used by project stakeholders to monitor the development effort in ultra-large-scale software projects. Currently, we have developed our solution at a theoretical level and performed initial case studies and accuracy assessments. This work is published in international IEEE conferences [SCAM 2010, MSR 2011].

Basic familiarity with programming in the standard imperative programming languages like Java, C/C++ and C# is required. In addition, knowledge of data and file manipulation in the Unix/Linux environment, of scripting languages (Python, Perl), of web development (HTML, JavaScript, CSS, XML, XQuery, jQuery), of data analysis and of visualization would all be considered assets.

The student will assist Dr. Hassan, Dr. Thomas (Postdoctoral Fellow), and Mr. Chen (PhD Student) in implementing the software evolution dashboard. The student will also have the opportunity to interact with the other researchers (16-20 at any given time) in the lab who have expertise in other areas of Software Engineering. Given the theory developed in our prior work and the extracted evolution data, the student will research and design a graphical user interface (GUI) that allows project stakeholders to effectively interact with the extracted data. The student will also have the opportunity to learn the various steps of conducting a research project in the software engineering domain: 1. literature search; 2. hypothesis building; 3. data collection; and 4. hypothesis validation. Additionally, the student will assess third-party software libraries for actively displaying data, like the Google Web Toolkit or the Jitsu web application toolkit. Using one of these libraries, the student will develop an interactive application to display the evolution data to a user, allowing the user to click on certain parts of the data to view more detail. Interface design and source code implementation will be key aspects of this project. The student will also learn how to prepare weekly reports and monthly presentations for the research group. The skills learned are transferable and applicable to any software-related job the student may undertake in the future.

Patrick Martin. QoS-Aware Management of Cloud Applications.

Did you ever wonder how hard it is to develop an iPhone or Android app? Did you ever wonder who works on these apps? How hard is it compared to developing the Firefox web browser? At the Software Analysis and Intelligence Lab (SAIL, http://sail.cs.queensu.ca/) we are trying to answer these questions. We are part of a team of researchers worldwide looking at this new type of software development, called mobile software development. Today, software engineering research focuses on traditional software systems like the Firefox web browser or Microsoft Windows, which take years to develop and teams of designers, developers and debuggers.
Software engineering is rapidly changing though. Emerging domains, such as mobile devices, are growing rapidly and depend heavily on new software operating systems like Android and the applications that they run, commonly referred to as apps. The mobile device industry, including smartphones and tablets, is one of the strongest emerging industries today, with an estimated worth of over 30 billion dollars and strong market growth trends. Software engineers for mobile devices are faced with new and different challenges than developers who develop traditional software systems. For example, since power consumption and memory usage are scarce resources on mobile devices, large calculations are often performed on external ULSS - like the cloud computing infrastructure that Amazon, Google and Microsoft can provide. Thus software development for mobile devices has its own set of interesting challenges. The changes to the development process required by mobile software systems have not yet been defined. In particular, the quality of the software developed for the mobile industry is of paramount importance to the millions of users who depend on it every day. In this project we intend to study the different aspects of software quality with respect to mobile applications. The goal of this project is to determine the metrics that best describe the quality of mobile applications. Then, we plan to interpret these metrics and use them to shed light on: 1) factors that impact the quality of mobile applications, and 2) changes to improve software quality. Our findings will provide valuable knowledge to mobile software development organizations such as Research In Motion (RIM), Microsoft, Google and Apple. A successful student for this project should possess a strong background in either chemistry or materials science. Some knowledge and basic skills in quantum mechanics would facilitate working with quantum chemical program packages, but this is not required since the programs can be treated as black boxes. Apart from that, the student should be willing to read through the literature and learn about quantum chemistry in general. The student will collect a large number of candidate materials for the benchmark set. This is an ideal opportunity for the student to get in contact with primary literature and to learn about the way scientific articles are written without having to know every single detail of their content. Once the scope of pertinent compounds has been identified, the feasible and representative ones have to be selected and grouped according to certain criteria. Assisted by the supervisor, the student will formulate such criteria and also decide which molecules will enter the benchmark set. In the next phase, highly accurate calculations will be performed on the molecules of this set and the results have to be stored in an easily retrievable form. The student will thereby learn about the different approaches and methodology that exist in quantum chemistry and their respective accuracy, but also their limitations and computational cost. The tradeoff between cost and accuracy that permeates computational science will be directly apparent in this stage. Furthermore, the student will be exposed to high-performance computing. EmilyCranston Surface Engineering of Renewable Materials based on Nanocellulose This summer project will include a variety of tasks pertaining to the design of renewable composites, coatings and liquid formulations based on nanocrystalline cellulose (NCC).
This will include the preparation and characterization of NCC from various cellulose sources as well as the characterization of newly developed materials. This work will enable the student to become skilled in various wet-lab chemistry techniques and polymer/nanoparticle characterization tools such as dynamic light scattering, atomic force microscopy, surface plasmon resonance, rheology, optical microscopy, thin film characterization and electrophoretic mobility testing. • Previous laboratory experience (preferably in a wet lab) • Experience with synthetic chemistry, microscopy and polymer characterization • Generally interested in nanotechnology, polymer materials and nanocellulose • Organization and computer skills • Highly motivated and results-driven individual with excellent interpersonal skills • Excellent verbal and written communication skills • Must be able to work well in teams and to work independently with limited supervision See description of project above: The focus will be on synthesizing compatible NCC (hydrophobic and polymer-grafted) for incorporation into sustainable nanocomposites. Characterization and compounding of modified NCC with other polymer matrices will also be undertaken. QiyinFang Biomedical Engineering Endoscope tracking In this undergraduate research project, we plan to develop an imaging system that can monitor and track an endoscope's movement and position during screening. Image processing and analysis using Matlab; using and programming microcontrollers; hands-on experience building customized PCB-based electronic circuitry. The student will work independently on this project to complete a full development cycle: from design and feasibility study to building the instrument and performance testing. RafikLoutfy School for Engineering Practice Auto ID Solutions for improving Track Worker Safety in the Transportation Sector Since the advent of rail transportation, the integrity of the rail and track conditions plays a significant role in ensuring that trains are able to move people safely. Although this work has been hazardous since rail transportation was created, there have been few successful attempts at improving location awareness of approaching trains and the location of these mobile track inspectors. Typically, the mobile inspection crews have no advance warning when a train is approaching, and train crews usually receive system-wide broadcasts of where inspection crews are working. While some transit systems have more robust procedures, the work remains hazardous. Track worker protection is of critical importance in the rail industry around the world. As is the case in North America, the rail industry around the world relies on lookouts and communication to alert track workers to the presence of approaching rail vehicles. Although policies and procedures exist to keep track workers safe, accidents do happen. The goal of this research, and the resulting technology development, is to create improved location awareness of track workers and approaching trains to prevent serious injuries and the loss of life. We are working with Bombardier as well as a few transit agencies to identify hazardous operations and to research and develop technology solutions that can be used to provide better location awareness for the track inspection workers as well as the train operators.
Our hope is to develop a system that can provide relevant localized information as opposed to relying on system broadcasts that don't often generate the same attention. The role of the student would be to assist the team with development of the user interface design. Although the team is exploring various physical and virtual interfaces, the student will be asked to assist with the interface design for the central management system. Activities that the student might undertake would include gathering requirements, designing prototype interfaces and programming those interfaces. The student will be expected to work with the academic team and the industry partner. Hence, interpersonal skills and strong communication skills will be required. The skills required would include software design and programming, database design, business process re-engineering, software life cycle management, and an understanding of wireless communications. Soft skills including strong interpersonal skills, communication skills, time management skills and the ability to creatively solve problems will be required. PrashantMhaskar Constrained Control Lyapunov-function construction One of the fundamental unsolved problems in control theory is the choice of a constrained control Lyapunov-function for nonlinear systems, a problem that goes to the heart of defining and understanding what stability means. In our group we have used results on null controllable regions for linear systems to address this problem in the linear setting. This project will focus on a class of nonlinear systems. Strong mathematical skills and willingness to understand and use advanced mathematical concepts. Good expertise with programming (Matlab). The student will spend the first month understanding some of the advanced control concepts, then contribute to the development of the results and illustration via a simulation example. Fault-Handling in Chemical Process Systems The need for a comprehensive framework to diagnose and handle actuator and sensor faults has been well recognized. This project will strive to develop rigorous first-principles and data-based fault-detection and isolation filters and fault-tolerant control designs. Very strong mathematical skills and willingness to understand and implement advanced mathematical and chemical engineering concepts. Also requires good programming (Matlab) ability. The student will help in the development of the framework and illustration of the results on simulation examples. Modeling and control of batch processes Batch processes are utilized to create high-value products. The importance of specialized modeling and control methods for batch processes is also well recognized. This project will advance the modeling and control framework for batch processes recently developed in our group. Excellent mathematical skills and understanding of chemical engineering fundamentals, as well as proficiency with programming (Matlab). The student will help in the development of the framework and set up illustrative simulation results. JohnPreston Engineering Physics Flexible solar power for active window dressings The development of flexible photovoltaic structures could potentially enable them to be incorporated into a broad range of structures. This project will consider the incorporation of a photovoltaic system into woven fabric structures. This project is in collaboration with MW Canada Limited, a Cambridge-based manufacturer of advanced woven materials.
The goal of this project is to provide the necessary power for enhanced functionality in systems utilizing woven fabric. This could be as simple as providing actuation for a window blind. There are currently three options for such systems: hardwired, battery-driven, and solar-recharged battery-driven. This last option involves a conventional Si solar cell pasted to the window with a wire providing a trickle-charge to the battery. Within our project, we are considering several other functionalities which include light sensing and automated diffusing of direct light, linkages to Wi-Fi networks, home security, self-cleaning and electrically driven antibacterial action. The average power requirements of these applications are very modest and, as a result, PV efficiency is not the dominant concern. Instead, we need to focus on compatibility with manufacturing processes and a robust system architecture which is resilient with respect to localized failures due to damage and loss of performance due to materials degradation. MW Canada has the capacity to interweave patterns of conducting fibre into their products, producing bus bars that will enable the active area to be sectioned into a large number of independent devices that will be automatically isolated if failure occurs. Students interested in this project should have some background in electrical measurement and optics. A background in either semiconductor device physics or in polymer chemistry would be an asset. The student will be involved in all aspects of the project, from materials development through to consultations with the industrial sponsor. Support will be provided by the PDF & 2 graduate students working on the project. Key elements of the project include: * simulation of circuit performance using available software * preparation of electronic materials compatible with woven structures * characterization of materials and refinement of preparation processes * fabrication & testing of device elements * integration into a complete device * testing of prototypes during scale-up Dominik PJBarz Queen's University - Kingston Design of electrokinetic unit operations suitable for Lab-on-a-Chip and other microfluidic devices Microfluidic devices have become an essential part of our daily life and are engaged in a multitude of technologies such as inkjet printers and pregnancy tests. Another important microfluidic technology is the so-called Lab-on-a-Chip (LOC). An ideal Lab-on-a-Chip is a device that integrates a multitude of chemical/biological laboratory functions on a single chip of only a few square centimetres in size. A typical application could be the determination of the blood insulin content of a diabetic person. The LOC concept features several advantages such as reduced fabrication and maintenance costs, improved performance, and also less waste. Many LOCs employ electrokinetic phenomena to realize unit operations, such as mixing, pumping, analysis of ions, or the manipulation of particles or cells, necessary for successful operation. Electrokinetic phenomena are related to the presence of a so-called electrical double layer (EDL). Most materials feature electrical surface charges when in contact with a liquid. For example, the surface of a microchannel wall in contact with an electrolyte is usually electrically charged. This surface charge has a substantial influence on the distribution of the nearby ions in the liquid. At a certain distance from the wall, the liquid remains electrically neutral. However, very close to the wall (i.e.
in the EDL) the liquid is charged. This EDL phenomenon can be used to generate electroosmosis, which is a favourable tool to induce flows in microstructures without applying a pressure gradient, or to design micro pumps or micro mixers without the involvement of any mechanical parts. The present project is concerned with the measurement of the surface characteristics of microfluidic substrates in contact with typical microfluidic liquids. The ultimate goal is to obtain empirical correlations of a substrate's surface charge depending on the liquid properties. Such correlations can be used for numerical simulations of electrokinetics in LOCs and other microfluidic devices. This work is suitable for a chemical or mechanical or materials engineering student with motivation and good spirit. The student should have some experience in working in a chemical lab, doing bench chemistry and instrumental analysis. Some background in microfluidics and interfacial phenomena would be an asset. The student will be provided with an opportunity to expand his/her knowledge of microfluidics and surface science. The student will be involved with characterization of microfluidic substrates such as silica, PMMA or PDMS in contact with different liquids. The student will do basic wet chemistry tasks to prepare the samples for the experiments. In detail, the student will mix solvents, electrolytes and buffers so that liquids with a defined ion content (ionic strength) and pH value are prepared. Quality control of the liquids is mandatory and is done by measuring the liquid's conductivity and pH value. Different characterization techniques, depending on the testing material, are available. Particulate and planar materials can be characterized by electrophoretic light scattering and streaming potential measurements, respectively. Series of measurements will be conducted for different liquid parameters. Eventually, the student will be involved in the interpretation of the experimental results, so that he/she can develop a basic understanding of surface science. The student will meet the PI on a regular basis, and substantial supervision and support on this research work will be provided. Besides the training in technical skills, the student will acquire various soft skills, such as teamwork, communication and problem solving, which will help meet the requirements necessary for careers in industry and academia. DavidBerman Pathology and Molecular Medicine High impact biomarker discovery to improve prostate and bladder cancer outcomes A significant portion of the 113,000 patients worldwide who die yearly from bladder cancer could gain many years of life by receiving established chemotherapy regimens. Chemotherapy given prior to surgery (cystectomy) can extend the lives of select bladder cancer patients by 10 or 15 years. However, because more patients are harmed by these therapies than helped by them, chemotherapy is infrequently given. For example, fewer than 15% of eligible patients in Ontario receive neoadjuvant chemotherapy for bladder cancer. We are developing gene-based assays that will improve chemotherapy utilization by better predicting which patients are most likely to benefit. The student should have excellent analytic abilities and be able to communicate effectively in English. Coursework and/or experience in genetics and molecular biology is also essential. Using human cancer tissue specimens, the student will work closely with the P.I.
and a postdoctoral fellow to perform histopathologic analysis, clinical annotation, and DNA and RNA extraction. Under supervision, the student will subject purified RNA and DNA to molecular analyses, including next-generation sequencing and real-time polymerase chain reaction analysis. The student will also have the opportunity to learn about bioinformatic analytic techniques. The goal of this work is to discover molecular signatures that will predict response to chemotherapy in bladder cancer patients. Gabor Fichtinger School of Computing Visualization of deformation fields in medical imaging Image registration methods are often used in image-guided medical interventions to determine the dislocation of the target organ. Advanced registration methods can determine not just a rigid transformation, but also organ deformations. Performance assessment of deformable image registration algorithms requires visualization of the resulting deformation fields, by means of colored image slices, isolines, arrows, deformed grids, etc. 3D Slicer (www.slicer.org) is an open-source medical image analysis and visualization application that is used extensively in the Perk Lab. This application contains deformable image registration algorithms, but unfortunately it cannot visualize deformation fields. Required: Practical aptitude and mindset, linear algebra, numerical methods, programming in C++. Preferred prerequisites: Any combination of Computer Integrated Surgery, Computer Graphics, Medical Informatics or Medical Imaging, Math/Linear algebra. PROJECT OBJECTIVES Develop various methods for visualizing deformation fields in C++, using the VTK visualization toolkit Evaluate the visualization methods on clinical images, select a few methods and optimize them Create an extension plugin for the 3D Slicer application that contains the selected visualization methods SKILLS DEVELOPED Context: Visualization for image-guided surgery. Analytical: Image registration, processing, visualization. Experimental: Software usability assessment and optimization, software performance optimization. Contour interpolation in medical imaging Planning of medical procedures, such as radiation therapy or minimally invasive interventions, almost always requires delineation of organs and other important structures by closed curves. Most frequently this is performed manually, by drawing contours around the objects on several two-dimensional cross-sectional images. Typically, the contours change smoothly between adjacent image slices; therefore, it could be possible to speed up the contouring process by drawing contours manually only on a limited subset of images and generating contours between the manually contoured slices by interpolation. The contour interpolation method could also be used for generating visually appealing and accurate surface models from sparse manually defined contour sets. A curious mind and always-ready-to-try-new-things attitude C++ programming experience Interest in computational geometry PROJECT OBJECTIVES Review methods that have been proposed for contour interpolation or two-dimensional shape morphing, such as: http://www.geometrictools.com/Documentation/MedialBasedMorphing.pdf http://www.ceremade.dauphine.fr/~cohen/mypapers/GabrielidesCohenSSVM09.pdf http://books.google.ca/books?id=I50DB1GoweoC&pg=PA124&lpg=PA124&dq=morph...
http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=1530222 http://www.vtk.org/doc/nightly/html/classvtkVoxelContoursToSurfaceFilter.html Implement a chosen algorithm (preferably in C++, but also acceptable in Matlab) Test and optimize the algorithm Optional: integrate the algorithm into SlicerRT, a software package for radiation therapy researchers SKILLS DEVELOPED Analysis, manipulation, and visualization of geometric objects in Matlab or C++, using open-source toolkits Collaboration with a multidisciplinary group of medical imaging algorithm researchers, software developers, and clinicians Using professional tools and methodologies for algorithm research and software development Sensor fusion for robust pose tracking using a combined accelerometer-magnetometer-gyroscope device High-accuracy optical and electromagnetic trackers are commonly used in image-guided interventions for tracking the position and orientation (pose) of surgical tools and target objects. In certain applications these tracking devices could be replaced by small, low-cost inertial measurement units (IMUs), which contain integrated accelerometer-magnetometer-gyroscope sensors. The typical size of the sensor is a few centimeters by a few centimeters, and it costs about $150. This allows new applications that have not been possible to implement with optical or electromagnetic sensors (a minimal sensor-fusion sketch appears further below). A curious mind and always-ready-to-try-new-things attitude C++ programming experience Interest in signal processing PROJECT OBJECTIVES Review of previous applications of IMUs for image-guided interventions Review of methods for IMU sensor fusion Implement a chosen algorithm in C++, integrate it into the Plus open-source software platform Test and optimize the algorithm Evaluate the applicability of the algorithm on an image-guided intervention application (e.g., visualization of the relevant part of the patient's CT/MRI/US image while inserting a needle) SKILLS DEVELOPED Analysis, manipulation, and visualization of geometric objects in C++, using open-source toolkits Collaboration with a multidisciplinary group of medical imaging algorithm researchers, software developers, and clinicians Using professional tools and methodologies for algorithm research and software development AhmedHassan School Of Computing Mining Mobile Software Repositories Nonlinear optical (NLO) properties of molecules and polymers have attracted the attention of many scientists and engineers in recent decades. It is very desirable to devise mechanically robust and inexpensive materials showing strong NLO activity. Nonlinear optical materials have essential applications in the design of lasers, but are also of practical use in materials engineering (e.g., for use in coatings for stealth aircraft) and are of interest in computer engineering (because photonics are a promising candidate for next-generation computing devices). Unfortunately, it is hard to formulate simple rules or to propose straightforward concepts that would allow for a rational design of compounds with tailored NLO properties. In addition, the properties responsible for the large NLO effects in molecules and polymers, such as the first and second hyperpolarizability, are quantities that are relatively difficult to measure experimentally. For fragile and unstable polymers (which, also, are often the ones with the most promising NLO effects), there is little experimental data available.
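For the IMU sensor-fusion project described above, the following is a minimal, hedged sketch of one classic baseline, a complementary filter that fuses a gyroscope rate with an accelerometer-derived angle. The synthetic data, the single-axis simplification, and the filter constant are assumptions for illustration; it is not necessarily the algorithm that would be integrated into the Plus platform.

# Minimal complementary-filter sketch (single axis, synthetic data).
import numpy as np

def complementary_filter(gyro_rate, accel, dt=0.01, alpha=0.98):
    # gyro_rate: angular rate samples (rad/s); accel: (ax, az) pairs.
    pitch, estimates = 0.0, []
    for w, (ax, az) in zip(gyro_rate, accel):
        pitch_acc = np.arctan2(ax, az)        # noisy but drift-free
        pitch_gyro = pitch + w * dt           # smooth but drifts over time
        pitch = alpha * pitch_gyro + (1 - alpha) * pitch_acc
        estimates.append(pitch)
    return np.array(estimates)

t = np.arange(0, 1, 0.01)
true_pitch = 0.1 * t                          # slow, constant rotation
gyro = 0.1 + np.random.normal(0, 0.01, t.size)
acc = np.column_stack([np.sin(true_pitch), np.cos(true_pitch)])
acc += np.random.normal(0, 0.05, acc.shape)
print(complementary_filter(gyro, acc)[-5:])   # should approach ~0.1 rad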
Quantum chemistry can help here by calculating NLO properties of these molecules without worrying about the efforts of synthesis or measurement. Furthermore, a large batch of materials can be screened and, more importantly, the structures can be altered to study the effect of the molecular geometry on the NLO properties. These structure-property relationships are known to be of utmost importance for a successful rational design. On the other hand, the quantum mechanics calculations are not problem-free. In particular, achieving high accuracy can be very demanding and time consuming. Depending on the molecular size and the quantity to be investigated, making a tradeoff between accuracy and computational costs is inevitable. Moreover, several methods applied to the same compound can contradict each other in their predictions of NLO effects, and it is sometimes hard to determine how meaningful the results of a certain method applied to a particular group of molecules are. Given these facts, it is necessary to come up with a database of benchmark molecules, whose properties are exactly and reliably known and which allow one to distinguish once and for all between good and bad computational approaches. This test set has to be designed with care to have a representative selection of small to medium-sized molecules, treatable at a high level of theory, that are nevertheless abstractions or building blocks of more realistic compounds. Once such a standardized set exists, the evaluation of newly appearing computational methods is straightforward and the discrepancies in current computational methods can be determined either to be general in origin or to apply only to certain classes within the benchmark set. In the first stage of the project, the student will perform computational quantum chemistry calculations on the molecules in the database using standard computational chemistry software like Gaussian. The student will then work together with the preceptor and his graduate students to modify the in-house kriging program for this application. The next step of the project is to apply the kriging model to the database of compounds; this allows us to test the effectiveness of the model. Finally, the electrophilicity/nucleophilicity of molecules that are not in the database (due, for example, to experimental difficulties) will be predicted. Benchmarking nonlinear optical properties When one is searching for a molecule with desired properties, one is faced with myriad possibilities, and exhaustive experimental and computational studies are impossible. Consequently, alternative ways to characterize the system are required. Molecular similarity approaches are promising: based on the basic idea that similar molecules tend to have similar properties, they allow one to leverage experimental and/or computational data about known substances to predict the properties of an unknown system. Specifically, the properties of a diverse set of molecules (called the 'reference set' or 'data set') are used to estimate the properties of similar unknown molecules (called the 'test set' or 'unknowns') by exploiting the relationships between the molecules in the sets. Molecular similarity is one way of mimicking humans' incomparable intuition and pattern recognition with machine learning. This perspective motivates the use of kriging, a powerful estimation tool used in geostatistics, to compute molecular properties.
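Since kriging is essentially Gaussian process regression, a small, hedged sketch may make the workflow concrete: fit a model on a 'reference set' and predict 'unknowns' together with error bars. The one-dimensional synthetic descriptor and the scikit-learn kernel choices below are illustrative assumptions; a chemical application would replace the coordinate with a molecular similarity (dissimilarity) measure, which is not shown here.

# Toy kriging / Gaussian process regression sketch with predictive error bars.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

x_ref = np.linspace(0, 10, 15).reshape(-1, 1)        # "reference set" descriptor
y_ref = np.sin(x_ref).ravel() + np.random.normal(0, 0.05, 15)

kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(x_ref, y_ref)

x_new = np.linspace(0, 10, 5).reshape(-1, 1)          # "unknowns"
mean, std = gp.predict(x_new, return_std=True)
for xi, m, s in zip(x_new.ravel(), mean, std):
    print(f"x={xi:4.1f}  predicted={m:+.3f} +/- {s:.3f}")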
Kriging is closely linked to regression analysis, and is sometimes called Gaussian process regression; it is a special case of kernel regression and can be mapped (with some difficulty) onto a neural network. The advantage of the kriging approach is its statistical rigor. With kriging we can improve our estimate for a test molecule's property by considering its similarity to compounds in the reference set. In this project we will try to develop kriging models for the electrophilicity and nucleophilicity of organic molecules. The reference set of data has been collected by Mayr in an online database of about 700 molecules. Even though this database is large, the values in the database are of uneven quality due to the difficulty of making the accurate kinetics measurements that are needed to determine the fundamental electrophilicity and nucleophilicity values. Theoretical approaches to electrophilicity and nucleophilicity are also problematic: computing accurate chemical reaction rates in solution is extremely difficult, and for the larger molecules of interest to organic chemists, almost impossible. The goal of this project is to use affordable (but inaccurate) computations, corrected by a kriging model, to predict electrophilicity and nucleophilicity values. We have had success with this approach for closely related properties like Bronsted-Lowry acidity and basicity, which makes us confident such an approach can work. The significance of the project is twofold. First of all, having a computational model for electrophilicity and nucleophilicity is of inherent significance because it allows organic chemists to predict reaction rates and to choose the correct solvent for a reaction before performing an experiment. Second, the machine-learning approach to computational chemistry is very new, and further validation of its utility provides a firm foundation for further development, with other applications. Ideal background would include knowledge and familiarity with calculus, statistics, and scientific/mathematical programming and numerical algorithms. Any student with interests and skills in mathematics and/or computational modeling, however, could be productive. In the first stage of the project, the student will select a target enzyme from BindingDB (or another similar database) and build a library of small molecules with known binding affinities to the target. Building the library will require running computational quantum chemistry software (e.g., Gaussian). After the library has been built, the student will use our in-house kriging program to estimate the binding energies. This will require some programming, but there are expert programmers available to help the student in this portion of the project, if necessary. The known experimental data will then be used to validate the kriging method for biological applications. In the second phase of the project, the student will build another database, this time one proposed by our experimental collaborators. Again, computational chemistry calculations will be performed for each molecule in the database and we will attempt, first, to validate the kriging model against the (rather limited) available experimental data. Then the student will predict which molecules, among the ones that the experimentalists have not yet synthesized, are most likely to be effective.
This prediction will be given back to our experimental collaborators, but whether or not the prediction is successful will probably not be known on the timescale of the project. Department of Chemistry & Chemical Biology Predicting Chemical Reaction Rates with Machine Learning Methods The goal of this project is to develop accurate and efficient ways to estimate the properties of molecules and materials in cases where rigorous computational approaches are either impractical or impossible. To do this, we will adapt recent theoretical and computational advances in machine learning methods to the problem of property prediction. The specific approach we plan to use is called kriging, or Gaussian process regression. Kriging is a procedure for constructing the best linear unbiased estimator for a property's value at an unknown location using the measured property values at other locations. Its utility arises because it captures the spatial continuity that is observed in most natural phenomena: property values change smoothly, so nearby locations have similar property values. There is a type of 'molecular continuity' present in chemical phenomena: similar molecular structures have similar properties. To apply kriging in chemistry, we replace the 'distance between locations' with a quantitative measure of the 'dissimilarity between molecules'. This can be achieved by designing an appropriate molecular similarity measure. Once a molecular similarity measure has been selected, we can find the optimal set of kriging weights and estimate the properties of an unknown molecule using the measured property values of other molecules. An appealing feature of this approach is that it produces statistical error bounds, so one can estimate the error in the predicted properties. The goal of this project is to use kriging to estimate the binding affinity of potential drug molecules for a pharmaceutical target. In the initial stage of the project, an enzyme pair from BindingDB will be selected; BindingDB is a database providing a large number of targets (mostly enzymes) and information on the EC50 (a measure of drug potency) of many molecules. Using this database, we can validate the kriging methodology for biological applications. In the second stage of the project, we will study a target that my research group has been interested in for a long time; this enzyme is implicated in cancer progression. Using a family of compounds that experimental collaborators have proposed to us, a binding-energy model will be designed. There are about 40 compounds with known affinity, and about 100 compounds that the experimentalists are interested in (but they do not wish to synthesize all of them!). The ultimate goal of this project is to provide computational guidance to our experimental collaborators by predicting which molecules are most likely to be effective against the target. The significance of this project is twofold. First, predicting which molecules are likely to have anti-cancer activity against our target is of high practical importance. Second, the specific type of kriging we are using has never been used in the context of predicting small-molecule binding energies for pharmaceutical applications. Establishing a new, more effective method for binding-energy estimation will have long-term significance and broad applicability. The MITACS student should have a background consistent with most 3rd and 4th year chemical engineering programs.
Experience and understanding of the following concepts are a large plus: steady-state modeling using Aspen Plus or other simulators; economic analyses of chemical processes; an understanding of sustainability and life cycle analyses, particularly using ECO99 and other sustainability index approaches; and basic process design and analysis techniques. The MITACS student will combine Aspen Plus simulations of a biofuels-to-liquid process with a commonly accepted life cycle analysis methodology (such as ECO99) to perform a cradle-to-the-grave life cycle analysis of a thermochemical process using slagging gasifiers. Much of the modeling work in Aspen Plus has been completed by prior undergraduate students, and so most of this task will involve using the existing model to perform simulations of a variety of processes and extract meaningful data from the results. The student will also be required to apply the results to a life cycle analysis and learn to use software which helps with that purpose. The student will work in our computer lab alongside other undergraduate students, and work with the professor directly. Interactions with other undergraduate and graduate students are anticipated. The student is expected to make meaningful contributions and will be eligible for inclusion on any academic publications that may result. PaulAyers Chemistry & Chemical Biology New Machine Learning Methods for Predicting the Effectiveness of Drug Candidates If Canada or any other nation is to address its long-term transportation energy needs by massively expanding the use of biofuels, we must be sure that such an approach is truly sustainable. There are two main routes for biofuels: biological (such as fermentation of corn to produce ethanol) and thermochemical (such as the gasification of biomass to produce syngas). Each technique has its own strengths and weaknesses. Gasification is the primary technology used in the thermochemical route, in which pulverized biomass is gasified at high temperatures into syngas, which contains H2, CO, and wastes such as CO2 and H2O. After cleaning, the H2 and CO can then be combined into more useful energy products such as diesel, gasoline, methanol, dimethyl ether, and fuel-grade hydrogen gas, or burned for electricity. Since biomass is a renewable fuel, this process has very low net CO2 emissions and can even have net negative CO2 emissions if CO2 capture and sequestration techniques are employed. However, many gasifiers (such as the entrained downward-flow variety) produce a solid waste called slag, containing the non-volatile components of the biomass (primarily metals in their oxide form). Currently, this is primarily used as an ingredient in asphalt. Although the carbon (i.e. CO2) in the biofuel is renewable since it perpetually cycles between plant and atmosphere, the metals (valuable plant nutrients such as phosphorus, iron, magnesium, etc.) are not returned to the earth in a useable form since they are sequestered in the slag. As a result, any massive effort to use slagging gasifiers to extract energy from biomass for any purpose will necessarily result in the gradual depletion of soil nutrients. These can be replaced with fertilizers, which will then have some other effect, such as increased algal blooms, depletion of some other mineral resource, etc. The question, then, is: how much of an effect will this be? Is this a meaningful amount of soil nutrition being depleted from the biosphere, or is it vanishingly small such that there is no real concern?
To answer these questions, a life cycle analysis of a biomass-to-liquids process must be performed which considers all of the cradle-to-the-grave effects and their secondary impacts on other contributions. For example, the increased amounts of processing for fertilizers in the long run may cause increased CO2 emissions, pollution, or resource depletion which could potentially offset the environmental gains achieved by a thermochemical biomass-to-liquids approach to our energy infrastructure. The student should have a strong mathematical inclination and experience with computer programming in any fundamental language (C/C++, Python, Pascal, Java, etc.), although C/C++ is preferred. Familiarity with optimization techniques, control systems, and process design, as commonly covered in 3rd and/or 4th year chemical engineering undergraduate courses, is a huge plus. For this summer project, the MITACS student will assist with the implementation, testing, and analysis of the proposed optimization algorithm for a variety of applications, and especially for the biofuels application described above. The student will work in our computer lab and be supervised by the professor directly, but will also regularly interact with other undergraduate and graduate researchers working on other aspects of the project. More specific tasks may include (1) developing and programming new codes which execute different aspects of the proposed algorithms; (2) developing tools which help test the code on a variety of frameworks, especially concerning parallel computing; (3) running test programs on sample problems and analysing the resulting data to determine time, convergence, and quality characteristics; (4) applying the algorithm directly to our biofuels simulations running within Aspen Dynamics. The student is expected to make meaningful contributions and will be eligible for inclusion on any academic publications that may result. Lifecycle Analysis of the Nutrient Cycle of Biofuels We are developing new ways of producing sustainable biofuels from non-food-competitive biomass feedstocks grown in Canada, such as switchgrass, forestry by-products, and other forms of lignocellulose. However, in order to produce enough biofuels for transformative change to our nation's energy infrastructure, there are many systems issues in the supply chain and chemical processing which must be overcome. To address this, we are currently developing 'semicontinuous' approaches to producing biofuels such as biobutanol (a gasoline and ethanol substitute) and bio-dimethyl-ether (a diesel substitute) to overcome these challenges at lower costs. Although the semicontinuous approach has significant promise, it is incredibly complex, and as such traditional approaches to process design no longer apply. Although attempts at designing the process can be made 'by hand', a formal mathematical optimization technique is required to determine the key process parameters. Batch sizes, distillation parameters, heat duties, flow rates, transition behavior, controller tuning parameters, set-points, and many other parameters must be determined simultaneously in order to discover a configuration which satisfies quality, sustainability, and profitability constraints. However, this is a significant challenge due to the high dimensionality of the problem and the character of the computer models on which our analyses are based. As such, we have found that existing optimization solvers (specifically, those which solve the class of problems known as 'black-box')
are wholly inadequate for our needs. Therefore, in order to assist in our work on biofuels processes, we propose the development of a new optimization algorithm which is suitable not only for our particular biofuels problem, but for a large class of black-box optimization problems. The proposed approach is to combine well-known stochastic solvers such as particle swarm optimization (PSO) or differential evolution with branch-and-bound techniques that systematically reduce the size of the problem to improve convergence toward a global optimum. Branch-and-bound techniques work by systematically dividing the optimization problem's search space into regions, and mathematically proving that the global optimum cannot be in one region or another, thus eliminating it from consideration. However, branch-and-bound requires explicit knowledge of the model equations in order to do this, which are unavailable for our problem and other black-box problems. Therefore, we propose a new probability-based algorithm that gets around the problem of requiring explicit knowledge of model equations by creating implicit approximations of the model using the knowledge gained from particle swarm optimization runs. With this technique, we cannot completely eliminate one region of the search space, but we should be able to estimate the probability that the global optimum exists in one region or another. For black-box problems, this should be significantly faster and more likely to converge upon a true global optimum than the current state-of-the-art (a bare-bones PSO sketch appears below). Knowledge in at least one of the following topics: - feedback control - design of experiments - mechatronics - material science The student will, after adequate training, operate a table-top SACE micro-machining facility and implement some force-feedback algorithms with the aim of optimizing drilling time and micro-hole quality. This work will be conducted under the close supervision of a research associate in our team, and frequent interaction with our industrial partner will take place. The student will have the opportunity to learn to use Matlab/Simulink and DSpace (a real-time system) for the implementation of control algorithms. The student will also be trained in the use of a scanning electron microscope (SEM) for the characterization of the micro-holes. ThomasAdams Chemical Engineering McMaster University - Hamilton Probability-based black-box branch-and-bound optimization for biofuel system design Besides silicon, glass is the most important material used in the constantly growing field of micro-devices. Several applications need glass because of its unique properties: chemical resistance, transparency, low electrical and thermal conductivity, and biocompatibility. Some of these devices are: micro-accelerometers, micro-reactors, micro-pumps, and medical devices. The limiting factor for increasing the usage of glass in micro-devices is the limited possibility of structuring it. Chemical etching technologies (such as with hydrogen fluoride) are well established, but remain too slow and expensive for many industrial applications. Other technologies are available, such as laser machining or mechanical machining (ultra-sonic or powder blasting). Both are hampered by the difficulty in obtaining good surface quality and the potential for structural damage. In general, high aspect-ratio structures are a challenging problem. A possible answer is Spark Assisted Chemical Engraving (SACE).
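The bare-bones particle swarm optimizer below (referenced above) illustrates the stochastic-solver component on a standard test function. It is a hedged sketch under simple assumptions (fixed inertia and acceleration coefficients, box constraints); the probability-based branch-and-bound layer proposed in the project is not implemented here.

# Bare-bones PSO on a black-box test function (illustrative only).
import numpy as np

def pso(f, lo, hi, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, size=(n_particles, lo.size))    # positions
    v = np.zeros_like(x)                                     # velocities
    pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, lo.size))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([f(p) for p in x])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

rosenbrock = lambda p: (1 - p[0])**2 + 100 * (p[1] - p[0]**2)**2
print(pso(rosenbrock, np.array([-2.0, -2.0]), np.array([2.0, 2.0])))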
SACE, also known in the literature as Electro Chemical Discharge Machining, or ECDM (not to be confused with Electro Discharge Machining and Electro Chemical Machining for Conductive Materials, two processes that are also referred to as ECDM in combination), is based on electrochemical discharges. The principle is simple: the work-sample and two electrodes are dipped into an electrolyte (typically aqueous NaOH). The cathode is used as a tool. When applying a voltage higher than a critical value (typically 30 V), a gas film around the tool is formed by coalescence of the bubbles growing on its surface. Electrochemical discharges occur between the tool and the electrolyte. The heat generated locally promotes etching of the work-sample. SACE needs neither clean-room facilities nor mask fabrication, unlike most micro-machining technologies. Today, one-dimensional SACE drilling is well characterized in open-loop operation. However, no control strategy has been reported. Developing feedback strategies would open new possibilities for SACE machining, helping to overcome its main limitations. The aim of the proposed project is to develop drilling strategies based on force-feedback control. SID First Name Last Name Department University Province ProjectTitle Description SkillsRequired StudentRole Industry Partner? Program JamesGreen Department of Systems and Computer Engineering Carleton University - Ottawa Ontario A system for protein-protein interaction prediction evaluation and consensus This project would involve the design and implementation of a system to perform continual evaluation of a wide variety of primary-sequence-based protein-protein interaction (PPI) prediction methods. The system will continuously and automatically scan the PPI repositories for new interactions, and launch a variety of PPI prediction methods on these proteins. In this way, a continuous and unbiased performance metric can be computed in order to elucidate the true accuracies of each method. Furthermore, it is hoped that a machine learning consensus decision method can be developed to leverage the outputs of the individual PPI prediction methods in order to achieve increased sensitivity and specificity. Strong software skills. Experience with web programming and databases. Interest in bioinformatics and protein function. Interest and some familiarity with machine learning and pattern classification methods. The student would be responsible for implementing a variety of PPI prediction methods identified from the literature, or writing remote procedure invocation methods for existing PPI prediction web servers. The student will also write code to continually monitor the PPI repository databases and identify new data as it is released. The entire system will require an intuitive web interface with a database back-end. Lastly (time permitting), the student will develop a consensus method using machine learning approaches to create a new PPI prediction method with increased accuracy. It is hoped that a manuscript describing the system and its findings over time will be prepared and published, and the student would be a co-author on such a publication. No Globalink ANIL MAHESHWARI School of Computer Sciences Algorithms for Geometric Networks The student will be studying a geometric network, e.g. the network formed by a set of wireless nodes, where two nodes are joined by a link if they are in close vicinity to each other.
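To make the kind of network described above concrete, here is a small, hedged construction sketch: random points in the unit square are joined whenever they lie within an assumed communication radius (a "unit disk" style graph), using the networkx library. The node count and radius are arbitrary choices; the actual research questions (separators, spanning ratio, routing, dynamics) are not addressed by this snippet.

# Build a proximity ("unit disk") graph and check basic properties.
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
points = rng.random((50, 2))     # 50 random wireless nodes in the unit square
radius = 0.2                     # assumed communication range

G = nx.Graph()
G.add_nodes_from(range(len(points)))
for i in range(len(points)):
    for j in range(i + 1, len(points)):
        if np.linalg.norm(points[i] - points[j]) <= radius:
            G.add_edge(i, j)

# networkx also ships nx.random_geometric_graph for the same construction.
print(G.number_of_edges(), nx.is_connected(G))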
We are going to study geometric and graph theoretic properties of these types of networks, especially the properties that can lead to efficient algorithms. Properties which are of interest to us are connectivity, size of separators, spanning ratio, routing, how to make them dynamic, etc. During the project, the student will be asked to review literature, will be assigned a set of research papers, and a couple of problems to look at. Part of the work may require an implementation. 3rd year course in Algorithms and Data Structures. A course on probability. Mathematical Maturity - should be able to read and understand mathematical proofs. A decent knowledge of any programming language. This is already mentioned with the description of the project. Understanding the practicality of low-distortion embeddings One of the classical results in the field of low-distortion embeddings is the Johnson-Lindenstrauss Lemma, which has been known for three decades. There are several fundamental theoretical results, as well as very interesting approximation algorithms that handle data sets in high dimensions. The question we are asking is: is this Lemma useful in practice? Are there non-trivial instances of a practical problem where the application of this lemma can make a significant difference? I expect the student to have learnt the proof of the Johnson-Lindenstrauss Lemma, as well as to have some background in the mathematics surrounding this lemma. It will be desirable if a student has some exposure to designing algorithms, and has decent programming skills. After understanding the meaning and significance of the Lemma, I am expecting the student to do some literature search to see what kind of experimental research results exist that use this Lemma. Using the conclusions of those papers (if any), I am expecting the student to explore at least 1-2 applications where one can potentially show that this Lemma is crucial in achieving efficiency. This part may require implementation and testing (a small numerical sketch of such an experiment appears further below). As such, the project requires fairly significant mathematical sophistication as well as vision to filter out relevant research results from a vast amount of literature surrounding this lemma. One of the key aspects of this research will be to prepare a survey article which highlights the practical aspects of the J.-L. Lemma. Be prepared to sift through research articles! GabrielWainer Systems and Computer Engineering Real-Time embedded systems development using a simulation-based approach Real-time systems are built as sets of components interacting with their environment. In most cases (including robotics, traffic control, manufacturing and industrial applications, etc.), these applications must satisfy "hard" timing constraints. If these constraints are not met, system decisions (even correctly computed) can lead to catastrophic consequences for goods or lives. The development of real-time controllers in distributed environments has proven to be a very complex task, in terms of both development difficulties and related costs. We have provided a new systematic method and associated automated tools to develop hard real-time control applications, reducing both development costs and delivery time. We use a simulation-based methodology for development, incrementally replacing simulated components by their real counterparts interacting with the surrounding environment.
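Returning to the Johnson-Lindenstrauss project above, the sketch referenced there is a quick empirical distortion check: project high-dimensional points with a Gaussian random matrix and compare pairwise distances before and after. The dimensions and target size are arbitrary assumptions, and this is only a starting point for the kind of experiments the project asks for, not a result.

# Empirical Johnson-Lindenstrauss check: how much do pairwise distances move?
import numpy as np
from sklearn.random_projection import GaussianRandomProjection
from sklearn.metrics import pairwise_distances

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5000))            # 200 points in 5000 dimensions

Y = GaussianRandomProjection(n_components=300, random_state=0).fit_transform(X)

d_orig, d_proj = pairwise_distances(X), pairwise_distances(Y)
mask = ~np.eye(len(X), dtype=bool)          # ignore zero self-distances
ratios = d_proj[mask] / d_orig[mask]
print("distance ratio range:", ratios.min().round(3), ratios.max().round(3))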
• C++ programming • Java programming • With preference, some experience with FPGA hardware The candidate will follow the methodology for developing real-time embedded applications. A target application will be identified (to be discussed with the candidate according to his/her background and interests), and a complete application will be developed from scratch using our techniques and tools (which include advanced visualization tools, a development environment, and specialized hardware). The activities will be carried out in the CFI Advanced Laboratory for Real-Time Simulation (ARS). This infrastructure consists of a high-performance computing platform (64 high-speed processors linked with a very high-speed interconnect) to support an advanced real-time simulation engine (including AD/DA interfaces and graphics workstations for human interaction). Expected learning opportunities include: • an introduction to modeling and simulation tools • fine-tuning C++ skills • fine-tuning Java skills • real-time and embedded systems development techniques • experience in an advanced high-performance computing environment Service-Oriented computing: Mashup the Internet At present, numerous applications are built as a service-oriented architecture, making them available throughout the Internet (the most popular is probably Google Maps). In this project we will explore different methods for "mashing-up" existing applications. The following video shows an example of such an application: the software collects geographical information, sends it to a weather service, the weather information is fed into a fire simulator, and the fire spread can be seen on a Google map (this work was a finalist for the Services Student Contest). http://www.youtube.com/arslab#p/u/20/PRfNDeBUPFs This video shows another mashup where a traffic simulation is mashed up with Google Earth maps. http://www.youtube.com/arslab#p/u/3/zE_qSxTwxeQ The activities will be carried out in the CFI Advanced Laboratory for Real-Time Simulation (ARS). This infrastructure consists of a high-performance computing platform (64 high-speed processors linked with a very high-speed interconnect) to support an advanced real-time simulation engine (including AD/DA interfaces and graphics workstations for human interaction). • C++ programming • Java programming • With preference, some experience with Web Services, SOAP and XML The candidate will follow the methodology for developing real-time embedded applications. A target application will be identified (to be discussed with the candidate according to his/her background and interests), and a complete application will be developed from scratch using our techniques and tools (which include advanced visualization tools, a development environment, and specialized hardware). The activities will be carried out in the CFI Advanced Laboratory for Real-Time Simulation (ARS). This infrastructure consists of a high-performance computing platform (64 high-speed processors linked with a very high-speed interconnect) to support an advanced real-time simulation engine (including AD/DA interfaces and graphics workstations for human interaction).
Expected learning opportunities include: • an introduction to modeling and simulation tools • fine-tuning C++ skills • fine-tuning Java skills • real-time and embedded systems development techniques • experience in an advanced high-performance computing environment Mobile applications: adding simulation to your smartphone We have built an environment for modeling and simulation interfaced with a service-oriented architecture API (making it available throughout the Internet). The software collects geographical information, sends it to a weather service, the weather information is fed into a fire simulator, and the fire spread can be seen on a Google map. Other mashups include a traffic simulation mashed up with Google Earth maps, a model of flooding, and several educational applications. This project focuses on building a concrete application combining simulation and maps, but, in this case, using mobile devices (smartphones). We will explore the development of an application combining simulation and maps. Winnie N. Ye Electronics Multi-touch screen technology A multi-touch screen is a display which can detect the presence and location of a touch or contact on a display area by a finger or fingers. This technology was first demonstrated in the early 1980s, and has since become commercially available. The main types of such technology include: resistive, capacitive, optical, dispersive signal, and frustrated total internal reflection (FTIR). This project will explore touchscreen technology based on light-matter interaction, and the student team will build a functioning novel multi-touch screen. Basic photonics background; requires simulation and experimental skills. The student or the student team will design a novel multi-touch screen, simulate its performance, and build a functioning screen. ZhiChen Building, Civil, and Environmental Engineering Concordia University Quebec An environmental multimedia modeling approach to manage complex pollution issues Environmental pollution problems with impacts on the multimedia environment have attracted a great deal of public attention. More action is required on formulating remediation strategies and setting effective standards and regulations for reducing toxic pollutants in the environment. To address these needs, comprehensively understanding and characterizing the behaviour of chemicals in the environment are essential tasks for environmental risk assessment and management. A new integrated environmental multimedia modelling system (FEMMS) is being developed, which consists of four modules: the polluting source (landfill) module, the unsaturated zone module, the groundwater module, and the air dispersion module. The FEMMS is designed to examine complex multimedia environmental problems. Additionally, the four multimedia modules of the developed approach will be embedded with a fuzzy-set method to handle system uncertainties. [1] Strong background in mathematics and biochemistry with profound computer skills. [2] Satisfactory written and spoken English. [1] Participate in site monitoring, including sampling and data analysis. [2] Participate in environmental multimedia model development based on background and interest. [3] Participate in results reporting, including publication. Yes RolfWuthrich Mechanical & Industrial Engineering Glass micro-machining by glow discharge electrolysis