SIAM Undergraduate Research Online

Volume 6

Optimal Control in Discrete Pest Control Models

Published electronically January 18, 2013
DOI: 10.1137/11S011250

Author: Kathryn Dabbs (University of Tennessee)
Sponsor: Suzanne Lenhart (University of Tennessee)

Abstract: We use discrete time models to represent the dynamics of two interacting populations, a "valuable" population and a "pest" population. We investigate optimal control in the form of decreasing the growth rate of the "pest" population with the goal of maximizing the "valuable" population while minimizing the cost of the control. We compare different types of growth functions for the "valuable" population and their impact on the optimal control.
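
As a rough illustration of the setup described in this abstract, the sketch below forward-simulates a discrete two-population model in which a control u[t] in [0, 1] reduces the pest growth rate and the objective trades the valuable population against a quadratic control cost. The logistic-type growth and interaction terms, the parameter values, and the function simulate are illustrative assumptions, not the models compared in the paper.

    # Minimal forward simulation of a discrete pest-control model (illustrative
    # growth and interaction terms only, not the paper's models).
    import numpy as np

    def simulate(u, T=20, v0=0.5, p0=0.2, r_v=1.2, r_p=1.5, a=0.4, cost=0.1):
        """Valuable (v) and pest (p) populations under a control u[t] in [0, 1]."""
        v, p = np.empty(T + 1), np.empty(T + 1)
        v[0], p[0] = v0, p0
        for t in range(T):
            # Logistic-type growth for the valuable population, reduced by the pest.
            v[t + 1] = r_v * v[t] * (1.0 - v[t]) - a * v[t] * p[t]
            # The control u[t] scales down the pest growth rate.
            p[t + 1] = (1.0 - u[t]) * r_p * p[t] * (1.0 - p[t])
        # Objective: total valuable population minus a quadratic cost of control.
        J = v.sum() - cost * np.sum(np.asarray(u) ** 2)
        return v, p, J

    # Compare no control with a constant moderate control.
    _, _, J_off = simulate(np.zeros(20))
    _, _, J_on = simulate(0.5 * np.ones(20))
    print(J_off, J_on)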

Assessment of Statistical Methods for Water Quality Monitoring in Maryland's Tidal Waterways

Published electronically April 17, 2013
DOI: 10.1137/12S012070

Authors: Rosemary K. Le (Brown University), Christopher V. Rackauckas (Oberlin College), Anne S. Ross (Colorado State University), and Nehemias Ulloa (California State University, Bakersfield)
Sponsor: Matthias K. Gobbert (University of Maryland, Baltimore County)    

Abstract: The Chesapeake Bay and its surrounding tributaries are home to over 3,600 species of plants and animals. In order to assess the health of the region, the Maryland Department of Natural Resources (DNR) monitors various parameters, such as dissolved oxygen, with monitoring stations located throughout the tidal waterways. Utilizing data provided by DNR, we assessed the waterways for areas of water quality concern. We analyzed the percentage of the readings taken for each parameter that failed to meet the threshold values and used the Wilcoxon Signed-Rank Test to determine the statuses of the stations. In order to assess the applicability of the Wilcoxon Test given the positive skew in the data, a simulation was performed. This simulation demonstrated that log-transforming the data prior to performing the Wilcoxon Test was not enough to reduce the Type I Error to reasonable levels. Thus, our team developed a relative ranking using a set of multiple comparison methods: a version of the Tukey Test on variance-transformed proportions, the Bonferroni adjustment method, a Bayesian method, and the Benjamini-Hochberg rejection method. From the ranking results we identified when each ranking technique is most applicable to our data.
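
A rough sketch of the kind of Type I error simulation described above: draw positively skewed samples whose true median equals the threshold (so the null hypothesis holds), apply the Wilcoxon signed-rank test to the raw and log-transformed data, and count rejections at the 0.05 level. The gamma distribution, sample size, and threshold are illustrative placeholders, not the DNR data or the study's simulation design.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    alpha, n, trials = 0.05, 30, 2000
    shape = 2.0
    threshold = stats.gamma.median(shape)      # null: median reading equals the threshold

    reject_raw = reject_log = 0
    for _ in range(trials):
        x = rng.gamma(shape, size=n)           # skewed "readings" with the null true
        # Wilcoxon signed-rank test of symmetry about the threshold.
        if stats.wilcoxon(x - threshold).pvalue < alpha:
            reject_raw += 1
        if stats.wilcoxon(np.log(x) - np.log(threshold)).pvalue < alpha:
            reject_log += 1

    print("estimated Type I error, raw data:        ", reject_raw / trials)
    print("estimated Type I error, log-transformed: ", reject_log / trials)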

Modeling Learning and Cooperation in Iterative Games

Published electronically May 8, 2013
DOI: 10.1137/12S011866

Authors: Aleksey Chernobelskiy, Vineet Dixit, Agostino Cala, Siddharth Pandya, and Hector Javier Rosas (University of Arizona)
Sponsor: Scott Hottovy (University of Arizona)

Abstract: In this paper, we describe the general framework of neural networks and how such frameworks can be adapted to model human game play and the learning that takes place during iterative games. We introduce a method of pre-processing game matrices in an effort to produce cooperative strategies in games with non-cooperative dominant strategies, such as the Nash Equilibrium solution to the Prisoner's Dilemma. We find that introducing the pre-processed matrix significantly increases the probability that the network plays a cooperative strategy compared with the network's behavior without pre-processing.
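
For readers unfamiliar with the game-theoretic terminology, the snippet below checks that defection strictly dominates cooperation in a standard Prisoner's Dilemma payoff matrix, which is why mutual defection is the Nash equilibrium mentioned above. The payoff values are the usual textbook choices; the paper's neural network and matrix pre-processing step are not reproduced here.

    import numpy as np

    # Row player's payoffs; rows are my strategy (0 = cooperate, 1 = defect),
    # columns are the opponent's strategy.
    payoff = np.array([[3, 0],    # cooperate: 3 vs. a cooperator, 0 vs. a defector
                       [5, 1]])   # defect:    5 vs. a cooperator, 1 vs. a defector

    # Defection strictly dominates cooperation if it pays more against every
    # opponent strategy, making (defect, defect) the unique Nash equilibrium.
    print("defection strictly dominates:", bool(np.all(payoff[1] > payoff[0])))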

Finding a Needle in a Haystack: An Image Processing Approach

Published electronically May 28, 2013
DOI: 10.1137/12S0119008

Author: Emily Beylerian (University of California, Los Angeles)
Sponsor: Hayden Schaeffer (University of California, Los Angeles)

Abstract: Image segmentation (also known as object/edge detection) is the process of dividing an image into its constituent parts using information about the boundaries between objects, edges within objects, variations in intensity, et cetera. Often, the human eye can easily recognize salient information in an image; however, background variations in intensity, noise and other degradations, and highly oscillatory features make the process of image segmentation challenging. This work is unique because we propose using a cartoon-texture-noise separation to remove highly oscillatory features from the image prior to segmentation. The cartoon and texture components can be used to analyze important information from the original image; specifically, by applying a segmentation algorithm to the cartoon component, we can extract objects from the original image. A new numerical implementation is provided for one of the two decompositions used, as well as various experimental results. The method is applied to the classic example of finding a needle in a haystack, as well as to real images where the texture component and noise cause problems for standard techniques.
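
As a rough stand-in for the approach described above, the sketch below treats a total-variation denoised image as the "cartoon" component, the residual as texture plus noise, and segments the cartoon with a simple threshold. It uses off-the-shelf scikit-image routines and is not the decomposition or segmentation model developed in the paper.

    import numpy as np
    from skimage import data, restoration, filters

    image = data.camera() / 255.0                       # any grayscale test image
    cartoon = restoration.denoise_tv_chambolle(image, weight=0.2)
    texture_plus_noise = image - cartoon                # highly oscillatory residual

    # Segment the cartoon component instead of the raw image.
    mask = cartoon > filters.threshold_otsu(cartoon)
    print(mask.shape, mask.mean())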

Sensitivity to Noise in Particle Filters for 2-D Tracking Algorithms

Published electronically May 28, 2013
DOI: 10.1137/12S012136

Authors: Dong-Hyeon Park, Stephanie Porter, and Sarah Warkentin (Harvey Mudd College)
Sponsors: Erin Byrne and Rachel Levy (Harvey Mudd College)   

Abstract: A particle filter algorithm was used to simulate a Remotely Operated underwater Vehicle (ROV) tracking a moving target in 2-D space. The simulation modeled the behavior of a Sea Perch ROV modified with mounted cameras to perform blob-tracking on the target. Thirteen different noise levels were sampled for both distance and angle, with 100 trials per noise level. The angular noise demonstrated an exponential effect on the performance of the particle filter algorithm, while distance noise had minimal impact on the accuracy of the tracking.
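
A bare-bones bootstrap particle filter in the spirit of the simulation above: a target moves in the plane, range and bearing measurements are corrupted by Gaussian noise, and particles are weighted and resampled each step. The motion model, noise levels, and particle count are illustrative choices, not the Sea Perch simulation parameters.

    import numpy as np

    rng = np.random.default_rng(1)
    n_particles, steps = 500, 50
    sigma_r, sigma_theta = 0.1, 0.05          # measurement noise (distance, angle)

    truth = np.array([0.0, 0.0])
    velocity = np.array([0.1, 0.05])
    particles = rng.normal(truth, 0.5, size=(n_particles, 2))

    for _ in range(steps):
        truth = truth + velocity
        # Noisy range-bearing measurement of the target, taken from the origin.
        r = np.hypot(truth[0], truth[1]) + rng.normal(0, sigma_r)
        theta = np.arctan2(truth[1], truth[0]) + rng.normal(0, sigma_theta)

        # Predict: propagate particles with the motion model plus process noise.
        particles += velocity + rng.normal(0, 0.05, size=particles.shape)

        # Update: weight particles by the likelihood of the measurement.
        pr = np.hypot(particles[:, 0], particles[:, 1])
        ptheta = np.arctan2(particles[:, 1], particles[:, 0])
        w = (np.exp(-0.5 * ((r - pr) / sigma_r) ** 2)
             * np.exp(-0.5 * ((theta - ptheta) / sigma_theta) ** 2))
        w /= w.sum()

        # Resample (multinomial) so high-weight particles survive.
        particles = particles[rng.choice(n_particles, n_particles, p=w)]

    print("truth:", truth, "estimate:", particles.mean(axis=0))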

The Effects of Spatial and Temporal Grids on Simulations of Thin Films with Surfactant

Published electronically May 30, 2013
DOI: 10.1137/12S011878

Authors: Greg Kronmiller, Eric Autry, and Celeste Conti (Harvey Mudd College)
Sponsor: Rachel Levy (Harvey Mudd College)

Abstract: In this work, we investigated a numerical solver that combines Alternating Direction Implicit (ADI) methods with CLAWPACK to address mixed-type equations, such as the parabolic-hyperbolic system of PDEs describing surfactant spreading on a thin liquid film. In particular, we probed the effects of the spatial and temporal grid on the results of the simulations. Spatial grid effects were studied by rotating a controlled set of initial conditions relative to the grid, while temporal grid effects were studied by varying the time step and spatial resolution.
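
The solver studied here couples ADI with CLAWPACK for a coupled parabolic-hyperbolic system; as background only, the sketch below shows a generic Peaceman-Rachford ADI step for the 2-D heat equation with zero Dirichlet boundaries, which illustrates the alternating implicit sweeps in x and y.

    import numpy as np

    def adi_step(u, dt, dx, nu=1.0):
        """One Peaceman-Rachford ADI step for u_t = nu*(u_xx + u_yy), zero Dirichlet BC.
        u is an (n+2) x (n+2) array whose boundary entries are zero."""
        U = u[1:-1, 1:-1]
        n = U.shape[0]
        r = nu * dt / (2.0 * dx ** 2)
        lap = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
               + np.diag(np.ones(n - 1), -1))      # 1-D second-difference operator
        A = np.eye(n) - r * lap                    # implicit half-step operator
        B = np.eye(n) + r * lap                    # explicit half-step operator

        Ustar = np.linalg.solve(A, U @ B)               # implicit in x, explicit in y
        Unew = np.linalg.solve(A.T, (B @ Ustar).T).T    # implicit in y, explicit in x
        out = np.zeros_like(u)
        out[1:-1, 1:-1] = Unew
        return out

    # Example: diffuse a point bump on a 64 x 64 interior grid.
    N, dx, dt = 64, 1.0 / 65, 1e-4
    u = np.zeros((N + 2, N + 2))
    u[N // 2, N // 2] = 1.0
    for _ in range(100):
        u = adi_step(u, dt, dx)
    print(u.max())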

Spread of a Rumor

Published electronically May 30, 2013
DOI: 10.1137/12S011829

Authors: Nickolas Fedewa, Emily Krause, and Alexandra Sisson (Central Michigan University)
Sponsor: James Angelos (Central Michigan University)

Abstract: Modelling the random spread of a rumor has a long history. In this article we consider a random process that is based on sampling without replacement, leading to the use of the discrete hypergeometric distribution. First considered is the model with only spreaders and ignorants, followed by more general models where there are spreaders, ignorants, and stiflers. In this case a multivariate hypergeometric model is applied. It is shown that, as in the traditional case, not all ignorants hear the rumor.
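
For a quick feel for the qualitative result, the simulation below runs a classic spreader/ignorant/stifler rumor process (the Maki-Thompson variant, in which a spreader who contacts a non-ignorant becomes a stifler) and reports the fraction of the population that never hears the rumor. The paper's hypergeometric formulation draws contacts without replacement and differs from this simple variant.

    import numpy as np

    rng = np.random.default_rng(3)
    N = 1000
    # 0 = ignorant, 1 = spreader, 2 = stifler
    state = np.zeros(N, dtype=int)
    state[0] = 1

    while (state == 1).any():
        spreaders = np.flatnonzero(state == 1)
        s = rng.choice(spreaders)
        # Contact one other individual uniformly at random (no self-contact).
        other = rng.integers(N - 1)
        other += other >= s
        if state[other] == 0:
            state[other] = 1          # an ignorant hears the rumor and spreads it
        else:
            state[s] = 2              # spreader meets a spreader/stifler and stops

    print("fraction who never heard the rumor:", (state == 0).mean())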

Waste Not, Want Not: Putting Recyclables in Their Place

Published electronically June 5, 2013
DOI: 10.1137/13S012509
M3 Challenge Introduction

Authors: Jenny Lai, Abram Sanderson, Amy Xiong, Lynn Zhang, and Roy Zhao (Wayzata High School, Plymouth, MN)
Sponsor: Thomas Kilkelly (Wayzata High School, Plymouth, MN)

Summary: The increased usage of plastic, paper, and other recyclable materials, due to convenience and efficiency, has not been matched by available recycling methods. These readily disposable goods have replaced reusable products such as glassware, resulting in landfills inundated by wastes—such as plastic and Styrofoam—that are not biodegradable (Rogers). While the immense consumption of plastics is harsh on the environment, these synthetic polymers are too integrated in modern-day society to be suspended or discontinued. How might we reconcile the use of these goods with cost-efficient recycling methods for every state and township in the United States?

Our team has been asked to predict the production rate of plastic waste over time, and to forecast the amount of plastic waste present in landfills in ten years. To begin, we assumed that while an increase in population over the next ten years will increase plastic waste output, there is a limit on the total amount of plastic generated that is discarded. Thus our model for the production rate of plastic is sigmoidal in nature, with a carrying capacity (maximum amount of plastic discarded) of 30,000 tons/year. By integrating our sigmoid function, we predicted the amount of plastic waste present in landfills in 2023 to be 1,026,000 tons.
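
A sketch of the model structure described in this paragraph: a logistic (sigmoidal) production rate with the stated carrying capacity of 30,000 tons/year, integrated over ten years and added to an existing landfill amount. The growth rate, midpoint, and initial landfill amount below are placeholders, not the team's fitted values, so the printed total will not match the 1,026,000-ton prediction.

    import numpy as np
    from scipy.integrate import quad

    K = 30_000.0                     # carrying capacity, tons/year (from the summary)
    b, t_mid = 0.25, 5.0             # placeholder growth rate and midpoint (years)
    initial_landfill = 500_000.0     # placeholder tons already in landfills

    def production_rate(t):
        """Logistic production rate of plastic waste, tons/year, t in years from now."""
        return K / (1.0 + np.exp(-b * (t - t_mid)))

    added, _ = quad(production_rate, 0.0, 10.0)   # tons added over the next decade
    print("illustrative landfill total after ten years:", initial_landfill + added, "tons")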

We were also asked to design a mathematical model that could determine which recycling method is most appropriate for a city, and to apply it to Fargo, ND; Price, UT; and Wichita, KS. Our approach began with the assumptions that geographic location has a negligible impact on the recycling rate for each method of recycling; each city will have at least one recycling facility; the use by citizens of drop-off and curbside pickup recycling is mutually exclusive; people will recycle in the correct manner; every household has recyclable wastes; and cities may be modeled as circles. Thus our first model considered the probability that a person would recycle at a drop-off center based on distance to the center. Our second model then determined the costs of collecting and operating curbside pickup, taking into account the area, population density, and total household units of each city. Analysis led to the conclusion that Price, UT should employ drop-off recycling only, while Fargo, ND and Wichita, KS should employ curbside pickup as the most cost-efficient method.

On a national scale, we must report to the EPA how our model can lead to a municipal recycling guideline policy to govern all states and townships in the United States in an effort to mitigate the problem of recyclables not being recycled. Our model is best applied to cities and townships, as the factors considered (population, area, and household density) are specified at the city and township level. Furthermore, our model should not be used at the state level, as states include cities and townships of varying sizes and development, including rural and urban regions. We conducted a cost-benefit analysis of each recycling method based on city population and area. Based on our analysis, we determined that it is more cost-efficient for cities with relatively small populations to adopt drop-off recycling only, while cities with larger populations should adopt curbside pickup recycling. Therefore we recommend that the EPA allow each municipality to determine its own recycling method based on our mathematical model, because the variables involved in the costs of recycling are unique to each municipality. However, as a general standard, the EPA should require all cities and townships, beginning in 2016, to recycle by the method best suited to them, in order to put recyclables in their place so that future generations are not left to deal with a world wasted away.

Computing Complex Singularities of Differential Equations with Chebfun

Published electronically June 24, 2013
DOI: 10.1137/12S011520

Author: Marcus Webb (University of Cambridge)
Sponsor: Lloyd N. Trefethen (Oxford University)

Abstract: Given a solution to an ordinary differential equation (ODE) on a time interval, the solution for complex-valued time may be of interest, in particular whether the solution is singular at some complex time value. How can the solution be approximated in the complex plane using only the data on the interval? A polynomial approximation of the solution always fails to capture singularities; to extrapolate solutions with singularities, approximation with rational functions is more appropriate. In this paper, a robust form of rational interpolation and least-squares approximation, due to Pachón, Gonnet et al., is discussed and tested. It is found that the method avoids the issue of spurious poles found by many standard rational approximations, but that it is not suitable when a high degree of accuracy is required.
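
A rough, non-robust stand-in for the idea described above: fit a rational approximation p/q to samples of a function on [-1, 1] by linearized least squares and read estimated singularities off the roots of the denominator. The robust method of Pachón, Gonnet et al. tested in the paper additionally removes spurious poles via an SVD-based reduction, which is not done here; the test function and degrees below are illustrative.

    import numpy as np

    f = lambda x: 1.0 / (1.0 + 25.0 * x ** 2)    # true singularities at +/- 0.2i
    m, n = 2, 2                                  # numerator and denominator degrees
    x = np.linspace(-1.0, 1.0, 200)
    y = f(x)

    # Linearized conditions p(x_j) - y_j q(x_j) = 0, with q normalized by q(0) = 1,
    # solved in the least-squares sense.
    A = np.hstack([x[:, None] ** np.arange(m + 1),
                   -y[:, None] * x[:, None] ** np.arange(1, n + 1)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    q = np.concatenate([[1.0], coef[m + 1:]])    # denominator coefficients, low to high
    print(np.roots(q[::-1]))                     # estimated singularities, near +/- 0.2j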

A Numerical Implementation of the Space-Time Finite Elements Method for the 1+1 Klein-Gordon Equation

Published electronically July 15, 2013
DOI: 10.1137/13S012315

Author: Hyun Lim (South Dakota State University)
Sponsor: Jung-Han Kimn (South Dakota State University)

Abstract: We have implemented a fully implicit numerical approach based on space-time finite element methods for the Klein-Gordon equation in the 1 (space) + 1 (time) dimension. The purpose of this paper is to present a stable and parallelizable numerical method. The proposed numerical method is applied to generate successful simulation results of spin-0 particle propagation in a charged scalar field. The time additive Schwarz method is vital for successful simulations with KSP (Krylov Subspace Methods) solvers. The time-parallelizable algorithm is implemented through PETSc (Portable, Extensible Toolkit for Scientific Computation, developed by Argonne National Laboratory).
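
For reference, the free Klein-Gordon equation in 1 (space) + 1 (time) dimensions, in units with c = hbar = 1, is

    \partial_t^2 u(x,t) - \partial_x^2 u(x,t) + m^2\, u(x,t) = 0,

which the paper extends with a coupling to a charged scalar field; a space-time finite element method discretizes x and t together on a single mesh rather than marching forward in time.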

Thermal Detection of Inaccessible Corrosion

Published electronically July 25, 2013
DOI: 10.1137/13S012157

Authors: Matthew Charnley (University of Notre Dame) and Andrew Rzeznik (Cornell University)
Sponsor: Kurt Bryan (Rose-Hulman Institute of Technology)

Abstract: In this paper, we explore the mathematical inverse problem of detecting corroded material on the reverse side of a partially accessible metal plate. We provide a novel formulation of the two-dimensional problem using a heat source as the detection method, developing a numerical method for performing these reconstructions. The reconstruction is performed via integration against test functions, and we will show how a linearization can be used to simplify the initial problem and explain a regularization method used to obtain acceptable results for the corrosion profile. Results will be shown for a variety of corrosion profiles and system thermal parameters and error will be quantified for each case. It is shown that the reconstruction of small corrosion profiles is within a reasonable amount of error for real-world applications (20%), and while larger corrosion is less accurately reconstructed, this method will allow for detection of corrosion of any size. Possibilities for future work are also outlined, including the definition of regularization parameters and extending the procedure to different domains.
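
The reconstruction above regularizes a linearized problem; as a generic illustration of that step only, the sketch below applies Tikhonov regularization to a toy ill-conditioned linear system A c = b. The smoothing-kernel matrix, noise level, and regularization parameter are arbitrary placeholders, not the paper's test-function formulation.

    import numpy as np

    rng = np.random.default_rng(4)
    n = 50
    # Toy ill-conditioned forward operator: a discrete Gaussian smoothing kernel.
    s, t = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n))
    A = np.exp(-(s - t) ** 2 / 0.01) / n

    c_true = np.sin(2 * np.pi * np.linspace(0, 1, n))   # stand-in "corrosion profile"
    b = A @ c_true + 1e-3 * rng.normal(size=n)          # noisy data

    lam = 1e-3                                          # Tikhonov regularization parameter
    c_reg = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

    print("condition number of A:", np.linalg.cond(A))
    print("relative error:", np.linalg.norm(c_reg - c_true) / np.linalg.norm(c_true))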

Modeling Atmospheric Carbon Dioxide over the United States

Published electronically September 3, 2013
DOI: 10.1137/13S012352

Author: Wesley Long (LaGrange College)
Sponsor: Jon Ernstberger (LaGrange College)

Abstract: In this undergraduate research project, we implement a published, component-based model of the total carbon dioxide concentration in the atmosphere above the United States. To do so we use a simple ordinary differential equation to quantify the behavior of the carbon cycle (as described in published NOAA findings), compute curves of best fit to approximate CO2 contributions due to emissions from fossil fuels and forest fires, and employ Henry's law to estimate the overall oceanic CO2 absorption effect. Finally, via residual comparison, model parameters are estimated to determine a model approximation.
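
A sketch of the curve-fitting step mentioned above: fit an exponential trend to annual emissions with scipy.optimize.curve_fit. The data points are synthetic placeholders, not NOAA figures, and the exponential form is only one reasonable choice of fitting curve.

    import numpy as np
    from scipy.optimize import curve_fit

    years = np.arange(2000, 2011)
    # Synthetic placeholder emissions series with a slow upward trend.
    emissions = (5.9 * 1.01 ** (years - 2000)
                 + 0.02 * np.random.default_rng(5).normal(size=years.size))

    def model(t, a, r):
        """Exponential emissions trend a * (1 + r)^(t - 2000)."""
        return a * (1.0 + r) ** (t - 2000)

    (a_fit, r_fit), _ = curve_fit(model, years, emissions, p0=(6.0, 0.01))
    print(f"fitted baseline {a_fit:.2f}, growth rate {r_fit:.3%} per year")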

Modeling the Effects of Malaria Preventative Measures

Published electronically September 4, 2013
DOI: 10.1137/12S011805

Author: Monroe Griffin (Wofford College)
Sponsor: Anne Catlla (Wofford College)

Abstract: Malaria is a serious and sometimes fatal disease affecting nearly half of the world's population and is the 5th leading cause of death by infectious disease worldwide. There are currently two recommended methods for prevention and eradication of the disease: insecticide-treated mosquito nets (ITNs) and indoor residual spraying (IRS), but efficacies and compliance vary from region to region. In this project, we look at the effects of ITNs and IRS as methods for eradication of malaria. To compare these methods, we develop a differential equation model and apply the next generation matrix method to determine the basic reproductive number. The differential equation model builds on classical SIR epidemiological models, with added constraints for the two preventative measures. Analysis shows that the effects of ITNs and IRS can help eradicate the disease. We find that the effect of ITNs is significantly greater than the effect of IRS. We conclude that the combination of compliance and efficacy for ITNs needs to be at least 61% and that there is no such percentage for IRS alone that will eradicate the disease. At a minimum, in combination, compliance and efficacy for ITNs needs to be at least 60% and compliance and efficacy for IRS needs to be at least 60%.
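
As a generic illustration of the next generation matrix method mentioned above, the sketch below computes the basic reproductive number R0 as the spectral radius of F V^{-1} for a simple host-vector model. The transmission and recovery rates are placeholders, and the ITN and IRS terms from the paper's model are omitted.

    import numpy as np

    beta_hv, beta_vh = 0.3, 0.25    # mosquito-to-human and human-to-mosquito transmission
    gamma_h, mu_v = 0.1, 0.14       # human recovery rate, mosquito death rate

    F = np.array([[0.0, beta_hv],   # new infections into each infected compartment
                  [beta_vh, 0.0]])
    V = np.diag([gamma_h, mu_v])    # transitions out of the infected compartments

    R0 = max(abs(np.linalg.eigvals(F @ np.linalg.inv(V))))
    print("R0 =", R0)               # the disease dies out when R0 < 1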

Invasion Fronts and Pattern Formation in a Model of Chemotaxis in One and Two Dimensions

Published electronically September 5, 2013
DOI: 10.1137/12S012008

Authors: Koushiki Bose (Brown University), Tyler Cox (Georgia Institute of Technology), Stefano Silvestri (Boston University), and Patrick Varin (Franklin W. Olin College of Engineering)
Sponsor: Matt Holzer (University of Minnesota)

Abstract: The purpose of this paper is to explore spatio-temporal pattern formation via invasion fronts in the one and two dimensional Keller-Segel chemotaxis model. In the one-dimensional case, simulations show that solutions that begin near an unstable equilibrium evolve into periodic patterns. These in turn evolve into new patterns through a process known as coarsening. In the two-dimensional case, we encounter only periodic patterns in the wake of the initial front. Transverse patterning only arises as a result of a transverse instability of these periodic patterns from the leading invasion front.
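
For reference, a commonly studied minimal form of the Keller-Segel model, with u the cell density, v the chemoattractant concentration, and \chi the chemotactic sensitivity, is

    u_t = \nabla \cdot (\nabla u - \chi\, u \nabla v),
    v_t = \Delta v + u - v;

the nondimensionalization and parameter regime studied in the paper may differ from this minimal version.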

The Geometry of the Narayana Fractal

Published electronically October 25, 2013
DOI: 10.1137/13S012303

Authors: Jack Farnsworth, Rahul Isaac, and Stella Watson (Furman University)
Sponsor: Thomas Michael Lewis (Furman University)

Abstract: This paper examines the fractal nature of the Narayana fractal, the set $\mathcal{N} = \{(i,j) \in \mathbb{N} \times \mathbb{N} : N(i+j+1,\, j+1) \equiv 1 \pmod{2}\}$, where $N(n,k) = \frac{1}{n}\binom{n}{k}\binom{n}{k-1}$ denotes the Narayana numbers. This object closely resembles a fractal derived from Pascal's triangle. This similarity is used to prove that the Hausdorff dimension of the Narayana fractal is $\log 3 / \log 2$ and that the limit of the Narayana fractal converges to the union of Sierpinski's gasket with one additional point.
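
A small sketch, assuming the standard Narayana numbers N(n,k) = (1/n) C(n,k) C(n,k-1), that tabulates the parity pattern behind the fractal: points where N(i+j+1, j+1) is odd are marked, and the Sierpinski-like structure is already visible at small sizes.

    from math import comb

    def narayana(n, k):
        """Narayana number N(n, k) for 1 <= k <= n (always an integer)."""
        return comb(n, k) * comb(n, k - 1) // n

    size = 16
    for i in range(size):
        row = (narayana(i + j + 1, j + 1) % 2 for j in range(size))
        print("".join("#" if odd else "." for odd in row))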

Feature Identification for Colon Tumor Classification

Published electronically October 28, 2013
DOI: 10.1137/13S012212

Authors: Melody Lim, Anthony Hou, Natalie Congdon, and Janine Chua (UC Irvine)
Sponsors: Dr. Fred Park, Dr. Ernie Esser, and Anna Konstorum (UC Irvine)

Abstract: Hepatocyte Growth Factor (HGF) has been shown to be increased in the tumor microenvironment due to increased secretion by cancer-associated stromal cells. Qualitatively, high extracellular HGF has been correlated with increased growth and dispersiveness of a tumor. In this study, we develop quantitative methods to measure HGF-induced tumor growth and dispersion. Using image processing and machine learning techniques, we effectively classify images of colon cancer tumor spheroids cultured in +/-HGF conditions. Our goals are to define features that are effective for classification and to further help biologists quantify the effect of HGF on tumor spheroids.
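
A rough sketch of the feature-extraction step: threshold a grayscale spheroid image, label connected components, and measure simple shape descriptors with scikit-image. The image below is synthetic (a disk with one satellite blob), and the features and classifier actually used in the paper are not reproduced.

    import numpy as np
    from skimage import draw, filters, measure

    # Synthetic "spheroid": a bright disk with a satellite blob on a dark field.
    image = np.zeros((200, 200))
    rr, cc = draw.disk((100, 100), 40)
    image[rr, cc] = 1.0
    rr, cc = draw.disk((100, 160), 8)
    image[rr, cc] = 1.0
    image += 0.05 * np.random.default_rng(6).normal(size=image.shape)

    mask = image > filters.threshold_otsu(image)
    labels = measure.label(mask)
    for region in measure.regionprops(labels):
        # Candidate features for classification: size and shape descriptors.
        print(region.label, region.area, region.eccentricity, region.solidity)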

Contextual Point Matching for Video Stabilization

Published electronically December 30, 2013
DOI: 10.1137/13S012285

Authors: Ling Han Meng, Joseph Geumlek, Holly Chu, and Justin Hoogenstyrd (University of California, Irvine)
Sponsor: Ernie Esser (University of California, Irvine)

Abstract: We explore the potential of applying a contextual shape matching algorithm to the domain of video stabilization. This method is a natural fit for finding the point correspondences between subsequent frames in a video. By using global contextual information, this method outperforms methods which only consider local features in cases where the shapes involved have high degrees of self-similarity, or change in appearance significantly between frames while maintaining a similar overall shape. Furthermore, this method can also be modified to account for rotationally invariant data and low frame rate videos. Though computationally intensive, we found it to provide better results than existing methods without significantly increasing computational costs.
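
A condensed sketch of contextual (shape context) point matching between two frames: each point gets a log-polar histogram of the relative positions of all other points, histograms are compared with a chi-squared cost, and the assignment is solved with the Hungarian algorithm. The bin counts and the translation-only test case are illustrative; rotation invariance and the full video-stabilization pipeline from the paper are not included.

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def shape_contexts(pts, n_r=5, n_theta=12):
        """Log-polar histograms of relative point positions (one row per point)."""
        d = pts[:, None, :] - pts[None, :, :]             # pairwise displacement vectors
        dist = np.linalg.norm(d, axis=-1)
        angle = np.arctan2(d[..., 1], d[..., 0])
        mean_dist = dist[dist > 0].mean()                 # scale normalization
        r_edges = np.logspace(np.log10(0.125), np.log10(2.0), n_r + 1) * mean_dist
        hists = np.zeros((len(pts), n_r * n_theta))
        for i in range(len(pts)):
            mask = dist[i] > 0
            r_bin = np.clip(np.digitize(dist[i, mask], r_edges) - 1, 0, n_r - 1)
            t_bin = ((angle[i, mask] + np.pi) / (2 * np.pi) * n_theta).astype(int) % n_theta
            np.add.at(hists[i], r_bin * n_theta + t_bin, 1.0)
        return hists / hists.sum(axis=1, keepdims=True)

    def match(pts_a, pts_b):
        """Match points of frame A to points of frame B via chi-squared histogram cost."""
        ha, hb = shape_contexts(pts_a), shape_contexts(pts_b)
        num = (ha[:, None, :] - hb[None, :, :]) ** 2
        den = ha[:, None, :] + hb[None, :, :] + 1e-12
        cost = 0.5 * (num / den).sum(axis=-1)
        return linear_sum_assignment(cost)

    # Example: the same point cloud, shifted and shuffled, should match correctly.
    rng = np.random.default_rng(7)
    frame_a = rng.random((30, 2))
    perm = rng.permutation(30)
    frame_b = frame_a[perm] + 0.05                        # translated "next frame"
    rows, cols = match(frame_a, frame_b)
    print("correct matches:", np.sum(perm[cols] == rows), "of 30")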