Difference between pages "User:Maxblack45" and "Data Assimilation of High Dimensional, Nonlinear Dynamic Systems"

From REU@MU
='''Project Description'''=

'''Researcher:''' [[User:lnass|Louis Nass]] 

I am researching [http://reu.mscs.mu.edu/index.php/(i,j)-step_competition_graphs ''(i,j)-step competition graphs''] while attending the ten-week Marquette REU; my mentor is [http://www.marquette.edu/mscs/facstaff-factor.shtml Dr. Kim A.S. Factor]. Initially, I will do background reading with Carissa Babcock, and then we will branch into our own research questions. As my research becomes more personalized, I will add more to this description. My weekly log can be found below.

'''Mentor: '''[http://www.marquette.edu/mscs/facstaff-spiller.shtml Dr. Elaine Spiller]

='''Background Readings'''=

==General Overview==

A list of my background readings seemed helpful, so I have compiled one below. It includes the readings Dr. Factor gave me and the ones I found on my own. Note: if a link is missing, the reading is either from a textbook or I was unable to find a link.

[[Data Assimilation For Fluid Dynamic Models]]
  
*''BioMath: The Biology and Mathematics of Food Webs'' by Midge Cozzens, Nancy Crisler, Randi Rotjan, and Tom Fleetwood
*Chapter 3 from ''Applied Combinatorics'' by Fred S. Roberts

==Project Specifications==
During the spring semester and summer of 2017, Dr. Spiller and I have been working on data assimilation of the Lorenz '63 system. The Lorenz '63 system is a three-dimensional, nonlinear system whose solutions can be chaotic, depending on the initial conditions and the values of its parameters. I first wrote a MATLAB program to solve this system. From there I developed a Particle Filter and a Kalman Filter to assimilate the computed solutions with 'observations'. I created the 'noisy observations' from the computed solutions by adding normally distributed noise, with some chosen standard deviation, to each point of the solution set. Using the filters, I then observe how well the assimilated data match the truth and track the error as the noise's standard deviation and the filters' observation period change.
*[https://pdfs.semanticscholar.org/84cd/7fb860483f6ab2699ca8118de30bd20f2eb7.pdf ''(i,j) Competition Graphs''] by Kim A.S. Hefner, Kathryn F. Jones, Suh-Ryung Kim, J. Richard Lundgren, and Fred S. Roberts
*[http://www.sciencedirect.com/science/article/pii/S0166218X10003501 ''The (1,2)-step competition graph of a tournament''] by Kim A.S. Factor and Sarah K. Merz
*[https://www.researchgate.net/publication/266069208_An_introduction_to_12-domination_graphs ''An Introduction to (1,2)-Domination Graphs''] by Kim A.S. Factor
*[http://ac.els-cdn.com/S0166218X05001009/1-s2.0-S0166218X05001009-main.pdf?_tid=36995176-4701-11e7-b81d-00000aacb35e&acdnat=1496345799_24933243dc1d7943d7669b01ba4dcb39 ''The m-step, same-step, and any-step competition graphs''] by Wei Ho
*[http://ac.els-cdn.com/S0166218X04002008/1-s2.0-S0166218X04002008-main.pdf?_tid=4db26550-4701-11e7-9a29-00000aab0f02&acdnat=1496345838_0602db3093f1b3c7a0137620eb428303 ''Connected triangle-free m-step competition graphs''] by Geir T. Helleloid
*[http://ac.els-cdn.com/S0166218X00002146/1-s2.0-S0166218X00002146-main.pdf?_tid=8948cf6a-4700-11e7-af12-00000aab0f26&acdnat=1496345508_3370ea3bf71913b7bf5f67101e12bad1 ''The m-step competition graph of a digraph''] by Han Hyuk Cho, Suh-Ryung Kim, and Yunsun Nam
*[http://www.sciencedirect.com/science/article/pii/0166218X83900859 ''A Characterization of Competition Graphs''] by R.D. Dutton and R.C. Brigham
*[http://lab.rockefeller.edu/cohenje/assets/file/014.1CohenIntervalGraphsFoodWebsRAND1968.pdf ''Interval Graphs and Food Webs: A Finding and a Problem''] by Joel E. Cohen
*''Using Food Webs In Order To Determine Possible Predictors of Primary and Secondary Extinctions'' by Kaitlin A. Ryan (a master's thesis)
*[https://www.researchgate.net/publication/265807682_The_12-step_competition_number_of_a_graph ''The (1,2)-step competition number of a graph''] by Kim A.S. Factor, Sarah K. Merz, and Yoshio Sano
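The Lorenz '63 workflow described under Project Specifications (solve the system, generate noisy observations, then assimilate with a filter) can be sketched as follows. This is a minimal sketch in Python rather than the project's MATLAB code; the classic parameter values (sigma = 10, rho = 28, beta = 8/3), the initial condition, and the bootstrap-resampling particle update are assumptions, not details taken from the project.

```python
import numpy as np

def lorenz63(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz '63 system (classic parameter values assumed)."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_step(f, state, dt):
    """One fourth-order Runge-Kutta step."""
    k1 = f(state)
    k2 = f(state + 0.5 * dt * k1)
    k3 = f(state + 0.5 * dt * k2)
    k4 = f(state + dt * k3)
    return state + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

# Integrate a "truth" trajectory.
dt, n_steps = 0.01, 1000
truth = np.empty((n_steps + 1, 3))
truth[0] = [1.0, 1.0, 1.0]  # assumed initial condition
for k in range(n_steps):
    truth[k + 1] = rk4_step(lorenz63, truth[k], dt)

# Noisy observations: the truth plus mean-zero Gaussian noise with a chosen
# standard deviation, mirroring the 'noisy observations' described above.
rng = np.random.default_rng(0)
obs_sd = 1.0
observations = truth + rng.normal(0.0, obs_sd, size=truth.shape)

def pf_update(particles, obs, obs_sd, rng):
    """One bootstrap particle-filter analysis step (a sketch): weight each
    particle by its Gaussian likelihood under the observation, then resample."""
    sq_dist = np.sum((particles - obs) ** 2, axis=1)
    weights = np.exp(-0.5 * sq_dist / obs_sd ** 2)
    weights /= weights.sum()
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

# Example analysis step with 100 particles scattered around the initial truth.
particles = truth[0] + rng.normal(0.0, obs_sd, size=(100, 3))
particles = pf_update(particles, observations[0], obs_sd, rng)
```

Tracking the filter's error as `obs_sd` and the spacing between assimilated observations change reproduces the kind of experiment described above.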

='''Project Log'''=
The next system to be studied is the Lorenz '96 system, a higher-dimensional, nonlinear system. The Lorenz '96 system models the atmospheric behavior at equally spaced locations around a latitude circle and is commonly used as a testbed for weather-related forecasting. The goal is first to write a program that solves the system, and then to apply the filters and again observe their behavior, now in higher dimensions.
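A minimal sketch of the Lorenz '96 system in Python, ahead of that work: the dimension n = 40 and forcing F = 8 below are the standard textbook choices, assumed here rather than taken from the project.

```python
import numpy as np

def lorenz96(x, forcing=8.0):
    """Lorenz '96 right-hand side for equally spaced sites on a cycle:
    dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F, indices mod n."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def rk4_step(f, x, dt):
    """One fourth-order Runge-Kutta step."""
    k1 = f(x)
    k2 = f(x + 0.5 * dt * k1)
    k3 = f(x + 0.5 * dt * k2)
    k4 = f(x + dt * k3)
    return x + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

# Start at the equilibrium x_i = F with a small perturbation at one site,
# then integrate; the trajectory wanders chaotically but stays bounded.
n, dt = 40, 0.01
x = 8.0 * np.ones(n)
x[0] += 0.01
for _ in range(2000):
    x = rk4_step(lorenz96, x, dt)
```

The same filters used for Lorenz '63 can then be applied to this 40-dimensional state.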
 
=='''Week 1 (5/30/17-6/2/17)'''==

==='''Day 1 (5/30/17)'''===

*Attended MSCS REU orientation
*Met with my mentor, Dr. Kim A.S. Factor, and began discussing project details
*Dr. Factor and I set goals: finishing ''BioMath: The Biology and Mathematics of Food Webs'' by Thursday and reading ''The (1,2)-step competition graph of a tournament'' (by Kim A.S. Factor and Sarah Merz) by next Wednesday
*Completed pages 1 through 30 of ''BioMath: The Biology and Mathematics of Food Webs''. I plan to finish the packet tomorrow and to see how far I can get through chapter 3 of ''Applied Combinatorics'' by Fred S. Roberts

==='''Day 2 (5/31/17)'''===

*Attended library orientation
*Finished the BioMath packet
*Began reading ''The (1,2)-step competition graph of a tournament''
*Began reading chapter 3 of ''Applied Combinatorics''

==='''Day 3 (6/1/17)'''===

*Attended a talk on good research practices by Dr. Kim Factor
*Filled out a direct deposit form
*Met with Dr. Kim Factor, drafted the milestones, and discussed moving our lunch with Carissa to Wednesday
*Reviewed the milestones and compared them to the program's Google calendar
*Printed additional readings for the weekend
*Continued reading chapter 3 of ''Applied Combinatorics''
*Continued reading ''The (1,2)-step competition graph of a tournament''

==='''Day 4 (6/2/17)'''===

*Uploaded a pdf of my [http://reu.mscs.mu.edu/images/2/2a/Milestones.pdf milestones] for the summer
*Worked on chapter 3 of ''Applied Combinatorics''
*Continued reading ''The (1,2)-step competition graph of a tournament''
*Began reviewing other papers
*Created the project's Wiki [http://reu.mscs.mu.edu/index.php/(i,j)-step_competition_graphs page]

==='''Reflection: End of Week 1'''===

Concerning my research, this week was relatively uneventful. I spent most of my time finding literature (beyond what Dr. Factor gave me), working through background readings, and finding a good work area. Of those three tasks, finding a good work area took the most time (it was my top priority); I can happily say I completed it. I cannot wait to really dig into my research--to make it past the background reading.

I accomplished my goals for this week: I attended various meetings and orientations and finished the BioMath module, among other tasks. I intend to further my research this weekend. Specifically, I hope to read multiple papers and complete chapter 3 of ''Applied Combinatorics''. Next week, I'll be meeting Carissa on Tuesday, and we'll finally discuss the material Dr. Factor gave us; I'm excited.
=='''Week 2 (6/5/17-6/9/17)'''==

==='''Day 1 (6/5/17)'''===

*Finished reading chapter 3 of ''Applied Combinatorics''
*Created a LaTeX document of my notes on ''Applied Combinatorics''--a link to it is [http://reu.mscs.mu.edu/images/3/3c/Notes-applied-combinatorics.pdf here]; please note it is still in progress
*Created a LaTeX document for teaching Carissa TeX
*Configured the printer in room 410
*Finished formatting my personal wiki and began a rough draft of the project's wiki
*Confirmed a meeting with Carissa tomorrow, 6/6, at 8:30 AM, before the ethics training

==='''Day 2 (6/6/17)'''===

*Attended the meeting with Carissa at 8:00AM
*Attended ethical research training at 9:00AM
*Handed off ''Applied Combinatorics'' to Carissa for reading on graph theory
*Began reviewing ''The (1,2)-step competition graph'' with Carissa at 3:15PM
*Made a note to include an approximate total number of hours in the end-of-week reflection
*Left the building at 7:08PM, planning to be back in two hours. Didn't make it back to Cudahy; finished at the apartment. Estimate for the day is 13.5-14 hours

==='''Day 3 (6/7/17)'''===

*Got to Cudahy at 9:00AM and discussed various definitions with Carissa
*Lunch with Dr. Factor around 12
*Received ''The (1,2)-step competition number of a graph'' by Factor, Merz, and Sano for reading
*Began compiling examples with Carissa

==='''Day 4 (6/8/17)'''===

==='''Day 5 (6/9/17)'''===

==='''Reflection: End of Week 2'''===
=='''Week 3 (6/12/17-6/16/17)'''==

==='''Day 1 (6/12/17)'''===

==='''Day 2 (6/13/17)'''===

==='''Day 3 (6/14/17)'''===

==='''Day 4 (6/15/17)'''===

==='''Day 5 (6/16/17)'''===

==='''Reflection: End of Week 3'''===

=='''Week 4 (6/19/17-6/23/17)'''==

==='''Day 1 (6/19/17)'''===

==='''Day 2 (6/20/17)'''===

==='''Day 3 (6/21/17)'''===

==='''Day 4 (6/22/17)'''===

==='''Day 5 (6/23/17)'''===

==='''Reflection: End of Week 4'''===

=='''Week 5 (6/26/17-6/30/17)'''==

==='''Day 1 (6/26/17)'''===

==='''Day 2 (6/27/17)'''===

==='''Day 3 (6/28/17)'''===

==='''Day 4 (6/29/17)'''===

==='''Day 5 (6/30/17)'''===

==='''Reflection: End of Week 5'''===

=='''Week 6 (7/3/17-7/7/17)'''==

==='''Day 1 (7/3/17)'''===

==='''Day 2 (7/4/17)'''===

==='''Day 3 (7/5/17)'''===

==='''Day 4 (7/6/17)'''===

==='''Day 5 (7/7/17)'''===

==='''Reflection: End of Week 6'''===

=='''Week 7 (7/10/17-7/14/17)'''==

==='''Day 1 (7/10/17)'''===

==='''Day 2 (7/11/17)'''===

==='''Day 3 (7/12/17)'''===

==='''Day 4 (7/13/17)'''===

==='''Day 5 (7/14/17)'''===

==='''Reflection: End of Week 7'''===

=='''Week 8 (7/17/17-7/21/17)'''==

==='''Day 1 (7/17/17)'''===

==='''Day 2 (7/18/17)'''===

==='''Day 3 (7/19/17)'''===

==='''Day 4 (7/20/17)'''===

==='''Day 5 (7/21/17)'''===

==='''Reflection: End of Week 8'''===

=='''Week 9 (7/24/17-7/28/17)'''==

==='''Day 1 (7/24/17)'''===

==='''Day 2 (7/25/17)'''===

==='''Day 3 (7/26/17)'''===

==='''Day 4 (7/27/17)'''===

==='''Day 5 (7/28/17)'''===

==='''Reflection: End of Week 9'''===

=='''Week 10 (7/31/17-8/4/17)'''==

==='''Day 1 (7/31/17)'''===

==='''Day 2 (8/1/17)'''===

==='''Day 3 (8/2/17)'''===

==='''Day 4 (8/3/17)'''===

==='''Day 5 (8/4/17)'''===

==='''Reflection: End of Week 10'''===

Revision as of 20:01, 7 June 2017
