EGR 103/Spring 2020/Lab 7


The following document is meant as an outline of what is covered in this assignment.

Preparing For Lab

Note: Going forward, I will refer to the Sakai / Resources Folder / ChapraPythonified / Chapter 14 folder as "CP14"

  • Read the "Part Four" introduction in Chapra, pp. 343-345. In section 4.2, note that we have talked / will talk about Chapters 14-18 but we will skip Chapter 16.
  • Read the introduction to Chapter 14.
  • Read 14.1
    • The Python version of the code in 14.1.3 is in CP14 and named Chapra_14_1_3.py. You will need the Table14p03.txt data file for it to work. Notice in particular that Numpy's variance and standard deviation functions require a keyword argument to match MATLAB's default cases.
  • We have covered the Python versions of 14.2. The Python versions of Examples 14.2 and 14.3 are in CP14 and named Chapra_ex_14_2.py and Chapra_ex_14_3.py.
  • Skim 14.3. We did the mathematical proof in class. See General_Linear_Regression for specific and general versions but do not get too hung up on the math.
  • Skip or skim 14.4. Linearization is an important concept but outside of the scope of this lab. Figure 14.13 would be a good one to look at though as it shows how plotting transformed variables in different ways might lead to an understanding of the underlying mathematical relationship.
  • Skip 14.5.1. The linregr function is a way-too-specific version of what we will eventually use.
  • Read 14.5.2. Note that the differences between Python and MATLAB here will be that we need to set up arrays differently (MATLAB can just use bracketed numbers separated by spaces; Python needs np.array([LIST]) and the np. prefix in front of certain commands).
  • Skip or skim the case study.
  • Go to Python:Fitting and specifically:
    • Read the intro and note that to do the demos you will need a Cantilever.dat file with the contents shown
    • Take a look at the common command references
    • Look through the common code; for this lab there will be a special version of it called lab7_common.py which just skips the get_beam_data() function.
      • Note that the make_plot() command works great for generic plots but for the plots in the lab you will generally need to write your own specific code.
    • Take a look at the Polynomial Fitting code and make sure you completely understand each part of it. The parts with a white background are going to be common to all the demonstrations while the code with a yellow background will be what changes each time.
    • Take a look at how to make changes for Polynomial models (Python:Fitting#Polynomial) and General Linear models (Python:Fitting#General_Linear).
  • You can now do Chapra 14.5 completely.
  • Skim 15.1. Although you still have problems to do in Chapter 14, the way you are going to do general linear regression is slightly different from what is presented in Chapter 14.
  • Skip 15.2. This will be a part of Lab 8.
  • Skim 15.3. We did this in class; once again, see General_Linear_Regression.
  • I am working on Pythonifying Chapter 15 - stay tuned!
  • Take a look at the General Linear Regression code and make sure you understand what each line does. The np.block() command is the key element in producing the $$\Phi$$ matrix needed for the least squares fit method of the linear algebra package in numpy.
    • Stop before Multidimensional - that will be a part of Lab 8
  • You can now do Chapra 14.7, 14.27, and 15.10 completely.
  • Skip 15.4.
  • Skim 15.5. Note that it amplifies that the goal is to minimize $$S_r$$. However, skip the MATLAB code entirely and instead:
  • Go to Python:Fitting and specifically look at the Nonlinear Regression section (Python:Fitting#Nonlinear_Regression).
  • You can now do Chapra 15.11, 15.22, and 15.29 completely.
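As a quick illustration of the variance and standard deviation note for 14.1.3 above, here is a minimal sketch (with made-up data values, not the Table14p03.txt data) showing the ddof keyword argument that makes NumPy match MATLAB's default sample formulas:

```python
import numpy as np

# Hypothetical measurements - any small data set works for the demo
y = np.array([6.495, 6.595, 6.615, 6.635, 6.485, 6.555])

# NumPy defaults to the population formulas (divide by n, i.e. ddof=0);
# MATLAB's var and std default to the sample formulas (divide by n-1).
# Passing ddof=1 makes NumPy match MATLAB's defaults.
sample_var = np.var(y, ddof=1)
sample_std = np.std(y, ddof=1)
```

The same ddof=1 keyword applies anywhere you want sample (rather than population) statistics from NumPy.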
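To make the np.block() note above concrete, here is a minimal general linear regression sketch with hypothetical data (roughly following y = 2 + 3x; the basis functions and values are made up for illustration):

```python
import numpy as np

# Hypothetical data roughly following y = 2 + 3x
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 4.9, 8.2, 10.9, 14.1])

# np.block() assembles the Phi matrix with one column per basis
# function; each piece must be 2-D, hence the reshape to a column.
xc = x.reshape(-1, 1)
phi = np.block([[xc**0, xc**1]])  # basis functions: 1 and x

# Least squares fit from numpy's linear algebra package:
# solves phi @ p ~= y for the coefficient vector p.
p, residuals, rank, sv = np.linalg.lstsq(phi, y, rcond=None)
```

Adding more columns to the np.block() call (for example xc**2, or np.exp(xc)) changes the model without changing the fitting step.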

Doing Work

  • You may work on your own computer and save files to your own hard drive.
  • I strongly recommend occasionally backing up your work to your CIFS drive, but I also strongly recommend not working directly from the CIFS drive; if the internet connection glitches, you will lose the connection along with any unsaved work.
  • To make the backups while working from home,
    • Go to the How To Get Work Done page and look specifically at How_To_Get_Work_Done#Connecting_from_Off_Campus
    • Once your CIFS drive is connected, navigate to your EGR103 folder and create a lab7 folder.
    • Once you are done working on parts of the assignment, you can
      • Create a VPN connection
      • Mount / remount your CIFS drive
      • Open a file window to look at your local files
      • Open another file window to look at your CIFS folder
      • Carefully copy your local files to your CIFS folder. PLEASE be sure you are copying files in the right direction! To be extra careful, before taking this step, you may want to make a local copy of your whole lab7 folder.

Submitting Work

Note that you will not be creating a LaTeX document for this assignment. Rather, there are two ways that you will be submitting your work for this assignment:

  • Each problem has some short answer / multiple choice problems using McGraw-Hill's Connect platform.
    • Be sure to register at Connect with the code I provided in a previous announcement.
    • After you are enrolled, EGR 103 should be listed as one of your courses. Click on the link for EGR103S20.
    • On the left side navigation bar, if it is not expanded, click on the icon that looks like a lectern (5th one down below the calendar and above the zig zag)
    • The "Chapter 14 Linear Regression" is a LearnSmart exercise that is certainly not required but allows you to test certain concepts from the chapter. Skip that for now, but you may want to go back to it later.
    • The "Lab 7 Online" is where you want to go now.
    • Be sure to carefully read each problem - sometimes Connect will change a number or a prompt slightly from the book problem.
    • Use the "Prev" and "Next" arrows on the bottom of the screen to navigate through problems, or click the grid between the arrows to pick a specific problem.
    • You can save and exit at any time and resume work where you left off.
    • You can submit partial or complete solutions at any time. Connect will give you a grade based on which parts are correct. When you re-open the assignment, your old submissions will still be there.
    • You have an infinite number of submissions before the due date; however, I have "Check my work" disabled for now.
    • The navigation part is occasionally wonky. If it feels like you skipped ahead, use Prev, Next, or the grid to get back to where you think you should be.
    • Once you get an assignment 100% correct, you will not be able to look at the assignment again until after the due date.
    • Once the due date has passed, you will be able to look at the assignments and the explanations for the answers. Note: if there is coding involved in an answer, the solution will be presented as MATLAB code.
    • Each problem is worth 10 points. Later assignments may have different point values.
  • Several problems require graphs
    • Be sure to use tight layout and save as a .png (graphics) file, not a .eps file. You are not going to be making a LaTeX document for this assignment.
    • You will be uploading all of your codes and all of your graphs to Sakai before the due date. The Sakai assignment has a list of the files we expect to get.
    • Be sure to include the honor code statement on each code.

Additional References for args and kwargs

The functions used to solve nonlinear regression require that certain arguments be passed in a very specific way. You will need to understand how *args works for this part. The information below covers both *args and **kwargs. Basically, *args and **kwargs capture additional parameters in a function call: the *args are unnamed and captured in a tuple, while the **kwargs are named and captured in a dictionary. Note that once there is a named parameter in a function call, every parameter after that must also be named!
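A minimal sketch of the capture behavior described above (the function name and arguments are made up for illustration):

```python
def demo(a, *args, **kwargs):
    # a is a required positional argument; any extra positional
    # arguments land in the tuple args, and any extra keyword
    # arguments land in the dictionary kwargs.
    return a, args, kwargs

first, extra, named = demo(1, 2, 3, scale=10, label="test")
# first -> 1
# extra -> (2, 3)
# named -> {'scale': 10, 'label': 'test'}
```

Note that demo(scale=10, 2) would be a syntax error: once a named parameter appears, everything after it must also be named.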

Typographical Errors

None yet!

Specific Problems

  • Be sure to put the appropriate version of the honor code -- if you use the examples from Pundit, the original author is either DukeEgr93 or Michael R. Gustafson II depending on how you want to cite things.

Chapra 14.5

Chapra 14.7

Chapra 14.27

Chapra 15.10

Chapra 15.11

  • See Python:Fitting#Nonlinear_Regression
  • For the initial guesses, make sure you understand the subscripts for the parameters and then figure out how to approximate their values from the information provided in the problem.
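To show what "initial guesses" means mechanically, here is a hedged sketch using scipy.optimize.curve_fit with a hypothetical saturating-growth model and made-up data (not the actual Chapra 15.11 problem); the p0 values come from eyeballing the data:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical model: y = a0 * (1 - exp(-a1 * x))
def model(x, a0, a1):
    return a0 * (1 - np.exp(-a1 * x))

# Made-up data that levels off near 5
x = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.8, 3.0, 4.3, 4.8, 4.95, 5.0])

# Initial guesses: a0 is roughly the value y levels off at, and a1
# sets how quickly it gets there - estimate both from the data.
p0 = [5.0, 1.0]
popt, pcov = curve_fit(model, x, y, p0=p0)
```

For the actual problem, work out what each subscripted parameter means in the given model, then read approximate values off the problem's data the same way.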

Chapra 15.22

Chapra 15.29

  • See Python:Fitting#Nonlinear_Regression
  • Note that the independent variable will be the temperature in K and the dependent variable will be the natural log of the pressure. That is also what you will be graphing.

General Concepts

General Linear Regression...