|
Post by Br. Marius on Mar 5, 2015 18:31:53 GMT
Can you express the following two coupled equations in first-order form?
\(a_1\dddot{q_1}+a_2\ddot{q_2}+a_3\dot{q_1}+a_4q_2+a_5=0\)
\(b_1\ddot{q_1}+b_2\dot{q_2}+b_3\dot{q_1}+b_4=0\)
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Mar 9, 2015 13:17:29 GMT
Just started looking at this part. Are you guys combining the equations and then replacing all the lower-order terms with new variables, so that you end up with a matrix equation made up of a set of first-order differential equations, \(\dot{x}=Ax\)? Attachments: ENAE633PR4.pdf (472.88 KB)
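A minimal numeric sketch of the substitution idea being described here, using a single made-up scalar equation (the coefficients below are invented for illustration, not from the homework):

```python
import numpy as np

# Sketch of the substitution idea, with made-up coefficients:
#   q'' + 3 q' + 2 q = 0
# Replace the lower-order terms with state variables: x = [q', q]^T.
# Then x' = [q'', q']^T, and the equation gives q'' = -3 q' - 2 q, so
#   x' = A x   with the companion matrix A below.
A = np.array([[-3.0, -2.0],
              [ 1.0,  0.0]])

ev = np.linalg.eigvals(A)
print(sorted(ev.real))  # roots of s^2 + 3 s + 2: [-2.0, -1.0]
```

The eigenvalues of the companion matrix come out as the characteristic roots of the original scalar equation, which is the point of putting the system in \(\dot{x}=Ax\) form.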
|
|
|
Post by Br. Marius on Mar 10, 2015 18:52:55 GMT
I didn't combine the equations; instead I just tried to directly apply what Chopra did in class (express the system of equations in matrix form), just with 3 equations instead of 2 to make the vector look nice. Attachments: HW6P4.pdf (113.34 KB)
|
|
|
Post by matthorr on Mar 11, 2015 2:36:16 GMT
This is what I came up with. My attempt to use 3 equations (the original two plus the time derivative of the second) resulted in a singular (i.e. non-invertible) matrix on the LHS, so I think that is a dead end... Attachments: HW6 3.pdf (113.16 KB)
|
|
tim
New Member
Posts: 1
|
Post by tim on Mar 11, 2015 6:54:46 GMT
I did mine slightly differently. Since there is no \(q_1\) term, you can express the vector \(y\) as \([\ddot{q}_1 \;\, \dot{q}_1 \;\, \dot{q}_2 \;\, q_2]^T\) and \(\dot{y}\) as \([\dddot{q}_1 \;\, \ddot{q}_1 \;\, \ddot{q}_2 \;\, \dot{q}_2]^T\). I attached what I have. My A matrix is invertible, so it worked out. My equation (3) is just the time derivative of the second equation, similar to what Matt did. Attachments:
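Writing out this construction explicitly (my reconstruction from the thread, using the coefficients from the original post, with eq. (3) taken as the time derivative of the second equation):

\[
y=\begin{bmatrix}\ddot{q}_1\\ \dot{q}_1\\ \dot{q}_2\\ q_2\end{bmatrix},\qquad
\begin{bmatrix}a_1 & 0 & a_2 & 0\\ b_1 & 0 & b_2 & 0\\ 0 & 1 & 0 & 0\\ 0 & 0 & 0 & 1\end{bmatrix}\dot{y}
=\begin{bmatrix}0 & -a_3 & 0 & -a_4\\ -b_3 & 0 & 0 & 0\\ 1 & 0 & 0 & 0\\ 0 & 0 & 1 & 0\end{bmatrix}y
+\begin{bmatrix}-a_5\\ 0\\ 0\\ 0\end{bmatrix}
\]

Row 1 is the first equation, row 2 is the differentiated second equation (note \(b_4\) drops out), and rows 3 and 4 are the identities \(\tfrac{d}{dt}\dot{q}_1=\ddot{q}_1\) and \(\tfrac{d}{dt}q_2=\dot{q}_2\). The LHS matrix is invertible iff \(a_1 b_2 - a_2 b_1 \neq 0\), and because of the constant \(a_5\) the result is \(\dot{y}=Ay+c\) rather than a strictly homogeneous \(\dot{y}=Ay\).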
|
|
|
Post by matthorr on Mar 11, 2015 13:12:18 GMT
Tim, interesting. I'm not sure that would be a useful form for determining eigenvalues. It's mathematically correct, but I don't know if that is what he is looking for.
|
|
|
Post by Br. Marius on Mar 11, 2015 15:19:09 GMT
Matt, in terms of Tim's solution being a useful form for determining eigenvalues, are you referring to computational cost?
|
|
|
Post by matthorr on Mar 11, 2015 15:34:55 GMT
No. Without any "no dot" terms (i.e. \(q_1\) itself) in the x-vector, I don't think you can find the eigenvalues. If you take \(\det(A-\lambda I)\), I don't know what it would represent, but I don't think it would give the eigenvalues of the system.
|
|
|
Post by Br. Marius on Mar 11, 2015 16:49:03 GMT
Oh, OK. So while you could derive eigenvalues, the associated eigenvectors would be a basis for a space that did not include both \(q_1\) and \(q_2\), and hence not necessarily what you're looking for?
|
|
|
Post by matthorr on Mar 11, 2015 17:23:12 GMT
Yeah, I think so.
|
|
|
Post by Deleted on Mar 11, 2015 17:36:19 GMT
So why is the A matrix not being invertible a problem? I feel like I remember him saying why, but I can't remember the exact reason.
Otherwise, I got a simple, good-looking answer. But the A matrix is singular, so now I'm not sure...
|
|
|
Post by matthorr on Mar 11, 2015 18:24:50 GMT
I think he was talking about the matrix on the LHS in front of \(\dot{x}\): if it is not invertible, you can't move everything to the RHS.
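A small sketch of that point, again with invented numbers: with \(M\dot{x}=Nx+c\), you can only solve for \(\dot{x}=M^{-1}(Nx+c)\) when the LHS matrix \(M\) is invertible, and for the 4-state form above that fails exactly when \(a_1 b_2 - a_2 b_1 = 0\):

```python
import numpy as np

# Hypothetical numbers chosen so that a1*b2 - a2*b1 = 0, which makes
# the LHS matrix M of  M x' = N x + c  singular: x' = M^{-1}(N x + c)
# is then unavailable, so the system can't be put in x' = A x + c form.
a1, a2, b1, b2 = 1.0, 2.0, 2.0, 4.0
M = np.array([[a1, 0, a2, 0],
              [b1, 0, b2, 0],
              [0,  1, 0,  0],
              [0,  0, 0,  1]])
print(abs(np.linalg.det(M)) < 1e-12)  # True: M is singular
```

With any coefficients where \(a_1 b_2 \neq a_2 b_1\), the determinant is nonzero and the inversion goes through.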
|
|
|
Post by Deleted on Mar 11, 2015 18:28:50 GMT
Ohhhh right, that's all I thought mattered. My confusion was that Tim's (A) and my (A) mean different things.
|
|