Jean;115499 said:
it's just not that easy unfortunately, here's a fake regression (cos I don't have any real Messie data) and all it tells you is that one is bigger than the other! the pedantic scientist in me just won't allow for any meaningful conclusions to be drawn from this. And you can play with it all you want but you can't predict or draw conclusions from what simply isn't there. What happens in the younger/smaller sizes? Is there an ontogenetic change in growth form, do the older ones reach an asymptote (rare in squid but........), does the growth speed up/slow down?????? and so on, it's very dangerous to extrapolate based on only two sets of measurements!
Argumentatively yours
J
"Oh, this is abuse. Arguments are down the hall." - Monty (Python)
Are you arguing that having 2 data points is just as bad as having one? I didn't mean to argue that it would allow a high-confidence guess, just that it's a big improvement over one... mostly because you can at least estimate how wrong the one-point approximation is. I entirely concede that it will be a lousy guesstimate anyway... but I think we *can* say some fairly reasonable things: ML is monotonically increasing with beak length, the two won't be wildly uncoupled, and so forth. And since we want one as a function of the other, we can get ML(b) and an estimate of dML(b)/db for each squid, so there are (kinda) 4 measurements: the value and the derivative at each squid's beak size. I think it's not too awful to imagine that both ML(b) and dML/db are monotonic and reasonably smooth over the range, so it's at least possible to fit a curve made of two terms of the Taylor series instead of one. Is this a good estimate? Meh, not really, unless you make a lot of assumptions, but I think they're not completely insane assumptions.
So I put forth that the "best" assumption from one squid with ML=ML0 and b = b0 is:
ML(b) = ML0 + (ML0/b0)(b - b0)
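To make that concrete (with made-up numbers, since I don't have real beak/ML measurements either), here's the one-squid version. Note it collapses to simple proportional scaling:

```python
# Hypothetical numbers for illustration only -- not real squid data.
ML0, b0 = 200.0, 10.0  # mantle length and beak length of squid 0

def ml_one_point(b):
    """One-squid estimate: value ML0 at b0, slope ML0/b0.
    Algebraically this is just proportional scaling, ML0 * b / b0."""
    return ML0 + (ML0 / b0) * (b - b0)

print(ml_one_point(12.0))  # -> 240.0, i.e. 200 * 12/10
```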
and with 2 squids, with the other having ML=ML1 > ML0 and b=b1 > b0
ML(b) = ML0 + (ML0/b0)(b-b0) +
        (ML1/(2b1(b1-b0)) - ML0/(2b0(b1-b0)))(b-b0)^2
(unless I screwed up the algebra)
which is "better" assuming all the usual Taylor approximation stuff applies.
I did cheat a little and assumed that the line from (0,0) to (b, ML(b)) is a good enough approximation of the derivative, though.
Note that "better" doesn't imply "particularly good" but I think the curve is monotonic and pretty smooth and otherwise well-behaved over this range.
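As a sanity check on the algebra, here's that two-squid quadratic with the same made-up numbers. By construction it matches the value and the crude origin-line slope at b0, and the crude slope ML1/b1 at b1; it does not actually pass through (b1, ML1) (which is what the p.s. below is about):

```python
# Same hypothetical numbers as before -- not real data.
ML0, b0 = 200.0, 10.0   # smaller squid
ML1, b1 = 320.0, 14.0   # bigger squid

def ml_two_point(b):
    """Two-term Taylor-style fit around b0.
    Value ML0 and crude slope ML0/b0 at b0; the quadratic coefficient
    comes from the change in the crude slope ML/b between the two squids."""
    c2 = ML1 / (2 * b1 * (b1 - b0)) - ML0 / (2 * b0 * (b1 - b0))
    return ML0 + (ML0 / b0) * (b - b0) + c2 * (b - b0) ** 2
```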
It is quite true that this makes some horribly naive assumptions, and whether to use naive assumptions to get an answer or to say "I refuse to endorse an answer that I know is naive, so it's probably wrong" is sort of a judgment call... I wouldn't bet my reputation on it, but it's not completely unbelievable. You could possibly do better by assuming some other form rather than a Taylor polynomial, like whatever the log relationship you found in Nototodarus looked like, and trying to fit different constants to that, and maybe using (ML1 - ML0)/(b1 - b0) as the derivative estimate for the bigger squid, and stuff like that, too. In fact, I should have done the latter, and as penance I re-solved it:
ML(b) = ML0 + (ML0/b0)(b-b0) +
        ((ML1-ML0)/(2(b1-b0)^2) - ML0/(2b0(b1-b0)))(b-b0)^2
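Same sketch for the corrected version (same made-up numbers), where the slope at b1 is estimated from the secant between the two squids rather than the line through the origin:

```python
# Same hypothetical numbers -- not real data.
ML0, b0 = 200.0, 10.0
ML1, b1 = 320.0, 14.0

def ml_two_point_v2(b):
    """Like the first quadratic, but the slope at b1 is estimated from
    the secant (ML1 - ML0)/(b1 - b0) instead of the line from the origin."""
    c2 = (ML1 - ML0) / (2 * (b1 - b0) ** 2) - ML0 / (2 * b0 * (b1 - b0))
    return ML0 + (ML0 / b0) * (b - b0) + c2 * (b - b0) ** 2
```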
Of course, this is, in some sense, all completely [missing image] rather than [missing image], but people use these sorts of naive models for things all the time. And they sometimes kinda get the right answer, approximately.
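For what it's worth, the power-law form hinted at above (the kind of log-log relationship reported for other squid) is pinned down exactly by two points, since taking logs turns ML = a*b^k into a straight line. A minimal sketch, same made-up numbers:

```python
import math

# Same hypothetical numbers -- not real data.
ML0, b0 = 200.0, 10.0
ML1, b1 = 320.0, 14.0

# log ML = log a + k log b is a line through two points, so:
k = math.log(ML1 / ML0) / math.log(b1 / b0)
a = ML0 / b0 ** k

def ml_power_law(b):
    """Power-law fit ML = a * b**k through both (b0, ML0) and (b1, ML1)."""
    return a * b ** k
```

Unlike the Taylor versions, this one passes through both measured squids exactly, at the cost of assuming the power-law form holds over the whole range.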
And it's not like there are dire consequences for guessing wrong.
p.s. I realized as I was going to sleep that I forgot to correct the ML(b1) value in the 1st deriv equation, on the unlikely chance that anyone is referring to this for some real reason.