PhilW wrote:how is xTolerance defined/determined?
I know that x is in points, so xTolerance is defined to be 0.01. An answer that accurate is good enough for me.
If a maximum paper size can also be assumed (no greater than 3.5m in one dimension should be more than sufficient), this bounds x above by roughly 10000 points. With a tolerance of 0.01, the worst-case ratio of tolerance to the initial upper-lower gap is 1/1000000, or approximately 1/(2^20). Locating the root within a 1-in-2^20 region by bisection therefore requires 20 evaluations of y, which gives an upper bound on the required computation.
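To illustrate the arithmetic above, a small sketch (in Python, not the PostScript under discussion; the function name is mine) counting the halvings bisection needs:

```python
import math

def bisection_steps(lower, upper, tol):
    """Number of halvings needed to shrink [lower, upper] to width <= tol."""
    return math.ceil(math.log2((upper - lower) / tol))

# Worst case from the post: a 10000-point interval, 0.01-point tolerance.
# 10000 / 0.01 = 1000000 ~= 2^20, so 20 halvings.
print(bisection_steps(0.0, 10000.0, 0.01))  # 20
```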
jdaw1 wrote:Desiderata include robustness, speed, and simplicity of code
Interpolation is only worthwhile compared with bisection if a division is sufficiently cheap relative to the extra multiplications and additions needed for the bisection steps it saves. At some point that trade-off changes. On a system where division is not fully accelerated in hardware, one division could cost as much as, say, five bisections, in which case only one or two interpolation steps may be worthwhile before switching to bisection.
Going back to your original formula: it is not in the form I would intuitively have expected, but it is equivalent, and your arrangement maps well onto the restriction on how far the boundaries of the estimation region may move. The computation is essentially standard interpolation. Notably, the division could also be avoided in the clamped case, since the tests:
-LowerY<=0.14*(UpperY-LowerY) OR -LowerY>=0.86*(UpperY-LowerY)
could be performed and the division only calculated if they were true.
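As a sketch of that branch structure (Python rather than the PostScript of the original, with alpha = 1/7 so the 0.14/0.86 thresholds above are the exact 1/7 and 6/7; assumes the root is bracketed with LowerY < 0 < UpperY):

```python
def next_x(lower_x, upper_x, lower_y, upper_y, alpha=1.0 / 7.0):
    """One clamped-interpolation step; the division runs only when the
    interpolated point falls strictly inside the [alpha, 1-alpha] band.
    Assumes lower_y < 0 < upper_y."""
    span_y = upper_y - lower_y
    if -lower_y <= alpha * span_y:            # would clamp at the lower end
        t = alpha
    elif -lower_y >= (1.0 - alpha) * span_y:  # would clamp at the upper end
        t = 1.0 - alpha
    else:
        t = -lower_y / span_y                 # the only branch needing a division
    return lower_x + (upper_x - lower_x) * t
```

Note that LowerY/(LowerY − UpperY) equals −LowerY/(UpperY − LowerY), which is the fraction `t` computed here.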
The generalised formula:
next x = LowerX + (UpperX − LowerX) × Max[alpha, Min[(1 − alpha), LowerY/(LowerY − UpperY)]]
does seem useful, with a suitable alpha to be determined for each application.
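Put together as a complete root-finder, the generalised formula amounts to a clamped false-position method. A minimal Python sketch (function names and the example function are mine; assumes f(LowerX) < 0 < f(UpperX)):

```python
def solve(f, lower_x, upper_x, x_tol=0.01, alpha=1.0 / 7.0):
    """Clamped false position: regula falsi with the step fraction limited
    to [alpha, 1 - alpha], so neither bracket end can stagnate.
    Assumes f(lower_x) < 0 < f(upper_x)."""
    lower_y, upper_y = f(lower_x), f(upper_x)
    while upper_x - lower_x > x_tol:
        # The generalised formula from the post.
        t = max(alpha, min(1.0 - alpha, lower_y / (lower_y - upper_y)))
        x = lower_x + (upper_x - lower_x) * t
        y = f(x)
        if y < 0:
            lower_x, lower_y = x, y
        else:
            upper_x, upper_y = x, y
    return 0.5 * (lower_x + upper_x)

# Hypothetical example: root of x^2 - 2 on [0, 10000], to within 0.01 points.
root = solve(lambda x: x * x - 2.0, 0.0, 10000.0)  # ~= 1.414
```

Because t is clamped away from 0 and 1, each step shrinks the bracket by at least a factor of (1 − alpha), so termination is guaranteed even when interpolation behaves badly.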
This seems a good approach for this application, provided division isn't too costly (otherwise bisection wins). I would guess a slightly higher value than 1/7 could usefully be chosen (anything up to 1/4, potentially), which would speed some cases up and slow others down. I am not sure where the optimum lies; it is likely data-dependent.